Mar 08 19:31:37 crc systemd[1]: Starting Kubernetes Kubelet... Mar 08 19:31:37 crc restorecon[4749]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Mar 08 19:31:37 
crc restorecon[4749]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 
19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc 
restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:37 
crc restorecon[4749]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 
crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:37 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 
19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 08 19:31:38 crc 
restorecon[4749]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 
19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 
19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc 
restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 19:31:38 crc restorecon[4749]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 08 19:31:38 crc restorecon[4749]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 08 19:31:39 crc kubenswrapper[4885]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 19:31:39 crc kubenswrapper[4885]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 08 19:31:39 crc kubenswrapper[4885]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 19:31:39 crc kubenswrapper[4885]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 08 19:31:39 crc kubenswrapper[4885]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 08 19:31:39 crc kubenswrapper[4885]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.056539 4885 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066691 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066753 4885 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066764 4885 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066772 4885 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066778 4885 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066785 4885 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066795 4885 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066803 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066811 4885 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066820 4885 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066828 4885 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066838 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066846 4885 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066854 4885 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066860 4885 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066868 4885 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066874 4885 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066882 4885 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066888 4885 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066894 4885 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066901 
4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066907 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066913 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066941 4885 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066947 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066952 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066957 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066963 4885 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066969 4885 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066976 4885 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066982 4885 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.066996 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067002 4885 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067009 4885 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
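The long runs of feature_gate.go:330 warnings are cluster-level OpenShift gates that the upstream kubelet's gate registry simply does not know, while the :351/:353 entries flag gates that are still explicitly set even though they are deprecated or already GA. When triaging a boot like this one it helps to collapse that noise; a minimal sketch, assuming the journal has been saved to a hypothetical kubelet.log with one entry per line:

```python
# Sketch: classify and dedupe the feature_gate.go warnings from a saved
# journal dump ("kubelet.log" is a hypothetical file name).
import re
from collections import Counter

UNRECOGNIZED = re.compile(r"feature_gate\.go:330\] unrecognized feature gate: (\S+)")
STILL_SET = re.compile(r"feature_gate\.go:35[13]\] Setting (?:GA|deprecated) feature gate (\S+)=")

unrecognized, still_set = Counter(), Counter()
with open("kubelet.log") as f:
    for line in f:
        unrecognized.update(m.group(1) for m in UNRECOGNIZED.finditer(line))
        still_set.update(m.group(1) for m in STILL_SET.finditer(line))

print(f"{len(unrecognized)} distinct unknown gate names, "
      f"{sum(unrecognized.values())} warnings in total")
print("gates still set but slated for removal:", sorted(still_set))
```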
Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067016 4885 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067021 4885 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067027 4885 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067032 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067037 4885 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067042 4885 feature_gate.go:330] unrecognized feature gate: Example Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067047 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067052 4885 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067057 4885 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067063 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067068 4885 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067074 4885 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067079 4885 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067084 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067089 4885 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067094 4885 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067099 4885 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067104 4885 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067111 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067120 4885 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067128 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067136 4885 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067143 4885 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067149 4885 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067156 4885 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067161 4885 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067167 4885 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067173 4885 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067179 4885 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067185 4885 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067189 4885 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067195 4885 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067200 4885 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067205 4885 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067220 4885 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067225 4885 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.067234 4885 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.068875 4885 flags.go:64] FLAG: --address="0.0.0.0" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.068902 4885 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.068937 4885 flags.go:64] FLAG: --anonymous-auth="true" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.068948 4885 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.068957 4885 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.068964 4885 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.068975 4885 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.068985 4885 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.068993 4885 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069001 4885 flags.go:64] FLAG: 
--boot-id-file="/proc/sys/kernel/random/boot_id" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069010 4885 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069021 4885 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069028 4885 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069035 4885 flags.go:64] FLAG: --cgroup-root="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069040 4885 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069046 4885 flags.go:64] FLAG: --client-ca-file="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069052 4885 flags.go:64] FLAG: --cloud-config="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069058 4885 flags.go:64] FLAG: --cloud-provider="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069064 4885 flags.go:64] FLAG: --cluster-dns="[]" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069075 4885 flags.go:64] FLAG: --cluster-domain="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069080 4885 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069086 4885 flags.go:64] FLAG: --config-dir="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069092 4885 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069098 4885 flags.go:64] FLAG: --container-log-max-files="5" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069114 4885 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069120 4885 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069127 4885 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069133 4885 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069139 4885 flags.go:64] FLAG: --contention-profiling="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069145 4885 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069151 4885 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069157 4885 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069163 4885 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069172 4885 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069178 4885 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069184 4885 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069191 4885 flags.go:64] FLAG: --enable-load-reader="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069199 4885 flags.go:64] FLAG: --enable-server="true" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069206 4885 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069216 4885 
flags.go:64] FLAG: --event-burst="100" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069225 4885 flags.go:64] FLAG: --event-qps="50" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069232 4885 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069240 4885 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069248 4885 flags.go:64] FLAG: --eviction-hard="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069260 4885 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069267 4885 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069275 4885 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069283 4885 flags.go:64] FLAG: --eviction-soft="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069289 4885 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069295 4885 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069301 4885 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069307 4885 flags.go:64] FLAG: --experimental-mounter-path="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069313 4885 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069319 4885 flags.go:64] FLAG: --fail-swap-on="true" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069325 4885 flags.go:64] FLAG: --feature-gates="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069332 4885 flags.go:64] FLAG: --file-check-frequency="20s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069339 4885 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069346 4885 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069354 4885 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069362 4885 flags.go:64] FLAG: --healthz-port="10248" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069370 4885 flags.go:64] FLAG: --help="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069377 4885 flags.go:64] FLAG: --hostname-override="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069385 4885 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069392 4885 flags.go:64] FLAG: --http-check-frequency="20s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069400 4885 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069408 4885 flags.go:64] FLAG: --image-credential-provider-config="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069415 4885 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069423 4885 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069430 4885 flags.go:64] FLAG: --image-service-endpoint="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069437 4885 flags.go:64] FLAG: 
--kernel-memcg-notification="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069445 4885 flags.go:64] FLAG: --kube-api-burst="100" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069453 4885 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069461 4885 flags.go:64] FLAG: --kube-api-qps="50" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069469 4885 flags.go:64] FLAG: --kube-reserved="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069477 4885 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069484 4885 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069492 4885 flags.go:64] FLAG: --kubelet-cgroups="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069499 4885 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069506 4885 flags.go:64] FLAG: --lock-file="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069514 4885 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069520 4885 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069526 4885 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069536 4885 flags.go:64] FLAG: --log-json-split-stream="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069544 4885 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069549 4885 flags.go:64] FLAG: --log-text-split-stream="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069555 4885 flags.go:64] FLAG: --logging-format="text" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069561 4885 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069568 4885 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069573 4885 flags.go:64] FLAG: --manifest-url="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069579 4885 flags.go:64] FLAG: --manifest-url-header="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069588 4885 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069603 4885 flags.go:64] FLAG: --max-open-files="1000000" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069613 4885 flags.go:64] FLAG: --max-pods="110" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069620 4885 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069628 4885 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069635 4885 flags.go:64] FLAG: --memory-manager-policy="None" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069644 4885 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069656 4885 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069665 4885 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069674 4885 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069695 4885 flags.go:64] FLAG: --node-status-max-images="50" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069703 4885 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069711 4885 flags.go:64] FLAG: --oom-score-adj="-999" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069718 4885 flags.go:64] FLAG: --pod-cidr="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069725 4885 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069739 4885 flags.go:64] FLAG: --pod-manifest-path="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069747 4885 flags.go:64] FLAG: --pod-max-pids="-1" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069755 4885 flags.go:64] FLAG: --pods-per-core="0" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069762 4885 flags.go:64] FLAG: --port="10250" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069770 4885 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069777 4885 flags.go:64] FLAG: --provider-id="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069784 4885 flags.go:64] FLAG: --qos-reserved="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069792 4885 flags.go:64] FLAG: --read-only-port="10255" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069799 4885 flags.go:64] FLAG: --register-node="true" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069806 4885 flags.go:64] FLAG: --register-schedulable="true" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069811 4885 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069832 4885 flags.go:64] FLAG: --registry-burst="10" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069839 4885 flags.go:64] FLAG: --registry-qps="5" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069846 4885 flags.go:64] FLAG: --reserved-cpus="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069855 4885 flags.go:64] FLAG: --reserved-memory="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069865 4885 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069872 4885 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069880 4885 flags.go:64] FLAG: --rotate-certificates="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069887 4885 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069899 4885 flags.go:64] FLAG: --runonce="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069906 4885 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069937 4885 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069945 4885 flags.go:64] FLAG: --seccomp-default="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069953 4885 flags.go:64] FLAG: --serialize-image-pulls="true" 
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069960 4885 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069968 4885 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069976 4885 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069984 4885 flags.go:64] FLAG: --storage-driver-password="root" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069991 4885 flags.go:64] FLAG: --storage-driver-secure="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.069999 4885 flags.go:64] FLAG: --storage-driver-table="stats" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070006 4885 flags.go:64] FLAG: --storage-driver-user="root" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070013 4885 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070022 4885 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070030 4885 flags.go:64] FLAG: --system-cgroups="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070038 4885 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070051 4885 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070059 4885 flags.go:64] FLAG: --tls-cert-file="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070074 4885 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070086 4885 flags.go:64] FLAG: --tls-min-version="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070094 4885 flags.go:64] FLAG: --tls-private-key-file="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070101 4885 flags.go:64] FLAG: --topology-manager-policy="none" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070108 4885 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070116 4885 flags.go:64] FLAG: --topology-manager-scope="container" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070124 4885 flags.go:64] FLAG: --v="2" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070135 4885 flags.go:64] FLAG: --version="false" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070145 4885 flags.go:64] FLAG: --vmodule="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070155 4885 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070163 4885 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070346 4885 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070353 4885 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070362 4885 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070370 4885 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070376 4885 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 19:31:39 crc 
kubenswrapper[4885]: W0308 19:31:39.070381 4885 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070387 4885 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070392 4885 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070397 4885 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070401 4885 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070406 4885 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070413 4885 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070420 4885 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070425 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070430 4885 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070436 4885 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070442 4885 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070447 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070452 4885 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070457 4885 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070463 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070469 4885 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070474 4885 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070479 4885 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070484 4885 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070489 4885 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070494 4885 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070499 4885 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070504 4885 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070510 4885 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070516 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070521 4885 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070526 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070531 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070538 4885 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070545 4885 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070551 4885 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070557 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070567 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070572 4885 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070578 4885 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070583 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070588 4885 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070594 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070599 4885 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070604 4885 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070609 4885 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070615 4885 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070621 4885 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070626 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070633 4885 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070639 4885 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070645 4885 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070654 4885 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070660 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070665 4885 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070671 4885 feature_gate.go:330] unrecognized feature gate: Example Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070676 4885 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070681 4885 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070687 4885 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070692 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070697 4885 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070702 4885 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070707 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070712 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070716 4885 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070721 4885 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070728 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070734 4885 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070739 4885 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.070743 4885 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.070752 4885 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.084619 4885 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.085131 4885 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 08 
19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085296 4885 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085311 4885 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085322 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085331 4885 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085340 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085349 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085359 4885 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085371 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085380 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085388 4885 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085396 4885 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085407 4885 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085419 4885 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085428 4885 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085436 4885 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085445 4885 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085454 4885 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085463 4885 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085471 4885 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085479 4885 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085487 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085496 4885 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085503 4885 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085512 4885 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085520 4885 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 
19:31:39.085528 4885 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085536 4885 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085543 4885 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085552 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085572 4885 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085582 4885 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085594 4885 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085603 4885 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085614 4885 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085624 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085634 4885 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085645 4885 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085653 4885 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085661 4885 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085670 4885 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085678 4885 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085686 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085693 4885 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085701 4885 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085710 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085718 4885 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085726 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085733 4885 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085742 4885 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085750 4885 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 
19:31:39.085758 4885 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085767 4885 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085774 4885 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085782 4885 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085791 4885 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085800 4885 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085808 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085817 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085825 4885 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085834 4885 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085842 4885 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085850 4885 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085858 4885 feature_gate.go:330] unrecognized feature gate: Example Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085866 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085873 4885 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085899 4885 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085907 4885 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085915 4885 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085948 4885 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085956 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.085964 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.085978 4885 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086283 4885 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 08 
19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086306 4885 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086322 4885 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086334 4885 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086345 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086354 4885 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086364 4885 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086374 4885 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086385 4885 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086398 4885 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086411 4885 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086423 4885 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086433 4885 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086441 4885 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086449 4885 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086458 4885 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086465 4885 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086473 4885 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086482 4885 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086490 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086497 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086505 4885 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086513 4885 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086521 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086529 4885 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086537 4885 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086545 4885 
feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086553 4885 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086561 4885 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086573 4885 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086581 4885 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086589 4885 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086598 4885 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086606 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086614 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086622 4885 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086630 4885 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086638 4885 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086646 4885 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086655 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086662 4885 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086670 4885 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086678 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086685 4885 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086693 4885 feature_gate.go:330] unrecognized feature gate: Example Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086701 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086709 4885 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086717 4885 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086725 4885 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086733 4885 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086741 4885 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086749 4885 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 08 19:31:39 crc 
kubenswrapper[4885]: W0308 19:31:39.086756 4885 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086764 4885 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086771 4885 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086780 4885 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086788 4885 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086795 4885 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086803 4885 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086811 4885 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086822 4885 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086830 4885 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086838 4885 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086846 4885 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086862 4885 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086881 4885 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
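Each round of warnings ends with an effective-gate summary at feature_gate.go:386, printed as a Go map literal ("feature gates: {map[...]}"). Turning that into a structured object is handy for comparing boots or checking whether a particular gate ended up enabled; a sketch using the gate names from this log:

```python
# Sketch: parse the kubelet's "feature gates: {map[...]}" summary line
# (feature_gate.go:386) into a Python dict of gate -> bool.
import re

def parse_feature_gates(entry: str) -> dict[str, bool]:
    inner = re.search(r"feature gates: \{map\[(.*?)\]\}", entry)
    if not inner:
        return {}
    gates = {}
    for pair in inner.group(1).split():
        name, _, value = pair.partition(":")
        gates[name] = value == "true"
    return gates

# Example taken from the summary entries in this log (abridged).
example = ("feature gates: {map[CloudDualStackNodeIPs:true "
           "DisableKubeletCloudCredentialProviders:true KMSv1:true "
           "NodeSwap:false ValidatingAdmissionPolicy:true]}")
print(parse_feature_gates(example))
```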
Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086891 4885 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086900 4885 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086909 4885 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086952 4885 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.086965 4885 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.086980 4885 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.087274 4885 server.go:940] "Client rotation is on, will bootstrap in background" Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.092772 4885 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.097853 4885 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.098048 4885 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
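The bootstrap error above means the client certificate embedded in /var/lib/kubelet/kubeconfig is past its notAfter (2026-02-24), so the kubelet falls back to the bootstrap credentials to request a fresh certificate while also loading the pair at /var/lib/kubelet/pki/kubelet-client-current.pem. One way to inspect that on-disk certificate's validity window is to hand just its CERTIFICATE block to the openssl CLI; a sketch, assuming openssl is available on the node:

```python
# Sketch: print the validity window of the kubelet client certificate that the
# log shows being loaded. The file holds a cert/key pair, so extract only the
# CERTIFICATE block before handing it to openssl (assumed present on the node).
import re
import subprocess

CERT_PATH = "/var/lib/kubelet/pki/kubelet-client-current.pem"

pem = open(CERT_PATH).read()
cert = re.search(r"-----BEGIN CERTIFICATE-----.*?-----END CERTIFICATE-----",
                 pem, re.S).group(0)

out = subprocess.run(
    ["openssl", "x509", "-noout", "-startdate", "-enddate"],
    input=cert, capture_output=True, text=True, check=True,
)
print(out.stdout.strip())  # notBefore=... / notAfter=...
```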
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.099801 4885 server.go:997] "Starting client certificate rotation" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.099851 4885 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.100181 4885 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.129761 4885 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.132774 4885 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.134161 4885 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.153422 4885 log.go:25] "Validated CRI v1 runtime API" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.200177 4885 log.go:25] "Validated CRI v1 image API" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.203184 4885 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.210169 4885 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-08-19-26-52-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.210240 4885 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.241824 4885 manager.go:217] Machine: {Timestamp:2026-03-08 19:31:39.237961889 +0000 UTC m=+0.634015972 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7aa01b7c-4329-4abc-97e1-626c363cfaee BootID:4c2a725e-e9fd-471d-962e-34eaf38ef5ae Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 
HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:a6:76:c1 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:a6:76:c1 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:4c:c9:4c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:1b:bf:03 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:1a:d4:cb Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:e9:a4:33 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:b0:6f:ef Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ea:d0:db:d4:f6:44 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3e:bd:24:dc:eb:8c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: 
DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.242302 4885 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.242646 4885 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.245059 4885 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.245450 4885 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.245508 4885 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.245996 4885 topology_manager.go:138] "Creating topology manager with none policy" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.246018 4885 container_manager_linux.go:303] "Creating device plugin manager" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.246554 4885 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.246613 4885 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.246977 4885 state_mem.go:36] "Initialized new in-memory state store" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.247137 4885 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.252130 4885 kubelet.go:418] "Attempting to sync node with API server" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.252175 4885 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.252207 4885 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.252234 4885 kubelet.go:324] "Adding apiserver pod source" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.252255 4885 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.257541 4885 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.258338 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.258671 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.258360 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.259036 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.259094 4885 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.263202 4885 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.265066 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.265255 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.265375 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.265482 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.265596 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.265700 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.265802 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.265966 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.266113 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.266228 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.266355 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.266464 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.268756 4885 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.269713 4885 server.go:1280] "Started kubelet"
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.270397 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused
Mar 08 19:31:39 crc systemd[1]: Started Kubernetes Kubelet.
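The run of "Loaded volume plugin" lines just before "Started kubelet" is the kubelet registering each of its in-tree volume plugins by name in a single registry. The Go sketch below illustrates that registration pattern under stated assumptions: the VolumePlugin interface, the two plugin types, and the register helper are stand-ins for illustration, not the kubelet's actual types in pkg/volume.

package main

import "log"

// VolumePlugin is a stand-in for the kubelet's volume plugin interface.
type VolumePlugin interface {
	GetPluginName() string
}

type emptyDir struct{}

func (emptyDir) GetPluginName() string { return "kubernetes.io/empty-dir" }

type hostPath struct{}

func (hostPath) GetPluginName() string { return "kubernetes.io/host-path" }

// register maps plugin name -> plugin, emitting one line per plugin as it is added,
// similar in spirit to the "Loaded volume plugin" entries above.
func register(plugins ...VolumePlugin) map[string]VolumePlugin {
	registry := make(map[string]VolumePlugin)
	for _, p := range plugins {
		registry[p.GetPluginName()] = p
		log.Printf("Loaded volume plugin %q", p.GetPluginName())
	}
	return registry
}

func main() {
	register(emptyDir{}, hostPath{})
}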
Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.275952 4885 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.275982 4885 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.277005 4885 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.283464 4885 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.283525 4885 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.284215 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.285001 4885 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.286619 4885 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.286402 4885 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.287878 4885 server.go:460] "Adding debug handlers to kubelet server" Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.288336 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="200ms" Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.284800 4885 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.80:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189af48f37573b5f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.269675871 +0000 UTC m=+0.665729934,LastTimestamp:2026-03-08 19:31:39.269675871 +0000 UTC m=+0.665729934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.288422 4885 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.288485 4885 factory.go:55] Registering systemd factory Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.288514 4885 factory.go:221] Registration of the systemd container factory successfully Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.288913 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Mar 
08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.289203 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.294588 4885 factory.go:153] Registering CRI-O factory Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.294641 4885 factory.go:221] Registration of the crio container factory successfully Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.294687 4885 factory.go:103] Registering Raw factory Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.294720 4885 manager.go:1196] Started watching for new ooms in manager Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.296022 4885 manager.go:319] Starting recovery of all containers Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302192 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302275 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302299 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302322 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302343 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302362 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302382 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302401 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" 
seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302435 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302466 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302490 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302513 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302530 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302556 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302581 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302605 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302633 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302658 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302725 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 08 19:31:39 crc 
kubenswrapper[4885]: I0308 19:31:39.302749 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302776 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302800 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302826 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302850 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302872 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302890 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302951 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.302981 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303043 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303073 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 08 
19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303168 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303199 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303226 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303252 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303277 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303302 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303329 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303357 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303384 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303410 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303478 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303503 4885 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303530 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303558 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303585 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303610 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303637 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303662 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303692 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303760 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303787 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303812 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303850 4885 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303883 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303915 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.303995 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304026 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304051 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304079 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304105 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304130 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304158 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304184 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304209 4885 reconstruct.go:130] "Volume is marked as uncertain and added into 
the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304240 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304266 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304293 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304323 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304348 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304374 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304402 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304431 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304460 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304487 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304513 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304542 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304568 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304594 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304622 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304653 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304683 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304713 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304742 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304769 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304798 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304826 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.304855 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305100 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305126 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305149 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305169 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305191 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305212 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305232 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305251 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305271 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305293 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305385 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305408 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305433 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305455 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305475 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305494 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305515 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305556 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305602 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305633 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305668 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305700 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305732 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305762 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305796 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305828 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305859 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305887 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305914 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.305982 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306009 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306039 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306065 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306092 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306126 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306153 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306178 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306204 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306229 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306254 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306282 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306311 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306339 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306365 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306394 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306422 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306447 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306473 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306498 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306524 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306549 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306574 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306601 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306627 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306655 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306681 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306708 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306736 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306765 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306793 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306821 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306852 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.306881 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309067 4885 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309111 
4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309135 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309160 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309179 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309204 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309225 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309245 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309268 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309287 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309309 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309328 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309391 4885 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309418 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309445 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309502 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309528 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309550 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309573 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309593 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309615 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309635 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309656 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309686 4885 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309708 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309728 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309750 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309771 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309791 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309812 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309831 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309852 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309871 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309891 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309912 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309960 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.309980 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310000 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310020 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310039 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310060 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310081 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310102 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310122 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310142 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310171 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310195 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310215 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310234 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310254 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310274 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310294 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310312 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310332 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310351 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310371 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310395 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310415 4885 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310435 4885 reconstruct.go:97] "Volume reconstruction finished" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.310448 4885 reconciler.go:26] "Reconciler: start to sync state" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.342626 4885 manager.go:324] Recovery completed Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.359346 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.362885 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.362950 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.362969 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.362971 4885 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.364183 4885 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.364220 4885 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.364257 4885 state_mem.go:36] "Initialized new in-memory state store" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.366621 4885 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.366736 4885 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.366820 4885 kubelet.go:2335] "Starting kubelet main sync loop" Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.366975 4885 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 08 19:31:39 crc kubenswrapper[4885]: W0308 19:31:39.368208 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.368340 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.384390 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.386241 4885 policy_none.go:49] "None policy: Start" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.387151 4885 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.387187 4885 state_mem.go:35] "Initializing new in-memory state store" Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.468243 4885 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.471548 4885 manager.go:334] "Starting Device Plugin manager" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.471676 4885 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.471700 4885 server.go:79] "Starting device plugin registration server" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.472409 4885 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.472444 4885 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.473134 4885 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.473299 4885 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.473323 4885 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.476874 4885 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.80:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189af48f37573b5f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.269675871 +0000 UTC m=+0.665729934,LastTimestamp:2026-03-08 19:31:39.269675871 +0000 UTC m=+0.665729934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.483163 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.489509 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="400ms" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.574699 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.578655 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.578752 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.578778 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.578833 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.579952 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.80:6443: connect: connection refused" node="crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.669192 4885 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.669374 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.671851 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.671960 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.671981 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.672227 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.673123 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.673197 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.673896 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.674016 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.674039 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.674379 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.674536 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.674604 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.675100 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.675167 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.675188 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.677379 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.677429 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.677449 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.677444 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.677615 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.677627 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.677629 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.677771 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.677813 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.679282 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.679329 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.679348 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.679352 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.679397 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.679415 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.679598 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.679779 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.679824 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.683124 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.683203 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.683229 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.683677 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.683770 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.688063 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.688150 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.688169 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.688386 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.688440 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.688461 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.718068 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.718136 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.718180 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.718216 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.718250 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.718283 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.718364 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.718467 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.718508 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.718543 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.718579 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.718637 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.718747 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.718816 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.718849 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 
19:31:39.780397 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.782986 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.783078 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.783100 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.783143 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.783831 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.80:6443: connect: connection refused" node="crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.820563 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.820783 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.820803 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.821151 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.821317 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.821464 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.821604 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 
19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.821796 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.822044 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.822226 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.821626 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.821677 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.821405 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.821138 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.821238 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.821971 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.822794 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.823004 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.823245 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.823066 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.823015 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.823363 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.823679 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.823908 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.824110 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.824174 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.824297 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.824519 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.824565 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: I0308 19:31:39.824609 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:31:39 crc kubenswrapper[4885]: E0308 19:31:39.891646 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="800ms" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.029245 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.047912 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.059432 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.080263 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.084697 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:31:40 crc kubenswrapper[4885]: W0308 19:31:40.091899 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-0ff98a00c972cc6fbe3ebca5901f0ddcff7c3dbddba3fc430e270883dc8c7f4d WatchSource:0}: Error finding container 0ff98a00c972cc6fbe3ebca5901f0ddcff7c3dbddba3fc430e270883dc8c7f4d: Status 404 returned error can't find the container with id 0ff98a00c972cc6fbe3ebca5901f0ddcff7c3dbddba3fc430e270883dc8c7f4d Mar 08 19:31:40 crc kubenswrapper[4885]: W0308 19:31:40.095508 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-9e57ac97ded07286aa76b930cf0bcd06d41fc2972eafe8d1f95e9c4ae863d9de WatchSource:0}: Error finding container 9e57ac97ded07286aa76b930cf0bcd06d41fc2972eafe8d1f95e9c4ae863d9de: Status 404 returned error can't find the container with id 9e57ac97ded07286aa76b930cf0bcd06d41fc2972eafe8d1f95e9c4ae863d9de Mar 08 19:31:40 crc kubenswrapper[4885]: W0308 19:31:40.105590 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-25fd057e84a174b7a296049d5b9ff10b2152fa4531833fdfcd0463f9afce0854 WatchSource:0}: Error finding container 25fd057e84a174b7a296049d5b9ff10b2152fa4531833fdfcd0463f9afce0854: Status 404 returned error can't find the container with id 25fd057e84a174b7a296049d5b9ff10b2152fa4531833fdfcd0463f9afce0854 Mar 08 19:31:40 crc kubenswrapper[4885]: W0308 19:31:40.116543 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-8453780cda141227b73c3c41ab6a4c5d74f914c3916abff9c6a3a1e25e4c9385 WatchSource:0}: Error finding container 8453780cda141227b73c3c41ab6a4c5d74f914c3916abff9c6a3a1e25e4c9385: Status 404 returned error can't find the container with id 8453780cda141227b73c3c41ab6a4c5d74f914c3916abff9c6a3a1e25e4c9385 Mar 08 19:31:40 crc kubenswrapper[4885]: W0308 19:31:40.120493 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-4b409e909eb145f0f39f1ce40fd6d75f71bc7f7a4d96ddfe96b770d4fec0f947 WatchSource:0}: Error finding container 4b409e909eb145f0f39f1ce40fd6d75f71bc7f7a4d96ddfe96b770d4fec0f947: Status 404 returned error can't find the container with id 4b409e909eb145f0f39f1ce40fd6d75f71bc7f7a4d96ddfe96b770d4fec0f947 Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.184238 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.186579 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.186635 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.186656 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.186693 4885 kubelet_node_status.go:76] 
"Attempting to register node" node="crc" Mar 08 19:31:40 crc kubenswrapper[4885]: E0308 19:31:40.187426 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.80:6443: connect: connection refused" node="crc" Mar 08 19:31:40 crc kubenswrapper[4885]: W0308 19:31:40.199632 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:31:40 crc kubenswrapper[4885]: E0308 19:31:40.199736 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.272284 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:31:40 crc kubenswrapper[4885]: W0308 19:31:40.357386 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:31:40 crc kubenswrapper[4885]: E0308 19:31:40.357524 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.372553 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"25fd057e84a174b7a296049d5b9ff10b2152fa4531833fdfcd0463f9afce0854"} Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.373995 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9e57ac97ded07286aa76b930cf0bcd06d41fc2972eafe8d1f95e9c4ae863d9de"} Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.375534 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0ff98a00c972cc6fbe3ebca5901f0ddcff7c3dbddba3fc430e270883dc8c7f4d"} Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.377036 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4b409e909eb145f0f39f1ce40fd6d75f71bc7f7a4d96ddfe96b770d4fec0f947"} Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.378732 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8453780cda141227b73c3c41ab6a4c5d74f914c3916abff9c6a3a1e25e4c9385"} Mar 08 19:31:40 crc kubenswrapper[4885]: W0308 19:31:40.621648 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:31:40 crc kubenswrapper[4885]: E0308 19:31:40.621810 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:31:40 crc kubenswrapper[4885]: E0308 19:31:40.693071 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="1.6s" Mar 08 19:31:40 crc kubenswrapper[4885]: W0308 19:31:40.921313 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:31:40 crc kubenswrapper[4885]: E0308 19:31:40.921446 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.987832 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.989970 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.990075 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.990104 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:40 crc kubenswrapper[4885]: I0308 19:31:40.990155 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 19:31:40 crc kubenswrapper[4885]: E0308 19:31:40.991024 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.80:6443: connect: connection refused" node="crc" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.193100 4885 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 19:31:41 crc kubenswrapper[4885]: E0308 19:31:41.194635 4885 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 
38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.272103 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.385623 4885 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135" exitCode=0 Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.385717 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135"} Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.385787 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.388310 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.388377 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.388407 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.390365 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b"} Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.390434 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52"} Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.392955 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d" exitCode=0 Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.393027 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d"} Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.393112 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.394776 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.394842 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.394862 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 
19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.395941 4885 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975" exitCode=0 Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.395993 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975"} Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.396097 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.397195 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.397913 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.397997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.398016 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.398644 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.398728 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.398755 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.398983 4885 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a" exitCode=0 Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.399039 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a"} Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.399104 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.400521 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.400575 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:41 crc kubenswrapper[4885]: I0308 19:31:41.400605 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.271768 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:31:42 crc kubenswrapper[4885]: E0308 19:31:42.294863 4885 controller.go:145] 
"Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="3.2s" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.405645 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178"} Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.405715 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98"} Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.405734 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39"} Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.405791 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.407366 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.407414 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.407431 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.411002 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917"} Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.411213 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b"} Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.411110 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.412768 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.412963 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.413089 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.418821 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65"} Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.418893 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0"} Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.418916 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa"} Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.418963 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad"} Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.424301 4885 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6" exitCode=0 Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.424457 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6"} Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.424510 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.425557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.425619 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.425639 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.426672 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a"} Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.426813 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.427917 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.427993 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.428014 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:42 crc kubenswrapper[4885]: W0308 19:31:42.494837 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:31:42 crc kubenswrapper[4885]: E0308 19:31:42.494980 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.591265 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.593163 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.593241 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.593255 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:42 crc kubenswrapper[4885]: I0308 19:31:42.593287 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 19:31:42 crc kubenswrapper[4885]: E0308 19:31:42.594465 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.80:6443: connect: connection refused" node="crc" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.434011 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dc7c276f635b4934fefdf47d178e900fdab19e641bc2a2268a3866783121710c"} Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.434172 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.435568 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.435701 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.435810 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.437607 4885 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533" exitCode=0 Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.437740 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.437776 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.437774 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533"} Mar 08 
19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.437778 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.437875 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.437811 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.439471 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.439514 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.439529 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.439873 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.439878 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.440013 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.440073 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.440023 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.440129 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.440092 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.439953 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.440323 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:43 crc kubenswrapper[4885]: I0308 19:31:43.928580 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.446778 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286"} Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.446837 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.446869 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af"} Mar 08 19:31:44 crc 
kubenswrapper[4885]: I0308 19:31:44.446897 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c"} Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.447028 4885 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.447118 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.448453 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.448496 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.448532 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.448548 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.448534 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.448659 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.666812 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.667131 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.668847 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.668943 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.668995 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:44 crc kubenswrapper[4885]: I0308 19:31:44.677776 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.387142 4885 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.456284 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569"} Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.456376 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f"} Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.456467 4885 prober_manager.go:312] 
"Failed to trigger a manual run" probe="Readiness" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.456484 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.456500 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.456532 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.456662 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.458166 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.458223 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.458237 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.458450 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.458518 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.458542 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.458522 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.458662 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.458686 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.795492 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.797572 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.797655 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.797676 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.797716 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 19:31:45 crc kubenswrapper[4885]: I0308 19:31:45.951112 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:46 crc kubenswrapper[4885]: I0308 19:31:46.459889 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:46 crc kubenswrapper[4885]: I0308 19:31:46.459907 4885 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:46 crc kubenswrapper[4885]: I0308 19:31:46.460169 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:46 crc kubenswrapper[4885]: I0308 19:31:46.461799 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:46 crc kubenswrapper[4885]: I0308 19:31:46.461859 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:46 crc kubenswrapper[4885]: I0308 19:31:46.461886 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:46 crc kubenswrapper[4885]: I0308 19:31:46.461894 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:46 crc kubenswrapper[4885]: I0308 19:31:46.461944 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:46 crc kubenswrapper[4885]: I0308 19:31:46.461955 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:46 crc kubenswrapper[4885]: I0308 19:31:46.462174 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:46 crc kubenswrapper[4885]: I0308 19:31:46.462223 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:46 crc kubenswrapper[4885]: I0308 19:31:46.462245 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:47 crc kubenswrapper[4885]: I0308 19:31:47.597624 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:47 crc kubenswrapper[4885]: I0308 19:31:47.597861 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:47 crc kubenswrapper[4885]: I0308 19:31:47.600015 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:47 crc kubenswrapper[4885]: I0308 19:31:47.600071 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:47 crc kubenswrapper[4885]: I0308 19:31:47.600090 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:48 crc kubenswrapper[4885]: I0308 19:31:48.397311 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 08 19:31:48 crc kubenswrapper[4885]: I0308 19:31:48.397577 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:48 crc kubenswrapper[4885]: I0308 19:31:48.399578 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:48 crc kubenswrapper[4885]: I0308 19:31:48.399646 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:48 crc kubenswrapper[4885]: I0308 19:31:48.399669 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:49 crc kubenswrapper[4885]: E0308 19:31:49.483337 
4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 19:31:50 crc kubenswrapper[4885]: I0308 19:31:50.245068 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:31:50 crc kubenswrapper[4885]: I0308 19:31:50.245334 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:50 crc kubenswrapper[4885]: I0308 19:31:50.247382 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:50 crc kubenswrapper[4885]: I0308 19:31:50.247461 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:50 crc kubenswrapper[4885]: I0308 19:31:50.247488 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:50 crc kubenswrapper[4885]: I0308 19:31:50.251783 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:31:50 crc kubenswrapper[4885]: I0308 19:31:50.433817 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:31:50 crc kubenswrapper[4885]: I0308 19:31:50.474573 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:50 crc kubenswrapper[4885]: I0308 19:31:50.476165 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:50 crc kubenswrapper[4885]: I0308 19:31:50.476226 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:50 crc kubenswrapper[4885]: I0308 19:31:50.476268 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:51 crc kubenswrapper[4885]: I0308 19:31:51.478131 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:51 crc kubenswrapper[4885]: I0308 19:31:51.480628 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:51 crc kubenswrapper[4885]: I0308 19:31:51.480705 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:51 crc kubenswrapper[4885]: I0308 19:31:51.480727 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:52 crc kubenswrapper[4885]: I0308 19:31:52.010998 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 08 19:31:52 crc kubenswrapper[4885]: I0308 19:31:52.011366 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:52 crc kubenswrapper[4885]: I0308 19:31:52.013383 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:52 crc kubenswrapper[4885]: I0308 19:31:52.013451 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:52 crc kubenswrapper[4885]: I0308 19:31:52.013472 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:53 crc kubenswrapper[4885]: W0308 19:31:53.210373 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 08 19:31:53 crc kubenswrapper[4885]: I0308 19:31:53.210525 4885 trace.go:236] Trace[1422133869]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Mar-2026 19:31:43.208) (total time: 10002ms): Mar 08 19:31:53 crc kubenswrapper[4885]: Trace[1422133869]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (19:31:53.210) Mar 08 19:31:53 crc kubenswrapper[4885]: Trace[1422133869]: [10.002430885s] [10.002430885s] END Mar 08 19:31:53 crc kubenswrapper[4885]: E0308 19:31:53.210562 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 08 19:31:53 crc kubenswrapper[4885]: I0308 19:31:53.273832 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 08 19:31:53 crc kubenswrapper[4885]: I0308 19:31:53.434437 4885 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 19:31:53 crc kubenswrapper[4885]: I0308 19:31:53.434522 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 19:31:53 crc kubenswrapper[4885]: W0308 19:31:53.549486 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 08 19:31:53 crc kubenswrapper[4885]: I0308 19:31:53.549589 4885 trace.go:236] Trace[741344643]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Mar-2026 19:31:43.547) (total time: 10001ms): Mar 08 19:31:53 crc kubenswrapper[4885]: Trace[741344643]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (19:31:53.549) Mar 08 19:31:53 crc kubenswrapper[4885]: Trace[741344643]: [10.001730215s] [10.001730215s] END Mar 08 19:31:53 crc kubenswrapper[4885]: E0308 19:31:53.549612 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list 
*v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 08 19:31:53 crc kubenswrapper[4885]: W0308 19:31:53.851022 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 08 19:31:53 crc kubenswrapper[4885]: I0308 19:31:53.851175 4885 trace.go:236] Trace[519929954]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (08-Mar-2026 19:31:43.850) (total time: 10001ms): Mar 08 19:31:53 crc kubenswrapper[4885]: Trace[519929954]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (19:31:53.851) Mar 08 19:31:53 crc kubenswrapper[4885]: Trace[519929954]: [10.0011137s] [10.0011137s] END Mar 08 19:31:53 crc kubenswrapper[4885]: E0308 19:31:53.851216 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 08 19:31:53 crc kubenswrapper[4885]: I0308 19:31:53.928886 4885 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 19:31:53 crc kubenswrapper[4885]: I0308 19:31:53.929006 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 19:31:54 crc kubenswrapper[4885]: E0308 19:31:54.903228 4885 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:54Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189af48f37573b5f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.269675871 +0000 UTC m=+0.665729934,LastTimestamp:2026-03-08 19:31:39.269675871 +0000 UTC m=+0.665729934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:31:54 crc kubenswrapper[4885]: I0308 19:31:54.920304 4885 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User 
\"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 08 19:31:54 crc kubenswrapper[4885]: I0308 19:31:54.920958 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 08 19:31:54 crc kubenswrapper[4885]: I0308 19:31:54.924386 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:54Z is after 2026-02-23T05:33:13Z Mar 08 19:31:54 crc kubenswrapper[4885]: E0308 19:31:54.927708 4885 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 19:31:54 crc kubenswrapper[4885]: E0308 19:31:54.929333 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:54Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 08 19:31:54 crc kubenswrapper[4885]: W0308 19:31:54.933747 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:54Z is after 2026-02-23T05:33:13Z Mar 08 19:31:54 crc kubenswrapper[4885]: E0308 19:31:54.933842 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:54Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 19:31:54 crc kubenswrapper[4885]: I0308 19:31:54.942835 4885 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38860->192.168.126.11:17697: read: connection reset by peer" start-of-body= Mar 08 19:31:54 crc kubenswrapper[4885]: I0308 19:31:54.943848 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38860->192.168.126.11:17697: read: connection reset by peer" Mar 08 19:31:54 crc 
kubenswrapper[4885]: E0308 19:31:54.943871 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:54Z is after 2026-02-23T05:33:13Z" node="crc" Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.274759 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:55Z is after 2026-02-23T05:33:13Z Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.491816 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.494494 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dc7c276f635b4934fefdf47d178e900fdab19e641bc2a2268a3866783121710c" exitCode=255 Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.494546 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dc7c276f635b4934fefdf47d178e900fdab19e641bc2a2268a3866783121710c"} Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.494718 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.495706 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.495772 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.495796 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:55 crc kubenswrapper[4885]: I0308 19:31:55.496786 4885 scope.go:117] "RemoveContainer" containerID="dc7c276f635b4934fefdf47d178e900fdab19e641bc2a2268a3866783121710c" Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.275430 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:56Z is after 2026-02-23T05:33:13Z Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.500314 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.501189 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.505081 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8" exitCode=255 Mar 08 
19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.505163 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8"} Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.505323 4885 scope.go:117] "RemoveContainer" containerID="dc7c276f635b4934fefdf47d178e900fdab19e641bc2a2268a3866783121710c" Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.505491 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.507053 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.507112 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.507124 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:56 crc kubenswrapper[4885]: I0308 19:31:56.507742 4885 scope.go:117] "RemoveContainer" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8" Mar 08 19:31:56 crc kubenswrapper[4885]: E0308 19:31:56.507997 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:31:57 crc kubenswrapper[4885]: I0308 19:31:57.276871 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:57Z is after 2026-02-23T05:33:13Z Mar 08 19:31:57 crc kubenswrapper[4885]: I0308 19:31:57.511475 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 08 19:31:58 crc kubenswrapper[4885]: W0308 19:31:58.005556 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:58Z is after 2026-02-23T05:33:13Z Mar 08 19:31:58 crc kubenswrapper[4885]: E0308 19:31:58.005671 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.275614 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:58Z is after 2026-02-23T05:33:13Z Mar 08 19:31:58 crc kubenswrapper[4885]: W0308 19:31:58.895810 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:58Z is after 2026-02-23T05:33:13Z Mar 08 19:31:58 crc kubenswrapper[4885]: E0308 19:31:58.895961 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.938463 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.938665 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.940268 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.940358 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.940384 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.941571 4885 scope.go:117] "RemoveContainer" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8" Mar 08 19:31:58 crc kubenswrapper[4885]: E0308 19:31:58.941996 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:31:58 crc kubenswrapper[4885]: I0308 19:31:58.946156 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:31:59 crc kubenswrapper[4885]: I0308 19:31:59.276755 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:59Z is after 2026-02-23T05:33:13Z Mar 08 19:31:59 crc kubenswrapper[4885]: W0308 19:31:59.418162 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:59Z is after 2026-02-23T05:33:13Z Mar 08 19:31:59 crc kubenswrapper[4885]: E0308 19:31:59.418299 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:31:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 08 19:31:59 crc kubenswrapper[4885]: E0308 19:31:59.483634 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 19:31:59 crc kubenswrapper[4885]: I0308 19:31:59.520192 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:31:59 crc kubenswrapper[4885]: I0308 19:31:59.521527 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:31:59 crc kubenswrapper[4885]: I0308 19:31:59.521584 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:31:59 crc kubenswrapper[4885]: I0308 19:31:59.521604 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:31:59 crc kubenswrapper[4885]: I0308 19:31:59.522483 4885 scope.go:117] "RemoveContainer" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8" Mar 08 19:31:59 crc kubenswrapper[4885]: E0308 19:31:59.522775 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:32:00 crc kubenswrapper[4885]: I0308 19:32:00.280823 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:32:00Z is after 2026-02-23T05:33:13Z Mar 08 19:32:01 crc kubenswrapper[4885]: I0308 19:32:01.275947 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:32:01Z is after 2026-02-23T05:33:13Z Mar 08 19:32:01 crc kubenswrapper[4885]: E0308 19:32:01.335017 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:32:01Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 08 19:32:01 crc kubenswrapper[4885]: I0308 19:32:01.344816 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:01 crc kubenswrapper[4885]: 
I0308 19:32:01.346523 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:01 crc kubenswrapper[4885]: I0308 19:32:01.346572 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:01 crc kubenswrapper[4885]: I0308 19:32:01.346589 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:01 crc kubenswrapper[4885]: I0308 19:32:01.346622 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 19:32:01 crc kubenswrapper[4885]: E0308 19:32:01.351449 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:32:01Z is after 2026-02-23T05:33:13Z" node="crc" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.049434 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.049774 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.052783 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.052916 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.052987 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.069437 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.276442 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:32:02Z is after 2026-02-23T05:33:13Z Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.530052 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.531603 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.531674 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:02 crc kubenswrapper[4885]: I0308 19:32:02.531694 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.095483 4885 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.113805 4885 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.280135 4885 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.434958 4885 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.435106 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.677077 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.678366 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.680132 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.680220 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.680249 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:03 crc kubenswrapper[4885]: I0308 19:32:03.681155 4885 scope.go:117] "RemoveContainer" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8" Mar 08 19:32:03 crc kubenswrapper[4885]: E0308 19:32:03.681459 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:32:04 crc kubenswrapper[4885]: W0308 19:32:04.181904 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.182010 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 08 19:32:04 crc kubenswrapper[4885]: I0308 19:32:04.278340 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.909572 4885 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f37573b5f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.269675871 +0000 UTC m=+0.665729934,LastTimestamp:2026-03-08 19:31:39.269675871 +0000 UTC m=+0.665729934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.914078 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.916093 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.921095 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6e1ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,LastTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.927648 4885 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f43b81836 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.477350454 +0000 UTC m=+0.873404517,LastTimestamp:2026-03-08 19:31:39.477350454 +0000 UTC m=+0.873404517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.933256 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce63f88\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.578724893 +0000 UTC m=+0.974778976,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.939875 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6acfd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.578770334 +0000 UTC m=+0.974824397,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.946635 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6e1ff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6e1ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,LastTimestamp:2026-03-08 19:31:39.578788615 +0000 UTC m=+0.974842678,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.953734 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce63f88\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.671899753 +0000 UTC m=+1.067953816,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.961218 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6acfd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.671973225 +0000 UTC m=+1.068027288,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.968012 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6e1ff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6e1ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,LastTimestamp:2026-03-08 19:31:39.671993255 +0000 UTC m=+1.068047318,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.975040 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce63f88\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.673980601 +0000 UTC 
m=+1.070034654,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.981864 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6acfd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.674031562 +0000 UTC m=+1.070085625,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.988629 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6e1ff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6e1ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,LastTimestamp:2026-03-08 19:31:39.674050643 +0000 UTC m=+1.070104696,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:04 crc kubenswrapper[4885]: E0308 19:32:04.995507 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce63f88\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.675140353 +0000 UTC m=+1.071194386,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.002505 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6acfd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: 
NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.675175234 +0000 UTC m=+1.071229267,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.009454 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6e1ff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6e1ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,LastTimestamp:2026-03-08 19:31:39.675194715 +0000 UTC m=+1.071248748,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.016335 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce63f88\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.677409206 +0000 UTC m=+1.073463279,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.022688 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6acfd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.677441967 +0000 UTC m=+1.073496030,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.029705 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6e1ff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6e1ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,LastTimestamp:2026-03-08 19:31:39.677461207 +0000 UTC m=+1.073515270,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.037363 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce63f88\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.677600891 +0000 UTC m=+1.073654924,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.043721 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6acfd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.677625442 +0000 UTC m=+1.073679475,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.050403 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6e1ff\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6e1ff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362976255 +0000 UTC m=+0.759030288,LastTimestamp:2026-03-08 19:31:39.677768416 +0000 UTC m=+1.073822469,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.057109 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce63f88\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce63f88 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362934664 +0000 UTC m=+0.758988697,LastTimestamp:2026-03-08 19:31:39.679311009 +0000 UTC m=+1.075365062,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.063623 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189af48f3ce6acfd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189af48f3ce6acfd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:39.362962685 +0000 UTC m=+0.759016718,LastTimestamp:2026-03-08 19:31:39.67934131 +0000 UTC m=+1.075395363,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.072581 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af48f691bcea5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.104642213 +0000 UTC m=+1.500696276,LastTimestamp:2026-03-08 19:31:40.104642213 +0000 UTC m=+1.500696276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.079152 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189af48f691c98c3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.104693955 +0000 UTC m=+1.500748018,LastTimestamp:2026-03-08 19:31:40.104693955 +0000 UTC m=+1.500748018,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.085337 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48f6955607a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.108415098 +0000 UTC m=+1.504469161,LastTimestamp:2026-03-08 19:31:40.108415098 +0000 UTC m=+1.504469161,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.092326 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48f6a602f83 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.125900675 +0000 UTC m=+1.521954728,LastTimestamp:2026-03-08 19:31:40.125900675 +0000 UTC m=+1.521954728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.098796 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48f6a61b716 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.126000918 +0000 UTC m=+1.522054971,LastTimestamp:2026-03-08 19:31:40.126000918 +0000 UTC m=+1.522054971,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.106442 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48f9470a20b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.831621643 +0000 UTC m=+2.227675706,LastTimestamp:2026-03-08 19:31:40.831621643 +0000 UTC m=+2.227675706,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.113209 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48f9485d0ce openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.83300987 +0000 UTC m=+2.229063933,LastTimestamp:2026-03-08 19:31:40.83300987 +0000 UTC m=+2.229063933,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.119762 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af48f94a3efd3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.834983891 +0000 UTC m=+2.231037954,LastTimestamp:2026-03-08 19:31:40.834983891 +0000 UTC m=+2.231037954,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.126006 4885 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48f94e3ce55 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.839169621 +0000 UTC m=+2.235223694,LastTimestamp:2026-03-08 19:31:40.839169621 +0000 UTC m=+2.235223694,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.132116 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189af48f94e604df openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.839314655 +0000 UTC m=+2.235368718,LastTimestamp:2026-03-08 19:31:40.839314655 +0000 UTC m=+2.235368718,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.138213 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48f9581ec46 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.849531974 +0000 UTC m=+2.245586037,LastTimestamp:2026-03-08 19:31:40.849531974 +0000 UTC m=+2.245586037,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.143597 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48f959cc289 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.851290761 +0000 UTC m=+2.247344804,LastTimestamp:2026-03-08 19:31:40.851290761 +0000 UTC m=+2.247344804,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.149716 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af48f959f1dad openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.851445165 +0000 UTC m=+2.247499218,LastTimestamp:2026-03-08 19:31:40.851445165 +0000 UTC m=+2.247499218,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.156058 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48f95a0b27e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.851548798 +0000 UTC m=+2.247602861,LastTimestamp:2026-03-08 19:31:40.851548798 +0000 UTC m=+2.247602861,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.160777 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189af48f95e71e77 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container 
setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.856163959 +0000 UTC m=+2.252218022,LastTimestamp:2026-03-08 19:31:40.856163959 +0000 UTC m=+2.252218022,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.166636 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48f9650ea1c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:40.863097372 +0000 UTC m=+2.259151425,LastTimestamp:2026-03-08 19:31:40.863097372 +0000 UTC m=+2.259151425,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.173040 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48faa1afe87 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.195107975 +0000 UTC m=+2.591162028,LastTimestamp:2026-03-08 19:31:41.195107975 +0000 UTC m=+2.591162028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.179225 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48fab11e3f1 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.211288561 +0000 UTC m=+2.607342614,LastTimestamp:2026-03-08 19:31:41.211288561 +0000 UTC m=+2.607342614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 
19:32:05.186200 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48fab284366 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.21275479 +0000 UTC m=+2.608808853,LastTimestamp:2026-03-08 19:31:41.21275479 +0000 UTC m=+2.608808853,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.193495 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fb5c34fba openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.390688186 +0000 UTC m=+2.786742249,LastTimestamp:2026-03-08 19:31:41.390688186 +0000 UTC m=+2.786742249,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.200561 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fb623abb8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.397003192 +0000 UTC m=+2.793057225,LastTimestamp:2026-03-08 19:31:41.397003192 +0000 UTC m=+2.793057225,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 
19:32:05.207636 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af48fb648c834 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.399435316 +0000 UTC m=+2.795489359,LastTimestamp:2026-03-08 19:31:41.399435316 +0000 UTC m=+2.795489359,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.217743 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189af48fb696ddc1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.404552641 +0000 UTC m=+2.800606674,LastTimestamp:2026-03-08 19:31:41.404552641 +0000 UTC m=+2.800606674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.223985 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48fbaa4e274 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.472580212 +0000 UTC m=+2.868634245,LastTimestamp:2026-03-08 19:31:41.472580212 +0000 UTC m=+2.868634245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.227445 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48fbbc6c119 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.491577113 +0000 UTC m=+2.887631146,LastTimestamp:2026-03-08 19:31:41.491577113 +0000 UTC m=+2.887631146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.233736 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48fbbdabaab openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.492886187 +0000 UTC m=+2.888940220,LastTimestamp:2026-03-08 19:31:41.492886187 +0000 UTC m=+2.888940220,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.240312 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af48fc6e9c7cb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.678421963 +0000 UTC m=+3.074476026,LastTimestamp:2026-03-08 19:31:41.678421963 +0000 UTC m=+3.074476026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.246730 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fc6f929d4 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.6794301 +0000 UTC m=+3.075484123,LastTimestamp:2026-03-08 19:31:41.6794301 +0000 UTC m=+3.075484123,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.253048 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189af48fc71435b8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.681202616 +0000 UTC m=+3.077256649,LastTimestamp:2026-03-08 19:31:41.681202616 +0000 UTC m=+3.077256649,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.259104 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fc71b755b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.681677659 +0000 UTC m=+3.077731682,LastTimestamp:2026-03-08 19:31:41.681677659 +0000 UTC m=+3.077731682,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.266004 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fc81211ae openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.697839534 +0000 UTC m=+3.093893557,LastTimestamp:2026-03-08 19:31:41.697839534 
+0000 UTC m=+3.093893557,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.270576 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fc8316ff9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.699895289 +0000 UTC m=+3.095949312,LastTimestamp:2026-03-08 19:31:41.699895289 +0000 UTC m=+3.095949312,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: I0308 19:32:05.275220 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.275328 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189af48fc860b756 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.70299375 +0000 UTC m=+3.099047803,LastTimestamp:2026-03-08 19:31:41.70299375 +0000 UTC m=+3.099047803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.277764 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af48fc88b2120 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.705773344 +0000 UTC m=+3.101827387,LastTimestamp:2026-03-08 19:31:41.705773344 +0000 UTC 
m=+3.101827387,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.282887 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fc88f8401 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.706060801 +0000 UTC m=+3.102114864,LastTimestamp:2026-03-08 19:31:41.706060801 +0000 UTC m=+3.102114864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.289278 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fc8e022c0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.71134432 +0000 UTC m=+3.107398343,LastTimestamp:2026-03-08 19:31:41.71134432 +0000 UTC m=+3.107398343,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.293313 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48fcdfd9bd4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.79716194 +0000 UTC m=+3.193215963,LastTimestamp:2026-03-08 19:31:41.79716194 +0000 UTC m=+3.193215963,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.299219 4885 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af48fceb46a0e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.809142286 +0000 UTC m=+3.205196329,LastTimestamp:2026-03-08 19:31:41.809142286 +0000 UTC m=+3.205196329,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.305078 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fd38e9152 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.89054805 +0000 UTC m=+3.286602063,LastTimestamp:2026-03-08 19:31:41.89054805 +0000 UTC m=+3.286602063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.311857 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fd3bfd98c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.893777804 +0000 UTC m=+3.289831847,LastTimestamp:2026-03-08 19:31:41.893777804 +0000 UTC m=+3.289831847,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.318128 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fd4529776 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.903394678 +0000 UTC m=+3.299448711,LastTimestamp:2026-03-08 19:31:41.903394678 +0000 UTC m=+3.299448711,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.322071 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fd4646cf9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.904563449 +0000 UTC m=+3.300617472,LastTimestamp:2026-03-08 19:31:41.904563449 +0000 UTC m=+3.300617472,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.327677 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fd46f26a1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.905266337 +0000 UTC m=+3.301320360,LastTimestamp:2026-03-08 19:31:41.905266337 +0000 UTC m=+3.301320360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.331204 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fd47fe418 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:41.906363416 +0000 UTC m=+3.302417439,LastTimestamp:2026-03-08 19:31:41.906363416 +0000 UTC m=+3.302417439,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.334402 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fe2621673 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.139291251 +0000 UTC m=+3.535345274,LastTimestamp:2026-03-08 19:31:42.139291251 +0000 UTC m=+3.535345274,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.340209 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fe2935d63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.142520675 +0000 UTC m=+3.538574708,LastTimestamp:2026-03-08 19:31:42.142520675 +0000 UTC m=+3.538574708,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.346051 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fe38ecc98 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.15899868 +0000 UTC m=+3.555052703,LastTimestamp:2026-03-08 19:31:42.15899868 +0000 UTC 
m=+3.555052703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.351845 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fe39a262d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.159742509 +0000 UTC m=+3.555796532,LastTimestamp:2026-03-08 19:31:42.159742509 +0000 UTC m=+3.555796532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.357878 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189af48fe39d3142 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.159941954 +0000 UTC m=+3.555995977,LastTimestamp:2026-03-08 19:31:42.159941954 +0000 UTC m=+3.555995977,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.363888 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fee40ee05 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.338444805 +0000 UTC m=+3.734498848,LastTimestamp:2026-03-08 19:31:42.338444805 +0000 UTC m=+3.734498848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.369875 4885 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48feef82f49 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.350454601 +0000 UTC m=+3.746508634,LastTimestamp:2026-03-08 19:31:42.350454601 +0000 UTC m=+3.746508634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.376436 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fef09a959 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.351599961 +0000 UTC m=+3.747653994,LastTimestamp:2026-03-08 19:31:42.351599961 +0000 UTC m=+3.747653994,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.384410 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af48ff3a00668 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.428563048 +0000 UTC m=+3.824617111,LastTimestamp:2026-03-08 19:31:42.428563048 +0000 UTC m=+3.824617111,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.392021 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189af48ffcc977d6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.582274006 +0000 UTC m=+3.978328039,LastTimestamp:2026-03-08 19:31:42.582274006 +0000 UTC m=+3.978328039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.399085 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48ffe02c97d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.602807677 +0000 UTC m=+3.998861710,LastTimestamp:2026-03-08 19:31:42.602807677 +0000 UTC m=+3.998861710,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.406782 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49000f69ed4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.652341972 +0000 UTC m=+4.048396035,LastTimestamp:2026-03-08 19:31:42.652341972 +0000 UTC m=+4.048396035,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.413539 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49001b75ebe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.664974014 +0000 UTC 
m=+4.061028077,LastTimestamp:2026-03-08 19:31:42.664974014 +0000 UTC m=+4.061028077,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.421384 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af490300e4bd2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:43.442422738 +0000 UTC m=+4.838476801,LastTimestamp:2026-03-08 19:31:43.442422738 +0000 UTC m=+4.838476801,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.428233 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af4903feb6971 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:43.708572017 +0000 UTC m=+5.104626080,LastTimestamp:2026-03-08 19:31:43.708572017 +0000 UTC m=+5.104626080,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.434594 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49040a3d91a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:43.720659226 +0000 UTC m=+5.116713289,LastTimestamp:2026-03-08 19:31:43.720659226 +0000 UTC m=+5.116713289,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.440727 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49040bcf9e3 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:43.722306019 +0000 UTC m=+5.118360072,LastTimestamp:2026-03-08 19:31:43.722306019 +0000 UTC m=+5.118360072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.446507 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49050bda758 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:43.99078588 +0000 UTC m=+5.386839903,LastTimestamp:2026-03-08 19:31:43.99078588 +0000 UTC m=+5.386839903,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.450157 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49051cabc65 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.008420453 +0000 UTC m=+5.404474516,LastTimestamp:2026-03-08 19:31:44.008420453 +0000 UTC m=+5.404474516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.454996 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49051e886da openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.010372826 +0000 UTC m=+5.406426849,LastTimestamp:2026-03-08 
19:31:44.010372826 +0000 UTC m=+5.406426849,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.460411 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af490612c9e79 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.266493561 +0000 UTC m=+5.662547614,LastTimestamp:2026-03-08 19:31:44.266493561 +0000 UTC m=+5.662547614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.466229 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49062209e75 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.282484341 +0000 UTC m=+5.678538404,LastTimestamp:2026-03-08 19:31:44.282484341 +0000 UTC m=+5.678538404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.470848 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49062397be6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.284113894 +0000 UTC m=+5.680167957,LastTimestamp:2026-03-08 19:31:44.284113894 +0000 UTC m=+5.680167957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.476054 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49071e2babc openshift-etcd 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.546863804 +0000 UTC m=+5.942917867,LastTimestamp:2026-03-08 19:31:44.546863804 +0000 UTC m=+5.942917867,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.480810 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49072d06f32 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.562442034 +0000 UTC m=+5.958496097,LastTimestamp:2026-03-08 19:31:44.562442034 +0000 UTC m=+5.958496097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.485457 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49072ea761b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.564147739 +0000 UTC m=+5.960201802,LastTimestamp:2026-03-08 19:31:44.564147739 +0000 UTC m=+5.960201802,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.491766 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49082ff62e6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.833954534 +0000 UTC m=+6.230008597,LastTimestamp:2026-03-08 19:31:44.833954534 +0000 UTC m=+6.230008597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.497967 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189af49083fcfcb0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:44.850574512 +0000 UTC m=+6.246628565,LastTimestamp:2026-03-08 19:31:44.850574512 +0000 UTC m=+6.246628565,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.507452 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 19:32:05 crc kubenswrapper[4885]: &Event{ObjectMeta:{kube-controller-manager-crc.189af49283a15245 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 19:32:05 crc kubenswrapper[4885]: body: Mar 08 19:32:05 crc kubenswrapper[4885]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:53.434501701 +0000 UTC m=+14.830555724,LastTimestamp:2026-03-08 19:31:53.434501701 +0000 UTC m=+14.830555724,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 19:32:05 crc kubenswrapper[4885]: > Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.513884 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af49283a23206 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:53.434558982 +0000 UTC m=+14.830613005,LastTimestamp:2026-03-08 19:31:53.434558982 +0000 UTC m=+14.830613005,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.518876 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 08 19:32:05 crc kubenswrapper[4885]: &Event{ObjectMeta:{kube-apiserver-crc.189af492a11a4f45 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:6443/livez": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 19:32:05 crc kubenswrapper[4885]: body: Mar 08 19:32:05 crc kubenswrapper[4885]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:53.928970053 +0000 UTC m=+15.325024086,LastTimestamp:2026-03-08 19:31:53.928970053 +0000 UTC m=+15.325024086,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 19:32:05 crc kubenswrapper[4885]: > Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.523590 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af492a11b5436 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:6443/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:53.929036854 +0000 UTC m=+15.325090887,LastTimestamp:2026-03-08 19:31:53.929036854 +0000 UTC m=+15.325090887,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.528841 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 08 19:32:05 crc kubenswrapper[4885]: &Event{ObjectMeta:{kube-apiserver-crc.189af492dc3986f0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 08 19:32:05 crc kubenswrapper[4885]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path 
\"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 08 19:32:05 crc kubenswrapper[4885]: Mar 08 19:32:05 crc kubenswrapper[4885]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:54.920871664 +0000 UTC m=+16.316925797,LastTimestamp:2026-03-08 19:31:54.920871664 +0000 UTC m=+16.316925797,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 19:32:05 crc kubenswrapper[4885]: > Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.533682 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af492dc3e24f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:54.921174263 +0000 UTC m=+16.317228346,LastTimestamp:2026-03-08 19:31:54.921174263 +0000 UTC m=+16.317228346,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.540536 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 08 19:32:05 crc kubenswrapper[4885]: &Event{ObjectMeta:{kube-apiserver-crc.189af492dd9773b2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": read tcp 192.168.126.11:38860->192.168.126.11:17697: read: connection reset by peer Mar 08 19:32:05 crc kubenswrapper[4885]: body: Mar 08 19:32:05 crc kubenswrapper[4885]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:54.943804338 +0000 UTC m=+16.339858441,LastTimestamp:2026-03-08 19:31:54.943804338 +0000 UTC m=+16.339858441,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 19:32:05 crc kubenswrapper[4885]: > Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.544450 4885 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af492dd9b640b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:38860->192.168.126.11:17697: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:54.944062475 +0000 UTC m=+16.340116538,LastTimestamp:2026-03-08 19:31:54.944062475 +0000 UTC m=+16.340116538,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.550500 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189af48fef09a959\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189af48fef09a959 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:42.351599961 +0000 UTC m=+3.747653994,LastTimestamp:2026-03-08 19:31:55.498344682 +0000 UTC m=+16.894398735,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.559360 4885 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189af49283a15245\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 08 19:32:05 crc kubenswrapper[4885]: &Event{ObjectMeta:{kube-controller-manager-crc.189af49283a15245 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 08 19:32:05 crc kubenswrapper[4885]: body: Mar 08 19:32:05 crc kubenswrapper[4885]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:53.434501701 +0000 UTC m=+14.830555724,LastTimestamp:2026-03-08 19:32:03.435069787 +0000 UTC m=+24.831123860,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 08 19:32:05 crc kubenswrapper[4885]: > Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.565855 4885 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-controller-manager-crc.189af49283a23206\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189af49283a23206 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:31:53.434558982 +0000 UTC m=+14.830613005,LastTimestamp:2026-03-08 19:32:03.435154949 +0000 UTC m=+24.831209012,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:32:05 crc kubenswrapper[4885]: I0308 19:32:05.952075 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:32:05 crc kubenswrapper[4885]: I0308 19:32:05.952300 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:05 crc kubenswrapper[4885]: I0308 19:32:05.953811 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:05 crc kubenswrapper[4885]: I0308 19:32:05.953862 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:05 crc kubenswrapper[4885]: I0308 19:32:05.953881 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:05 crc kubenswrapper[4885]: I0308 19:32:05.954632 4885 scope.go:117] "RemoveContainer" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8" Mar 08 19:32:05 crc kubenswrapper[4885]: E0308 19:32:05.955042 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:32:06 crc kubenswrapper[4885]: I0308 19:32:06.281187 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:06 crc kubenswrapper[4885]: W0308 19:32:06.337711 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 08 19:32:06 crc kubenswrapper[4885]: E0308 19:32:06.337779 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" 
logger="UnhandledError" Mar 08 19:32:07 crc kubenswrapper[4885]: I0308 19:32:07.279081 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:07 crc kubenswrapper[4885]: W0308 19:32:07.601751 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:07 crc kubenswrapper[4885]: E0308 19:32:07.601824 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 08 19:32:08 crc kubenswrapper[4885]: I0308 19:32:08.278739 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:08 crc kubenswrapper[4885]: E0308 19:32:08.344786 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 19:32:08 crc kubenswrapper[4885]: I0308 19:32:08.351567 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:08 crc kubenswrapper[4885]: I0308 19:32:08.353426 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:08 crc kubenswrapper[4885]: I0308 19:32:08.353496 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:08 crc kubenswrapper[4885]: I0308 19:32:08.353525 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:08 crc kubenswrapper[4885]: I0308 19:32:08.353574 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 19:32:08 crc kubenswrapper[4885]: E0308 19:32:08.355344 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 19:32:09 crc kubenswrapper[4885]: I0308 19:32:09.279148 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:09 crc kubenswrapper[4885]: E0308 19:32:09.483980 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 19:32:10 crc kubenswrapper[4885]: W0308 19:32:10.046018 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the 
cluster scope Mar 08 19:32:10 crc kubenswrapper[4885]: E0308 19:32:10.046092 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 08 19:32:10 crc kubenswrapper[4885]: I0308 19:32:10.279090 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.258774 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.258955 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.260248 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.260308 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.260320 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.262635 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.277697 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.554156 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.555565 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.555629 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:11 crc kubenswrapper[4885]: I0308 19:32:11.555653 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:12 crc kubenswrapper[4885]: I0308 19:32:12.277912 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:13 crc kubenswrapper[4885]: I0308 19:32:13.277586 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:14 crc kubenswrapper[4885]: I0308 19:32:14.278643 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:15 crc kubenswrapper[4885]: I0308 19:32:15.278604 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:15 crc kubenswrapper[4885]: E0308 19:32:15.353163 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 19:32:15 crc kubenswrapper[4885]: I0308 19:32:15.356183 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:15 crc kubenswrapper[4885]: I0308 19:32:15.358237 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:15 crc kubenswrapper[4885]: I0308 19:32:15.358319 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:15 crc kubenswrapper[4885]: I0308 19:32:15.358345 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:15 crc kubenswrapper[4885]: I0308 19:32:15.358391 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 19:32:15 crc kubenswrapper[4885]: E0308 19:32:15.360737 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 19:32:16 crc kubenswrapper[4885]: I0308 19:32:16.279277 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:17 crc kubenswrapper[4885]: I0308 19:32:17.274753 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:18 crc kubenswrapper[4885]: I0308 19:32:18.279455 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:19 crc kubenswrapper[4885]: I0308 19:32:19.278229 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:19 crc kubenswrapper[4885]: I0308 19:32:19.367653 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:19 crc kubenswrapper[4885]: I0308 19:32:19.368844 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:19 crc kubenswrapper[4885]: I0308 19:32:19.368937 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:19 crc kubenswrapper[4885]: I0308 19:32:19.368977 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:19 crc kubenswrapper[4885]: I0308 19:32:19.369869 4885 scope.go:117] "RemoveContainer" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8" Mar 08 19:32:19 crc kubenswrapper[4885]: E0308 19:32:19.484244 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 19:32:19 crc kubenswrapper[4885]: I0308 19:32:19.582680 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.275816 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.591667 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.592448 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.596070 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204" exitCode=255 Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.596128 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204"} Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.596216 4885 scope.go:117] "RemoveContainer" containerID="b88f25e1b74d3544e4027c5c554b8c9fbb7882ca688872ecde33075d0f4a3ad8" Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.596540 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.601117 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.601173 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.601196 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:20 crc kubenswrapper[4885]: I0308 19:32:20.602362 4885 scope.go:117] "RemoveContainer" containerID="13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204" Mar 08 19:32:20 crc kubenswrapper[4885]: E0308 19:32:20.602659 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:32:20 crc kubenswrapper[4885]: W0308 19:32:20.604805 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:20 crc kubenswrapper[4885]: E0308 19:32:20.604857 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 08 19:32:21 crc kubenswrapper[4885]: I0308 19:32:21.278555 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:21 crc kubenswrapper[4885]: I0308 19:32:21.602245 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 19:32:22 crc kubenswrapper[4885]: I0308 19:32:22.278347 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:22 crc kubenswrapper[4885]: I0308 19:32:22.361678 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:22 crc kubenswrapper[4885]: E0308 19:32:22.362383 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 19:32:22 crc kubenswrapper[4885]: I0308 19:32:22.363348 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:22 crc kubenswrapper[4885]: I0308 19:32:22.363407 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:22 crc kubenswrapper[4885]: I0308 19:32:22.363426 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:22 crc kubenswrapper[4885]: I0308 19:32:22.363469 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 19:32:22 crc kubenswrapper[4885]: E0308 19:32:22.370032 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 19:32:22 crc kubenswrapper[4885]: W0308 19:32:22.898807 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 08 19:32:22 crc kubenswrapper[4885]: E0308 19:32:22.898880 4885 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 08 19:32:23 crc kubenswrapper[4885]: I0308 19:32:23.279512 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:23 crc kubenswrapper[4885]: I0308 19:32:23.677177 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:32:23 crc kubenswrapper[4885]: I0308 19:32:23.677451 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:23 crc kubenswrapper[4885]: I0308 19:32:23.678987 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:23 crc kubenswrapper[4885]: I0308 19:32:23.679035 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:23 crc kubenswrapper[4885]: I0308 19:32:23.679053 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:23 crc kubenswrapper[4885]: I0308 19:32:23.679720 4885 scope.go:117] "RemoveContainer" containerID="13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204" Mar 08 19:32:23 crc kubenswrapper[4885]: E0308 19:32:23.680015 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:32:24 crc kubenswrapper[4885]: I0308 19:32:24.278511 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:24 crc kubenswrapper[4885]: W0308 19:32:24.687835 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 08 19:32:24 crc kubenswrapper[4885]: E0308 19:32:24.687955 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 08 19:32:25 crc kubenswrapper[4885]: I0308 19:32:25.279369 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:25 crc kubenswrapper[4885]: I0308 19:32:25.952160 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:32:25 crc kubenswrapper[4885]: I0308 19:32:25.952422 4885 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:25 crc kubenswrapper[4885]: I0308 19:32:25.953870 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:25 crc kubenswrapper[4885]: I0308 19:32:25.953934 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:25 crc kubenswrapper[4885]: I0308 19:32:25.953945 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:25 crc kubenswrapper[4885]: I0308 19:32:25.954674 4885 scope.go:117] "RemoveContainer" containerID="13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204" Mar 08 19:32:25 crc kubenswrapper[4885]: E0308 19:32:25.955103 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:32:26 crc kubenswrapper[4885]: I0308 19:32:26.279805 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:27 crc kubenswrapper[4885]: I0308 19:32:27.279791 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:28 crc kubenswrapper[4885]: I0308 19:32:28.278307 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:29 crc kubenswrapper[4885]: I0308 19:32:29.279206 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:29 crc kubenswrapper[4885]: E0308 19:32:29.369838 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 19:32:29 crc kubenswrapper[4885]: I0308 19:32:29.370954 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:29 crc kubenswrapper[4885]: I0308 19:32:29.372546 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:29 crc kubenswrapper[4885]: I0308 19:32:29.372591 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:29 crc kubenswrapper[4885]: I0308 19:32:29.372608 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:29 crc kubenswrapper[4885]: I0308 19:32:29.372642 4885 
kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 19:32:29 crc kubenswrapper[4885]: E0308 19:32:29.379778 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 19:32:29 crc kubenswrapper[4885]: E0308 19:32:29.484669 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 19:32:30 crc kubenswrapper[4885]: I0308 19:32:30.279065 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:31 crc kubenswrapper[4885]: I0308 19:32:31.046283 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 08 19:32:31 crc kubenswrapper[4885]: I0308 19:32:31.046471 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:31 crc kubenswrapper[4885]: I0308 19:32:31.047844 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:31 crc kubenswrapper[4885]: I0308 19:32:31.047902 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:31 crc kubenswrapper[4885]: I0308 19:32:31.047981 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:31 crc kubenswrapper[4885]: I0308 19:32:31.279376 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:32 crc kubenswrapper[4885]: I0308 19:32:32.277698 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:32 crc kubenswrapper[4885]: W0308 19:32:32.398109 4885 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 08 19:32:32 crc kubenswrapper[4885]: E0308 19:32:32.398205 4885 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 08 19:32:33 crc kubenswrapper[4885]: I0308 19:32:33.278584 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:34 crc kubenswrapper[4885]: I0308 19:32:34.279640 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User 
"system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:35 crc kubenswrapper[4885]: I0308 19:32:35.278321 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:36 crc kubenswrapper[4885]: I0308 19:32:36.278642 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:36 crc kubenswrapper[4885]: E0308 19:32:36.372671 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 19:32:36 crc kubenswrapper[4885]: I0308 19:32:36.380335 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:36 crc kubenswrapper[4885]: I0308 19:32:36.381946 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:36 crc kubenswrapper[4885]: I0308 19:32:36.382002 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:36 crc kubenswrapper[4885]: I0308 19:32:36.382021 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:36 crc kubenswrapper[4885]: I0308 19:32:36.382054 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 19:32:36 crc kubenswrapper[4885]: E0308 19:32:36.388330 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 19:32:37 crc kubenswrapper[4885]: I0308 19:32:37.278518 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:37 crc kubenswrapper[4885]: I0308 19:32:37.367653 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:37 crc kubenswrapper[4885]: I0308 19:32:37.370013 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:37 crc kubenswrapper[4885]: I0308 19:32:37.370229 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:37 crc kubenswrapper[4885]: I0308 19:32:37.370419 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:37 crc kubenswrapper[4885]: I0308 19:32:37.371520 4885 scope.go:117] "RemoveContainer" containerID="13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204" Mar 08 19:32:37 crc kubenswrapper[4885]: E0308 19:32:37.372048 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:32:38 crc kubenswrapper[4885]: I0308 19:32:38.277822 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:39 crc kubenswrapper[4885]: I0308 19:32:39.279100 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:39 crc kubenswrapper[4885]: E0308 19:32:39.485687 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 19:32:40 crc kubenswrapper[4885]: I0308 19:32:40.286415 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:41 crc kubenswrapper[4885]: I0308 19:32:41.279345 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:42 crc kubenswrapper[4885]: I0308 19:32:42.278223 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:43 crc kubenswrapper[4885]: I0308 19:32:43.278817 4885 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 08 19:32:43 crc kubenswrapper[4885]: E0308 19:32:43.380966 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 08 19:32:43 crc kubenswrapper[4885]: I0308 19:32:43.389292 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:43 crc kubenswrapper[4885]: I0308 19:32:43.391797 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:43 crc kubenswrapper[4885]: I0308 19:32:43.391857 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:43 crc kubenswrapper[4885]: I0308 19:32:43.391872 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:43 crc kubenswrapper[4885]: I0308 19:32:43.391950 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 19:32:43 crc kubenswrapper[4885]: E0308 19:32:43.398996 4885 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User 
\"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 08 19:32:44 crc kubenswrapper[4885]: I0308 19:32:44.077277 4885 csr.go:261] certificate signing request csr-kbpxr is approved, waiting to be issued Mar 08 19:32:44 crc kubenswrapper[4885]: I0308 19:32:44.085429 4885 csr.go:257] certificate signing request csr-kbpxr is issued Mar 08 19:32:44 crc kubenswrapper[4885]: I0308 19:32:44.100879 4885 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 08 19:32:44 crc kubenswrapper[4885]: I0308 19:32:44.133668 4885 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 08 19:32:45 crc kubenswrapper[4885]: I0308 19:32:45.087378 4885 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-15 13:19:33.108853933 +0000 UTC Mar 08 19:32:45 crc kubenswrapper[4885]: I0308 19:32:45.087435 4885 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6761h46m48.02142347s for next certificate rotation Mar 08 19:32:49 crc kubenswrapper[4885]: E0308 19:32:49.486967 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.367159 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.368482 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.368511 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.368520 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.399543 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.400511 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.400549 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.400559 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.400637 4885 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.407904 4885 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.408158 4885 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.408180 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.411385 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 
19:32:50.411445 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.411455 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.411471 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.411484 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:32:50Z","lastTransitionTime":"2026-03-08T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.432178 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.442680 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.442728 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.442746 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.442771 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.442788 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:32:50Z","lastTransitionTime":"2026-03-08T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.457107 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.465271 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.465311 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.465327 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.465346 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.465368 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:32:50Z","lastTransitionTime":"2026-03-08T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.482362 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.492640 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.492684 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.492695 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.492716 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:32:50 crc kubenswrapper[4885]: I0308 19:32:50.492732 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:32:50Z","lastTransitionTime":"2026-03-08T19:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.509688 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.509910 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.509983 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.610756 4885 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.711677 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.812227 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:50 crc kubenswrapper[4885]: E0308 19:32:50.912811 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.013410 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.113975 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.214424 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.315132 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.368202 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.369694 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.369966 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.370157 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.371435 4885 scope.go:117] "RemoveContainer" containerID="13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.416897 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.517790 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.618294 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.692881 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.696463 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e"} Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.696654 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.698264 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.698327 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:51 crc kubenswrapper[4885]: I0308 19:32:51.698346 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.718644 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.819327 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:51 crc kubenswrapper[4885]: E0308 19:32:51.920282 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.021152 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.121500 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.221901 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.325864 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.426015 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.526755 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.627362 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.702405 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.703271 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.706012 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e" exitCode=255 Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.706072 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e"} Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.706142 4885 scope.go:117] "RemoveContainer" containerID="13b0a255507194474d3f671b720a5ee8e68a78f42e0a8c00a07ac26790802204" Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.706287 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.707678 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.707795 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.707821 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:52 crc kubenswrapper[4885]: I0308 19:32:52.708753 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.709189 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.728110 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.828578 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:52 crc kubenswrapper[4885]: E0308 19:32:52.928891 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.030055 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.130192 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.231216 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.331415 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.432413 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.533216 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.634172 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:53 crc kubenswrapper[4885]: I0308 19:32:53.677777 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:32:53 crc kubenswrapper[4885]: I0308 19:32:53.711518 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 19:32:53 crc kubenswrapper[4885]: I0308 19:32:53.714757 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:53 crc kubenswrapper[4885]: I0308 19:32:53.716322 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 08 19:32:53 crc kubenswrapper[4885]: I0308 19:32:53.716386 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:53 crc kubenswrapper[4885]: I0308 19:32:53.716405 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:53 crc kubenswrapper[4885]: I0308 19:32:53.717630 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e" Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.717988 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.734576 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.835166 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:53 crc kubenswrapper[4885]: E0308 19:32:53.936222 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.036412 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.137351 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.237784 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.338692 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.439770 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.539896 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.640347 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.741136 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.841518 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:54 crc kubenswrapper[4885]: E0308 19:32:54.942171 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.042628 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.143746 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.244195 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.344726 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.445170 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.545489 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.646670 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.747587 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.847885 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.948312 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:55 crc kubenswrapper[4885]: I0308 19:32:55.951655 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:32:55 crc kubenswrapper[4885]: I0308 19:32:55.951999 4885 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 08 19:32:55 crc kubenswrapper[4885]: I0308 19:32:55.956372 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:32:55 crc kubenswrapper[4885]: I0308 19:32:55.956448 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:32:55 crc kubenswrapper[4885]: I0308 19:32:55.956474 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:32:55 crc kubenswrapper[4885]: I0308 19:32:55.959028 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e" Mar 08 19:32:55 crc kubenswrapper[4885]: E0308 19:32:55.959779 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.048460 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.149299 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.250441 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.351033 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.452101 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.552603 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.653745 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.753911 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.854366 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:56 crc kubenswrapper[4885]: E0308 19:32:56.954463 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.055556 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.156206 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.256786 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.357563 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.458364 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.558723 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.659364 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.760509 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.861624 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:57 crc kubenswrapper[4885]: E0308 19:32:57.962825 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.063028 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.163521 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.264486 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.365324 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.466435 4885 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.567842 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.669849 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.770830 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.871895 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:58 crc kubenswrapper[4885]: E0308 19:32:58.972912 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.073299 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.174426 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.275309 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.375868 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.476426 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.487817 4885 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.576749 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.678471 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.779330 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.880416 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:32:59 crc kubenswrapper[4885]: E0308 19:32:59.980717 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.081685 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.182104 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.282883 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.383349 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.483869 4885 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.584253 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.655913 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.663998 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.664095 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.664123 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.664164 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.664192 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:00Z","lastTransitionTime":"2026-03-08T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.681252 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.692416 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.692464 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.692483 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.692510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.692529 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:00Z","lastTransitionTime":"2026-03-08T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.709095 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.719590 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.719678 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.719700 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.719732 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.719758 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:00Z","lastTransitionTime":"2026-03-08T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.735694 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.747420 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.747698 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.748002 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.748263 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:00 crc kubenswrapper[4885]: I0308 19:33:00.748512 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:00Z","lastTransitionTime":"2026-03-08T19:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.766601 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.767318 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.767486 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.867722 4885 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:00 crc kubenswrapper[4885]: E0308 19:33:00.968495 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.069018 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.169214 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.270593 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.371544 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.471661 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.572916 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.673601 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.775279 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.875480 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:01 crc kubenswrapper[4885]: E0308 19:33:01.976994 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.078205 4885 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.161777 4885 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.184623 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.185083 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.185316 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.185507 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.185669 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.289901 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.290339 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.290497 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.290665 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.290804 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.334113 4885 apiserver.go:52] "Watching apiserver" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.347723 4885 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.348815 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt","openshift-multus/multus-additional-cni-plugins-25vxd","openshift-multus/multus-ff7b4","openshift-network-diagnostics/network-check-target-xd92c","openshift-dns/node-resolver-w5lms","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-ovn-kubernetes/ovnkube-node-bssfh","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-image-registry/node-ca-57qch","openshift-machine-config-operator/machine-config-daemon-ttb97","openshift-multus/network-metrics-daemon-jps4r","openshift-network-operator/iptables-alerter-4ln5h"] Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.349545 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.349678 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.350893 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.351013 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w5lms" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.351054 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.351438 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.351579 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.351774 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.351813 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.352067 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.352125 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.352535 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.354227 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.354420 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.354291 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.354599 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.355503 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.355706 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.360615 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.360635 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.361116 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.361317 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.361390 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.361703 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.362046 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.362323 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.362589 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.362709 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.362607 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.362875 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.363149 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.363620 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.363662 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.363854 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.363907 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.363861 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.364026 4885 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"image-registry-certificates" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.364172 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.364274 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.364278 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.364373 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.364781 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365207 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365401 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365412 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365429 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365547 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365614 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365774 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365850 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.365959 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.366129 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.366397 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.366526 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.366660 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.382453 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389246 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-cnibin\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389310 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-log-socket\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389352 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-rootfs\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc 
kubenswrapper[4885]: I0308 19:33:02.389393 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b33b5271-bda3-41ca-81a3-d47fff657c27-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389435 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-system-cni-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389454 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-daemon-config\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389533 4885 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389571 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-config\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389615 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cfac2d6-6888-4b2d-982e-826f583396e8-host\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389654 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr8jn\" (UniqueName: \"kubernetes.io/projected/2f639c4e-64b8-45e9-bf33-c1d8c376b438-kube-api-access-mr8jn\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389684 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389722 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pllt\" (UniqueName: \"kubernetes.io/projected/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-kube-api-access-7pllt\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389756 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-d6kv8\" (UniqueName: \"kubernetes.io/projected/bc890659-71a7-4024-bae6-e1e1ef563f17-kube-api-access-d6kv8\") pod \"node-resolver-w5lms\" (UID: \"bc890659-71a7-4024-bae6-e1e1ef563f17\") " pod="openshift-dns/node-resolver-w5lms" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389789 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-ovn\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389810 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-kubelet\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389834 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-env-overrides\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389879 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389914 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-node-log\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.389967 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-script-lib\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390003 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bpng\" (UniqueName: \"kubernetes.io/projected/b33b5271-bda3-41ca-81a3-d47fff657c27-kube-api-access-2bpng\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390034 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-cni-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 
19:33:02.390059 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-conf-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390086 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc890659-71a7-4024-bae6-e1e1ef563f17-hosts-file\") pod \"node-resolver-w5lms\" (UID: \"bc890659-71a7-4024-bae6-e1e1ef563f17\") " pod="openshift-dns/node-resolver-w5lms" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390109 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-netns\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390137 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390161 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-etc-kubernetes\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390186 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-ovn-kubernetes\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390210 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovn-node-metrics-cert\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390269 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-proxy-tls\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390295 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bssfh\" (UID: 
\"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390327 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390356 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390382 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-socket-dir-parent\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390407 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-hostroot\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390432 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-slash\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390461 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390486 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390507 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-os-release\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390532 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390557 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390582 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390606 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390630 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390681 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-systemd-units\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390702 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-netns\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390746 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-cni-bin\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390769 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-mcd-auth-proxy-config\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 
19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390789 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-k8s-cni-cncf-io\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390827 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-cni-multus\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390851 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-var-lib-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390874 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-bin\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390893 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0cfac2d6-6888-4b2d-982e-826f583396e8-serviceca\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390913 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-system-cni-dir\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390951 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-systemd\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.390977 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391007 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: 
\"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391030 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391057 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-kubelet\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391081 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njr92\" (UniqueName: \"kubernetes.io/projected/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-kube-api-access-njr92\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391103 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391125 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b33b5271-bda3-41ca-81a3-d47fff657c27-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391145 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-cni-binary-copy\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391173 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mlvt\" (UniqueName: \"kubernetes.io/projected/dedec2a4-d864-4f30-8a2d-b3168817ea34-kube-api-access-5mlvt\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391195 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-netd\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391216 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r95ct\" (UniqueName: \"kubernetes.io/projected/0cfac2d6-6888-4b2d-982e-826f583396e8-kube-api-access-r95ct\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391239 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b33b5271-bda3-41ca-81a3-d47fff657c27-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391261 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-multus-certs\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.391284 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-etc-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.391751 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.392148 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.392608 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:02.892106735 +0000 UTC m=+84.288160798 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.392641 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:02.892625409 +0000 UTC m=+84.288679472 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.393058 4885 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.393693 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.394554 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.395459 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.397627 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.399866 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.399908 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.399937 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.399958 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.399972 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.406410 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.411201 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.411252 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.411277 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.411356 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:02.911331532 +0000 UTC m=+84.307385595 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.415173 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.415213 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.415232 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.415297 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:02.91527818 +0000 UTC m=+84.311332243 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.415435 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.417239 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.421797 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.424154 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.425107 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.432399 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.457383 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.464121 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.471182 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.485764 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-a0b4213f6e1b55ac850ea4fd9ceaff623bc4fbc0699f7f8abd2524235c6c34a3 WatchSource:0}: Error finding container a0b4213f6e1b55ac850ea4fd9ceaff623bc4fbc0699f7f8abd2524235c6c34a3: Status 404 returned error can't find the container with id a0b4213f6e1b55ac850ea4fd9ceaff623bc4fbc0699f7f8abd2524235c6c34a3 Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.489833 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491481 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491522 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491543 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491566 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491593 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491613 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491633 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491653 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491678 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491700 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491724 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491743 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491764 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491786 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491809 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491829 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491883 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491909 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.491990 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492052 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492076 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492118 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493022 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493052 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493078 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493102 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493124 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493146 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493168 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493189 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493218 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493240 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493263 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493356 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493382 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493403 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493425 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493446 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493478 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493500 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493527 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493554 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493574 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493598 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493624 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493651 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493672 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493693 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493717 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493740 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493766 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 08 19:33:02 
crc kubenswrapper[4885]: I0308 19:33:02.493789 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493813 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493836 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493861 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493886 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.493856 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -f "/env/_master" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: source "/env/_master" Mar 08 19:33:02 crc kubenswrapper[4885]: set +o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 08 19:33:02 crc kubenswrapper[4885]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 08 19:33:02 crc kubenswrapper[4885]: ho_enable="--enable-hybrid-overlay" Mar 08 19:33:02 crc kubenswrapper[4885]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 08 19:33:02 crc kubenswrapper[4885]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 08 19:33:02 crc kubenswrapper[4885]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --webhook-host=127.0.0.1 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --webhook-port=9743 \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${ho_enable} \ Mar 08 19:33:02 crc kubenswrapper[4885]: --enable-interconnect \ Mar 08 19:33:02 crc kubenswrapper[4885]: --disable-approver \ Mar 08 19:33:02 crc kubenswrapper[4885]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --wait-for-kubernetes-api=200s \ Mar 08 19:33:02 crc kubenswrapper[4885]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --loglevel="${LOGLEVEL}" Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct 
envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493913 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494032 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494073 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494118 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494161 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494204 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494240 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494277 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494321 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494357 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494398 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494435 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494471 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494509 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494544 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494578 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494614 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494651 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494690 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494729 
4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494763 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494796 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494833 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494866 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494899 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495003 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495048 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495083 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495128 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495167 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495205 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495245 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495280 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495317 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495354 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495387 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495422 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495680 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495763 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495802 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495840 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495875 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495909 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496112 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496158 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496201 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496249 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496293 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496336 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492155 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496381 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492198 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496430 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492531 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492615 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492704 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.492968 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496479 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496526 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496571 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496613 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496657 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496700 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496744 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496819 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496856 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496896 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496954 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496993 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497030 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497069 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497110 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497149 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497185 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497224 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497262 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497300 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497337 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497374 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497409 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497445 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497484 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497524 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.498987 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499030 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499068 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499107 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499236 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499279 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499314 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499350 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499387 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499425 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499460 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499495 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499537 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499581 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499617 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499655 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499692 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499726 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499765 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499801 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499837 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499874 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499915 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 
19:33:02.499979 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500019 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500059 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500096 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500139 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500177 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500214 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500253 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500291 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500332 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500372 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500410 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500450 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500489 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500529 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500569 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500608 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501090 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501143 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501182 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501218 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501293 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501335 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501378 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501421 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501460 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501497 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501538 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501582 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501620 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501660 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501695 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501730 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501768 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501806 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501841 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501885 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501952 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501990 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502112 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-cni-bin\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502166 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ac600107-0c97-4ec8-89f6-598b40c166ee-cni-binary-copy\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502210 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-systemd-units\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502539 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-netns\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502628 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-var-lib-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502671 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-mcd-auth-proxy-config\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502712 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-k8s-cni-cncf-io\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502749 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-cni-multus\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502790 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-systemd\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502827 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-bin\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502904 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0cfac2d6-6888-4b2d-982e-826f583396e8-serviceca\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.503022 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-system-cni-dir\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.503693 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-os-release\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.504674 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-kubelet\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.504724 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505029 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mlvt\" (UniqueName: \"kubernetes.io/projected/dedec2a4-d864-4f30-8a2d-b3168817ea34-kube-api-access-5mlvt\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.504075 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505558 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505569 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505587 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505598 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.506073 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.506003 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njr92\" (UniqueName: \"kubernetes.io/projected/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-kube-api-access-njr92\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.506458 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.508962 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/b33b5271-bda3-41ca-81a3-d47fff657c27-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509078 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-cni-binary-copy\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509178 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-etc-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509279 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-netd\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509357 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r95ct\" (UniqueName: \"kubernetes.io/projected/0cfac2d6-6888-4b2d-982e-826f583396e8-kube-api-access-r95ct\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509397 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b33b5271-bda3-41ca-81a3-d47fff657c27-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509642 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-multus-certs\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493182 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493266 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493568 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493554 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493611 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.493976 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494109 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494163 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494187 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494382 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509905 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494454 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.494536 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495049 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495729 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.495823 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496355 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496499 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496945 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.496968 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497611 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497852 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497958 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497952 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.498069 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.497646 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.498408 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.498760 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.498770 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.498806 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499506 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.499513 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500001 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500040 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500451 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500471 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.500461 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.501849 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502296 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502311 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.502897 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.503025 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.503256 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.503383 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.503482 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.503399 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.504343 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.504426 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). 
InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.504492 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505057 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505652 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505792 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505814 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505835 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505856 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.505862 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.506600 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.507086 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.507344 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.508050 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.508512 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.508590 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.508955 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509490 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). 
InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509610 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.510464 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.510517 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.510532 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.510856 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.511084 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.511165 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.511263 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). 
InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.511686 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.512686 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.512989 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.513112 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.513223 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.513195 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.513571 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.513909 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). 
InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.514024 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.514209 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.514236 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.514310 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.514488 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.514578 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.514429 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b33b5271-bda3-41ca-81a3-d47fff657c27-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.515908 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.515996 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.516054 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-etc-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.516174 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-netd\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.516168 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.517028 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.517115 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.517218 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.517674 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.517695 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.517716 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:33:03.017666915 +0000 UTC m=+84.413720948 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.517806 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-multus-certs\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.518146 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.518297 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.518867 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.519132 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.519148 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.519493 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.519509 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.519550 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.519975 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.520065 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.520293 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.520398 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.520753 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-cni-binary-copy\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521179 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521496 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b33b5271-bda3-41ca-81a3-d47fff657c27-env-overrides\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521588 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521607 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521625 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521627 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521796 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521913 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.521996 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.522147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.522337 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.522724 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). 
InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.523009 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.524601 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.525246 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.525476 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-systemd-units\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.525521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-netns\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.525568 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-var-lib-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.526251 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-run-k8s-cni-cncf-io\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.526298 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-cni-multus\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.526340 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-systemd\") pod 
\"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.526389 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-bin\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.527599 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.527432 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b33b5271-bda3-41ca-81a3-d47fff657c27-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.525227 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-cni-bin\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.527715 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-system-cni-dir\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.527865 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-kubelet\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.509751 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b33b5271-bda3-41ca-81a3-d47fff657c27-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528012 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-cnibin\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528091 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zkpk\" (UniqueName: 
\"kubernetes.io/projected/ac600107-0c97-4ec8-89f6-598b40c166ee-kube-api-access-2zkpk\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528157 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-log-socket\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528218 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-rootfs\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528282 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr8jn\" (UniqueName: \"kubernetes.io/projected/2f639c4e-64b8-45e9-bf33-c1d8c376b438-kube-api-access-mr8jn\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528305 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528347 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-system-cni-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528409 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-daemon-config\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528469 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac600107-0c97-4ec8-89f6-598b40c166ee-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528532 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-config\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528557 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528597 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cfac2d6-6888-4b2d-982e-826f583396e8-host\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528656 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-ovn\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528716 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528804 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pllt\" (UniqueName: \"kubernetes.io/projected/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-kube-api-access-7pllt\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528819 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-mcd-auth-proxy-config\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.528896 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-cnibin\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529035 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6kv8\" (UniqueName: 
\"kubernetes.io/projected/bc890659-71a7-4024-bae6-e1e1ef563f17-kube-api-access-d6kv8\") pod \"node-resolver-w5lms\" (UID: \"bc890659-71a7-4024-bae6-e1e1ef563f17\") " pod="openshift-dns/node-resolver-w5lms" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529321 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-kubelet\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529386 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-ovn\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529388 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-env-overrides\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529450 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-netns\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529506 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-node-log\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529622 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-script-lib\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529690 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bpng\" (UniqueName: \"kubernetes.io/projected/b33b5271-bda3-41ca-81a3-d47fff657c27-kube-api-access-2bpng\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529707 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-cnibin\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529752 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-cni-dir\") pod \"multus-ff7b4\" (UID: 
\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529811 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-conf-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529867 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc890659-71a7-4024-bae6-e1e1ef563f17-hosts-file\") pod \"node-resolver-w5lms\" (UID: \"bc890659-71a7-4024-bae6-e1e1ef563f17\") " pod="openshift-dns/node-resolver-w5lms" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.529961 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-proxy-tls\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530278 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530535 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-netns\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530534 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-etc-kubernetes\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530588 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-tuning-conf-dir\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530601 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-etc-kubernetes\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530622 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-ovn-kubernetes\") pod \"ovnkube-node-bssfh\" (UID: 
\"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530651 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovn-node-metrics-cert\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530676 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-slash\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530702 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530747 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-socket-dir-parent\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-hostroot\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530801 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-os-release\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530895 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.530996 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-openvswitch\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531033 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-config\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc 
kubenswrapper[4885]: I0308 19:33:02.531096 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-log-socket\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531123 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-node-log\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531167 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-host-var-lib-kubelet\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531484 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-rootfs\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531489 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-system-cni-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531549 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-cni-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531588 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bc890659-71a7-4024-bae6-e1e1ef563f17-hosts-file\") pod \"node-resolver-w5lms\" (UID: \"bc890659-71a7-4024-bae6-e1e1ef563f17\") " pod="openshift-dns/node-resolver-w5lms" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.531625 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.531717 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:03.031687159 +0000 UTC m=+84.427741392 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531743 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-script-lib\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531764 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-env-overrides\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531836 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-ovn-kubernetes\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.531612 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-conf-dir\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.533301 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-slash\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.533311 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-daemon-config\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.533323 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.533453 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-multus-socket-dir-parent\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.533514 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-hostroot\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.533595 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-os-release\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.534168 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r95ct\" (UniqueName: \"kubernetes.io/projected/0cfac2d6-6888-4b2d-982e-826f583396e8-kube-api-access-r95ct\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.534224 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0cfac2d6-6888-4b2d-982e-826f583396e8-host\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.534261 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535249 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535313 4885 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535350 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535382 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535410 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535438 4885 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535467 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc 
kubenswrapper[4885]: I0308 19:33:02.535495 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535520 4885 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535552 4885 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535579 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535610 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535640 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535668 4885 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535698 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535729 4885 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535759 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535792 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535820 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535847 4885 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 
19:33:02.535875 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535905 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535967 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.535997 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536025 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536231 4885 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536263 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536277 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536290 4885 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536307 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536321 4885 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536336 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536427 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc 
kubenswrapper[4885]: I0308 19:33:02.536446 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536460 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536479 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536494 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536509 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536522 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.536536 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.537562 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovn-node-metrics-cert\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.539743 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0cfac2d6-6888-4b2d-982e-826f583396e8-serviceca\") pod \"node-ca-57qch\" (UID: \"0cfac2d6-6888-4b2d-982e-826f583396e8\") " pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547782 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547815 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547830 4885 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: 
I0308 19:33:02.547846 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547862 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547877 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547895 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547908 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547941 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547957 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547968 4885 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547981 4885 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.547994 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548006 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548024 4885 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548038 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548053 4885 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548039 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548066 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548132 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548146 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548158 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548170 4885 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548182 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548194 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548205 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548224 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548237 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548254 4885 reconciler_common.go:293] "Volume detached for 
volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548266 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548276 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548292 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548303 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548313 4885 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548324 4885 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548333 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548343 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548353 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548362 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548373 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548448 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548777 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-proxy-tls\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.548894 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njr92\" (UniqueName: \"kubernetes.io/projected/3c5dda3b-3e01-4bb4-af02-b0f4eeadda58-kube-api-access-njr92\") pod \"machine-config-daemon-ttb97\" (UID: \"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\") " pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.550666 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.551009 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mlvt\" (UniqueName: \"kubernetes.io/projected/dedec2a4-d864-4f30-8a2d-b3168817ea34-kube-api-access-5mlvt\") pod \"ovnkube-node-bssfh\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.551155 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -f "/env/_master" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: source "/env/_master" Mar 08 19:33:02 crc kubenswrapper[4885]: set +o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --disable-webhook \ Mar 08 19:33:02 crc kubenswrapper[4885]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --loglevel="${LOGLEVEL}" Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.551384 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.551873 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.551903 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.552009 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.552263 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.552327 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.552820 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.552969 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.552955 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bpng\" (UniqueName: \"kubernetes.io/projected/b33b5271-bda3-41ca-81a3-d47fff657c27-kube-api-access-2bpng\") pod \"ovnkube-control-plane-749d76644c-t2brt\" (UID: \"b33b5271-bda3-41ca-81a3-d47fff657c27\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.553028 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.553595 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.553733 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.553822 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554001 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554018 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554027 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554209 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554340 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554411 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554410 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554496 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554562 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6kv8\" (UniqueName: \"kubernetes.io/projected/bc890659-71a7-4024-bae6-e1e1ef563f17-kube-api-access-d6kv8\") pod \"node-resolver-w5lms\" (UID: \"bc890659-71a7-4024-bae6-e1e1ef563f17\") " pod="openshift-dns/node-resolver-w5lms" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554583 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554851 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554974 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.554987 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.555300 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.555484 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.555327 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.555982 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.556159 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr8jn\" (UniqueName: \"kubernetes.io/projected/2f639c4e-64b8-45e9-bf33-c1d8c376b438-kube-api-access-mr8jn\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.556716 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.557056 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.557244 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.557743 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.557890 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558048 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558072 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558155 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558250 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558300 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558352 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558428 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.558394 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.559159 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pllt\" (UniqueName: \"kubernetes.io/projected/9ac72c25-d3e6-4dda-8444-6cd4442af7e4-kube-api-access-7pllt\") pod \"multus-ff7b4\" (UID: \"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\") " pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.559323 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.559334 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.559580 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.560080 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.560603 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.560801 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.560956 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.560983 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.561413 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.562127 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.562224 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.562513 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.562680 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.562790 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.562852 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.562891 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.563036 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.563274 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.563302 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.563720 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.563911 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.564052 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.569884 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.573071 4885 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.578652 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 
19:33:02.586200 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.595912 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.595997 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.601962 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.608494 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.608527 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.608539 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.608557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.608570 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650028 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-tuning-conf-dir\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650167 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ac600107-0c97-4ec8-89f6-598b40c166ee-cni-binary-copy\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650245 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-os-release\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650419 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zkpk\" (UniqueName: \"kubernetes.io/projected/ac600107-0c97-4ec8-89f6-598b40c166ee-kube-api-access-2zkpk\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650517 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac600107-0c97-4ec8-89f6-598b40c166ee-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650581 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-cnibin\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650689 4885 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650721 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650744 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650765 4885 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650785 4885 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650803 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650822 4885 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650842 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650861 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650881 4885 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650900 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650954 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650974 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.650993 4885 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651012 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651032 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 08 
19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651103 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-cnibin\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651106 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651155 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651176 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651194 4885 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651211 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651227 4885 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651243 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651261 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651285 4885 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651303 4885 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652100 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652134 4885 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652155 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652177 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651116 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-tuning-conf-dir\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652210 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652231 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652254 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652277 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652298 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652319 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.651323 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ac600107-0c97-4ec8-89f6-598b40c166ee-os-release\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652340 4885 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652360 4885 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652267 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ac600107-0c97-4ec8-89f6-598b40c166ee-cni-binary-copy\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652202 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ac600107-0c97-4ec8-89f6-598b40c166ee-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652439 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652516 4885 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652541 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652560 4885 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652617 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652638 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652691 4885 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652789 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652811 4885 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652830 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652883 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652903 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.652995 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653022 4885 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653084 4885 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653105 4885 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653128 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653185 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653203 4885 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653220 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653276 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653294 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653312 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653373 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653392 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653412 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653468 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653485 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653503 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653559 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653576 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653595 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653653 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653673 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653690 4885 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653750 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653768 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653785 4885 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653842 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653859 4885 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653877 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.653993 4885 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654023 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654086 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654103 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654123 4885 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654183 4885 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654202 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654257 4885 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654276 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654293 4885 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654345 4885 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654367 4885 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654385 4885 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654402 4885 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654419 4885 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654438 4885 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654455 4885 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654515 4885 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654533 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654613 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654631 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" 
DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654648 4885 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654665 4885 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654682 4885 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654700 4885 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654717 4885 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654734 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654750 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654802 4885 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654822 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654839 4885 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654855 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.654905 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.655312 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node 
\"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.655341 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.655359 4885 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.673855 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zkpk\" (UniqueName: \"kubernetes.io/projected/ac600107-0c97-4ec8-89f6-598b40c166ee-kube-api-access-2zkpk\") pod \"multus-additional-cni-plugins-25vxd\" (UID: \"ac600107-0c97-4ec8-89f6-598b40c166ee\") " pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.685488 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.699786 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-w5lms" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.706161 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-32129bf4ebb6810c5df5bbbd2b59359e58530376cc22115b360aa8fa3d75043c WatchSource:0}: Error finding container 32129bf4ebb6810c5df5bbbd2b59359e58530376cc22115b360aa8fa3d75043c: Status 404 returned error can't find the container with id 32129bf4ebb6810c5df5bbbd2b59359e58530376cc22115b360aa8fa3d75043c Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.711895 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.711967 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.711988 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.712014 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.712033 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.712245 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 08 19:33:02 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: source /etc/kubernetes/apiserver-url.env Mar 08 19:33:02 crc kubenswrapper[4885]: else Mar 08 19:33:02 crc kubenswrapper[4885]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 08 19:33:02 crc kubenswrapper[4885]: exit 1 Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.713521 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.725353 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc890659_71a7_4024_bae6_e1e1ef563f17.slice/crio-618bff7c84ae8c1591d90608b9d6fb9bb1e6dfda38c21c153f9db9b66aa40ae8 
WatchSource:0}: Error finding container 618bff7c84ae8c1591d90608b9d6fb9bb1e6dfda38c21c153f9db9b66aa40ae8: Status 404 returned error can't find the container with id 618bff7c84ae8c1591d90608b9d6fb9bb1e6dfda38c21c153f9db9b66aa40ae8 Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.728940 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 08 19:33:02 crc kubenswrapper[4885]: set -uo pipefail Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 08 19:33:02 crc kubenswrapper[4885]: HOSTS_FILE="/etc/hosts" Mar 08 19:33:02 crc kubenswrapper[4885]: TEMP_FILE="/etc/hosts.tmp" Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # Make a temporary file with the old hosts file's attributes. Mar 08 19:33:02 crc kubenswrapper[4885]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 08 19:33:02 crc kubenswrapper[4885]: echo "Failed to preserve hosts file. Exiting." Mar 08 19:33:02 crc kubenswrapper[4885]: exit 1 Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: while true; do Mar 08 19:33:02 crc kubenswrapper[4885]: declare -A svc_ips Mar 08 19:33:02 crc kubenswrapper[4885]: for svc in "${services[@]}"; do Mar 08 19:33:02 crc kubenswrapper[4885]: # Fetch service IP from cluster dns if present. We make several tries Mar 08 19:33:02 crc kubenswrapper[4885]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 08 19:33:02 crc kubenswrapper[4885]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 08 19:33:02 crc kubenswrapper[4885]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 08 19:33:02 crc kubenswrapper[4885]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 19:33:02 crc kubenswrapper[4885]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 19:33:02 crc kubenswrapper[4885]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 19:33:02 crc kubenswrapper[4885]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 08 19:33:02 crc kubenswrapper[4885]: for i in ${!cmds[*]} Mar 08 19:33:02 crc kubenswrapper[4885]: do Mar 08 19:33:02 crc kubenswrapper[4885]: ips=($(eval "${cmds[i]}")) Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: svc_ips["${svc}"]="${ips[@]}" Mar 08 19:33:02 crc kubenswrapper[4885]: break Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # Update /etc/hosts only if we get valid service IPs Mar 08 19:33:02 crc kubenswrapper[4885]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 08 19:33:02 crc kubenswrapper[4885]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 08 19:33:02 crc kubenswrapper[4885]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 08 19:33:02 crc kubenswrapper[4885]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 60 & wait Mar 08 19:33:02 crc kubenswrapper[4885]: continue Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # Append resolver entries for services Mar 08 19:33:02 crc kubenswrapper[4885]: rc=0 Mar 08 19:33:02 crc kubenswrapper[4885]: for svc in "${!svc_ips[@]}"; do Mar 08 19:33:02 crc kubenswrapper[4885]: for ip in ${svc_ips[${svc}]}; do Mar 08 19:33:02 crc kubenswrapper[4885]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ $rc -ne 0 ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 60 & wait Mar 08 19:33:02 crc kubenswrapper[4885]: continue Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 08 19:33:02 crc kubenswrapper[4885]: # Replace /etc/hosts with our modified version if needed Mar 08 19:33:02 crc kubenswrapper[4885]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 08 19:33:02 crc kubenswrapper[4885]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 60 & wait Mar 08 19:33:02 crc kubenswrapper[4885]: unset svc_ips Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d6kv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-w5lms_openshift-dns(bc890659-71a7-4024-bae6-e1e1ef563f17): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.730111 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-w5lms" podUID="bc890659-71a7-4024-bae6-e1e1ef563f17" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.743621 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.744351 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w5lms" event={"ID":"bc890659-71a7-4024-bae6-e1e1ef563f17","Type":"ContainerStarted","Data":"618bff7c84ae8c1591d90608b9d6fb9bb1e6dfda38c21c153f9db9b66aa40ae8"} Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.746269 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 08 19:33:02 crc kubenswrapper[4885]: set -uo pipefail Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 08 19:33:02 crc kubenswrapper[4885]: HOSTS_FILE="/etc/hosts" Mar 08 19:33:02 crc kubenswrapper[4885]: TEMP_FILE="/etc/hosts.tmp" Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # Make a temporary file with the old hosts file's attributes. Mar 08 19:33:02 crc kubenswrapper[4885]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 08 19:33:02 crc kubenswrapper[4885]: echo "Failed to preserve hosts file. Exiting." 
Mar 08 19:33:02 crc kubenswrapper[4885]: exit 1 Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: while true; do Mar 08 19:33:02 crc kubenswrapper[4885]: declare -A svc_ips Mar 08 19:33:02 crc kubenswrapper[4885]: for svc in "${services[@]}"; do Mar 08 19:33:02 crc kubenswrapper[4885]: # Fetch service IP from cluster dns if present. We make several tries Mar 08 19:33:02 crc kubenswrapper[4885]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 08 19:33:02 crc kubenswrapper[4885]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 08 19:33:02 crc kubenswrapper[4885]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 08 19:33:02 crc kubenswrapper[4885]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 19:33:02 crc kubenswrapper[4885]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 19:33:02 crc kubenswrapper[4885]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 08 19:33:02 crc kubenswrapper[4885]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 08 19:33:02 crc kubenswrapper[4885]: for i in ${!cmds[*]} Mar 08 19:33:02 crc kubenswrapper[4885]: do Mar 08 19:33:02 crc kubenswrapper[4885]: ips=($(eval "${cmds[i]}")) Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: svc_ips["${svc}"]="${ips[@]}" Mar 08 19:33:02 crc kubenswrapper[4885]: break Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # Update /etc/hosts only if we get valid service IPs Mar 08 19:33:02 crc kubenswrapper[4885]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 08 19:33:02 crc kubenswrapper[4885]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 08 19:33:02 crc kubenswrapper[4885]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 08 19:33:02 crc kubenswrapper[4885]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 60 & wait Mar 08 19:33:02 crc kubenswrapper[4885]: continue Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # Append resolver entries for services Mar 08 19:33:02 crc kubenswrapper[4885]: rc=0 Mar 08 19:33:02 crc kubenswrapper[4885]: for svc in "${!svc_ips[@]}"; do Mar 08 19:33:02 crc kubenswrapper[4885]: for ip in ${svc_ips[${svc}]}; do Mar 08 19:33:02 crc kubenswrapper[4885]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ $rc -ne 0 ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 60 & wait Mar 08 19:33:02 crc kubenswrapper[4885]: continue Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 08 19:33:02 crc kubenswrapper[4885]: # Replace /etc/hosts with our modified version if needed Mar 08 19:33:02 crc kubenswrapper[4885]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 08 19:33:02 crc kubenswrapper[4885]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 60 & wait Mar 08 19:33:02 crc kubenswrapper[4885]: unset svc_ips Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d6kv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-w5lms_openshift-dns(bc890659-71a7-4024-bae6-e1e1ef563f17): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.746418 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"32129bf4ebb6810c5df5bbbd2b59359e58530376cc22115b360aa8fa3d75043c"} Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.747469 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-w5lms" podUID="bc890659-71a7-4024-bae6-e1e1ef563f17" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.749558 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a0b4213f6e1b55ac850ea4fd9ceaff623bc4fbc0699f7f8abd2524235c6c34a3"} Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.752049 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 08 19:33:02 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: source /etc/kubernetes/apiserver-url.env Mar 08 19:33:02 crc kubenswrapper[4885]: else Mar 08 19:33:02 crc kubenswrapper[4885]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 08 19:33:02 crc kubenswrapper[4885]: exit 1 Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-d
ev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.752782 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -f "/env/_master" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: source 
"/env/_master" Mar 08 19:33:02 crc kubenswrapper[4885]: set +o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 08 19:33:02 crc kubenswrapper[4885]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 08 19:33:02 crc kubenswrapper[4885]: ho_enable="--enable-hybrid-overlay" Mar 08 19:33:02 crc kubenswrapper[4885]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 08 19:33:02 crc kubenswrapper[4885]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 08 19:33:02 crc kubenswrapper[4885]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --webhook-host=127.0.0.1 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --webhook-port=9743 \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${ho_enable} \ Mar 08 19:33:02 crc kubenswrapper[4885]: --enable-interconnect \ Mar 08 19:33:02 crc kubenswrapper[4885]: --disable-approver \ Mar 08 19:33:02 crc kubenswrapper[4885]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --wait-for-kubernetes-api=200s \ Mar 08 19:33:02 crc kubenswrapper[4885]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --loglevel="${LOGLEVEL}" Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.753253 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.754540 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-ff7b4" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.760018 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -f "/env/_master" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: source "/env/_master" Mar 08 19:33:02 crc kubenswrapper[4885]: set +o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --disable-webhook \ Mar 08 19:33:02 crc kubenswrapper[4885]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --loglevel="${LOGLEVEL}" Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.761785 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.762678 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.766914 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-91535920cc74a837cdd04ae33741adccd66ba5dbbe0b8079fc1da551b4b2ffc8 WatchSource:0}: Error finding container 91535920cc74a837cdd04ae33741adccd66ba5dbbe0b8079fc1da551b4b2ffc8: Status 404 returned error can't find the container with id 91535920cc74a837cdd04ae33741adccd66ba5dbbe0b8079fc1da551b4b2ffc8 Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.772411 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.772610 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.773829 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.780864 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ac72c25_d3e6_4dda_8444_6cd4442af7e4.slice/crio-b756091d164726bb7b3357e4fa113e037011faab93456cc52a0ef3704483935e WatchSource:0}: Error finding container b756091d164726bb7b3357e4fa113e037011faab93456cc52a0ef3704483935e: Status 404 returned error can't find the container with id b756091d164726bb7b3357e4fa113e037011faab93456cc52a0ef3704483935e Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.781671 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.792957 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 08 19:33:02 crc kubenswrapper[4885]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 08 19:33:02 crc kubenswrapper[4885]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/mul
tus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7pllt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-ff7b4_openshift-multus(9ac72c25-d3e6-4dda-8444-6cd4442af7e4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.794953 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-ff7b4" podUID="9ac72c25-d3e6-4dda-8444-6cd4442af7e4" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.796476 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.802137 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.811441 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 08 19:33:02 crc kubenswrapper[4885]: apiVersion: v1 Mar 08 19:33:02 crc kubenswrapper[4885]: clusters: Mar 08 19:33:02 crc kubenswrapper[4885]: - cluster: Mar 08 19:33:02 crc kubenswrapper[4885]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 08 19:33:02 crc kubenswrapper[4885]: server: https://api-int.crc.testing:6443 Mar 08 19:33:02 crc kubenswrapper[4885]: name: default-cluster Mar 08 19:33:02 crc kubenswrapper[4885]: contexts: Mar 08 19:33:02 crc kubenswrapper[4885]: - context: Mar 08 19:33:02 crc kubenswrapper[4885]: cluster: default-cluster Mar 08 19:33:02 crc kubenswrapper[4885]: namespace: default Mar 08 19:33:02 crc kubenswrapper[4885]: user: default-auth Mar 08 19:33:02 crc kubenswrapper[4885]: name: default-context Mar 08 19:33:02 crc kubenswrapper[4885]: current-context: default-context Mar 08 19:33:02 crc kubenswrapper[4885]: kind: Config Mar 08 19:33:02 crc kubenswrapper[4885]: preferences: {} Mar 08 19:33:02 crc kubenswrapper[4885]: users: Mar 08 19:33:02 crc kubenswrapper[4885]: - name: default-auth Mar 08 19:33:02 crc kubenswrapper[4885]: user: Mar 08 19:33:02 crc kubenswrapper[4885]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 08 19:33:02 crc kubenswrapper[4885]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 08 19:33:02 crc kubenswrapper[4885]: EOF Mar 08 19:33:02 crc kubenswrapper[4885]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5mlvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.811730 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.812559 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.814879 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.815091 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.815115 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.815146 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.815167 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.817192 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.820584 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb33b5271_bda3_41ca_81a3_d47fff657c27.slice/crio-f3227465f55346847f5550884a45b9b226294df4f4196094a95b4362ce78201b WatchSource:0}: Error finding container f3227465f55346847f5550884a45b9b226294df4f4196094a95b4362ce78201b: Status 404 returned error can't find the container with id f3227465f55346847f5550884a45b9b226294df4f4196094a95b4362ce78201b Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.824706 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.827959 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 08 19:33:02 crc kubenswrapper[4885]: set -euo pipefail Mar 08 19:33:02 crc kubenswrapper[4885]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 08 19:33:02 crc kubenswrapper[4885]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 08 19:33:02 crc kubenswrapper[4885]: # As the secret mount is optional we must wait for the files to be present. Mar 08 19:33:02 crc kubenswrapper[4885]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 08 19:33:02 crc kubenswrapper[4885]: TS=$(date +%s) Mar 08 19:33:02 crc kubenswrapper[4885]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 08 19:33:02 crc kubenswrapper[4885]: HAS_LOGGED_INFO=0 Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: log_missing_certs(){ Mar 08 19:33:02 crc kubenswrapper[4885]: CUR_TS=$(date +%s) Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. 
Mar 08 19:33:02 crc kubenswrapper[4885]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 08 19:33:02 crc kubenswrapper[4885]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 08 19:33:02 crc kubenswrapper[4885]: HAS_LOGGED_INFO=1 Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: } Mar 08 19:33:02 crc kubenswrapper[4885]: while [[ ! -f "${TLS_PK}" || ! -f "${TLS_CERT}" ]] ; do Mar 08 19:33:02 crc kubenswrapper[4885]: log_missing_certs Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 5 Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/kube-rbac-proxy \ Mar 08 19:33:02 crc kubenswrapper[4885]: --logtostderr \ Mar 08 19:33:02 crc kubenswrapper[4885]: --secure-listen-address=:9108 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 08 19:33:02 crc kubenswrapper[4885]: --upstream=http://127.0.0.1:29108/ \ Mar 08 19:33:02 crc kubenswrapper[4885]: --tls-private-key-file=${TLS_PK} \ Mar 08 19:33:02 crc kubenswrapper[4885]: --tls-cert-file=${TLS_CERT} Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2bpng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-t2brt_openshift-ovn-kubernetes(b33b5271-bda3-41ca-81a3-d47fff657c27): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.831778 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ -f "/env/_master" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: source "/env/_master" Mar 08 19:33:02 crc kubenswrapper[4885]: set +o allexport Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v4_join_subnet_opt= Mar 08 19:33:02 crc kubenswrapper[4885]: 
if [[ "" != "" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v6_join_subnet_opt= Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "" != "" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v4_transit_switch_subnet_opt= Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "" != "" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v6_transit_switch_subnet_opt= Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "" != "" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: dns_name_resolver_enabled_flag= Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "false" == "true" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: persistent_ips_enabled_flag= Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "true" == "true" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: # This is needed so that converting clusters from GA to TP Mar 08 19:33:02 crc kubenswrapper[4885]: # will rollout control plane pods as well Mar 08 19:33:02 crc kubenswrapper[4885]: network_segmentation_enabled_flag= Mar 08 19:33:02 crc kubenswrapper[4885]: multi_network_enabled_flag= Mar 08 19:33:02 crc kubenswrapper[4885]: if [[ "true" == "true" ]]; then Mar 08 19:33:02 crc kubenswrapper[4885]: multi_network_enabled_flag="--enable-multi-network" Mar 08 19:33:02 crc kubenswrapper[4885]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: Mar 08 19:33:02 crc kubenswrapper[4885]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 08 19:33:02 crc kubenswrapper[4885]: exec /usr/bin/ovnkube \ Mar 08 19:33:02 crc kubenswrapper[4885]: --enable-interconnect \ Mar 08 19:33:02 crc kubenswrapper[4885]: --init-cluster-manager "${K8S_NODE}" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 08 19:33:02 crc kubenswrapper[4885]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --metrics-bind-address "127.0.0.1:29108" \ Mar 08 19:33:02 crc kubenswrapper[4885]: --metrics-enable-pprof \ Mar 08 19:33:02 crc kubenswrapper[4885]: --metrics-enable-config-duration \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${ovn_v4_join_subnet_opt} \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${ovn_v6_join_subnet_opt} \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 08 19:33:02 crc kubenswrapper[4885]: 
${dns_name_resolver_enabled_flag} \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${persistent_ips_enabled_flag} \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${multi_network_enabled_flag} \ Mar 08 19:33:02 crc kubenswrapper[4885]: ${network_segmentation_enabled_flag} Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2bpng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-t2brt_openshift-ovn-kubernetes(b33b5271-bda3-41ca-81a3-d47fff657c27): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.832867 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c5dda3b_3e01_4bb4_af02_b0f4eeadda58.slice/crio-3a743dc1993b7ae50186d5ba219f5dc206b18cdde6c8c1f56f6cb58951c8515f WatchSource:0}: Error finding container 3a743dc1993b7ae50186d5ba219f5dc206b18cdde6c8c1f56f6cb58951c8515f: Status 404 returned error can't find the container with id 3a743dc1993b7ae50186d5ba219f5dc206b18cdde6c8c1f56f6cb58951c8515f Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.833153 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" podUID="b33b5271-bda3-41ca-81a3-d47fff657c27" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.836263 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-57qch" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.836344 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.837155 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njr92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.843644 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njr92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.845024 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.849332 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: W0308 19:33:02.852654 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cfac2d6_6888_4b2d_982e_826f583396e8.slice/crio-1ec3521f50a626c48f9b835a5ec3251059ddfd747295fa9e337ac8388359dc47 WatchSource:0}: Error finding container 1ec3521f50a626c48f9b835a5ec3251059ddfd747295fa9e337ac8388359dc47: Status 404 returned error can't find the container with id 1ec3521f50a626c48f9b835a5ec3251059ddfd747295fa9e337ac8388359dc47 Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.856349 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:02 crc kubenswrapper[4885]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 08 19:33:02 crc kubenswrapper[4885]: while [ true ]; Mar 08 19:33:02 crc kubenswrapper[4885]: do Mar 08 19:33:02 crc kubenswrapper[4885]: for f in $(ls /tmp/serviceca); do Mar 08 19:33:02 crc kubenswrapper[4885]: echo $f Mar 08 19:33:02 crc kubenswrapper[4885]: ca_file_path="/tmp/serviceca/${f}" Mar 08 19:33:02 crc kubenswrapper[4885]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 08 19:33:02 crc kubenswrapper[4885]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 08 19:33:02 crc kubenswrapper[4885]: if [ -e "${reg_dir_path}" ]; then Mar 08 19:33:02 crc kubenswrapper[4885]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 08 19:33:02 crc kubenswrapper[4885]: else Mar 08 19:33:02 crc kubenswrapper[4885]: mkdir $reg_dir_path Mar 08 19:33:02 crc kubenswrapper[4885]: cp $ca_file_path $reg_dir_path/ca.crt Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: for d in $(ls /etc/docker/certs.d); do Mar 08 19:33:02 crc kubenswrapper[4885]: echo $d Mar 08 19:33:02 crc kubenswrapper[4885]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 08 19:33:02 crc kubenswrapper[4885]: reg_conf_path="/tmp/serviceca/${dp}" Mar 08 19:33:02 crc kubenswrapper[4885]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 08 19:33:02 crc kubenswrapper[4885]: rm -rf /etc/docker/certs.d/$d Mar 08 19:33:02 crc kubenswrapper[4885]: fi Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: sleep 60 & wait ${!} Mar 08 19:33:02 crc kubenswrapper[4885]: done Mar 08 19:33:02 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r95ct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-57qch_openshift-image-registry(0cfac2d6-6888-4b2d-982e-826f583396e8): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:02 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.858292 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-57qch" podUID="0cfac2d6-6888-4b2d-982e-826f583396e8" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.860353 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-25vxd" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.861439 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.883332 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2zkpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-25vxd_openshift-multus(ac600107-0c97-4ec8-89f6-598b40c166ee): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.890074 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-25vxd" podUID="ac600107-0c97-4ec8-89f6-598b40c166ee" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.909267 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.926127 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.926178 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.926190 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.926208 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.926221 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:02Z","lastTransitionTime":"2026-03-08T19:33:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.934650 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.947433 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.959135 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.963240 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.963281 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.963306 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.963343 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963393 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963455 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963468 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963488 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:03.963461879 +0000 UTC m=+85.359515942 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963489 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963607 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963614 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:03.963585682 +0000 UTC m=+85.359639705 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963488 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963757 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963782 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963672 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:03.963649814 +0000 UTC m=+85.359703847 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: E0308 19:33:02.963859 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:03.963831979 +0000 UTC m=+85.359886002 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.973096 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:02 crc kubenswrapper[4885]: I0308 19:33:02.991902 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.003352 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.018677 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.029004 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.029047 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.029057 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.029073 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.029086 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.032716 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.047258 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.058664 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.064116 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.064351 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.064547 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.064611 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:04.064592549 +0000 UTC m=+85.460646572 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.064674 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:33:04.064667831 +0000 UTC m=+85.460721854 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.067646 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.076875 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.086818 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.109453 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\
\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.122348 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.132086 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.132124 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.132136 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.132162 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.132174 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.139294 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\
\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.150185 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.164175 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.180126 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.235415 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.235506 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.235531 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.235570 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.235590 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.338785 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.338842 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.338940 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.338997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.339032 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.375363 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.375961 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.377496 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.378273 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.379426 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.379992 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.380656 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.381969 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.382732 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.383848 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.384553 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.385830 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.386359 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.387024 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.388128 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.388715 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.389738 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.390188 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.390759 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.392313 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.392876 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.393888 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" 
path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.394390 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.395489 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.396023 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.396695 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.398554 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.399627 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.400878 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.401891 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.402907 4885 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.403156 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.406002 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.407762 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.409172 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.413704 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" 
path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.416282 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.418575 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.420118 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.422623 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.424175 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.426863 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.428652 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.430728 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.432969 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.437885 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.440184 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.443323 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.444616 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.444704 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 
19:33:03.444765 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.444783 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.444809 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.444828 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.446516 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.447546 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.449596 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.450836 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.451857 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.547410 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.547462 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.547480 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.547504 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.547524 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.650956 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.651029 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.651048 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.651079 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.651100 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.753981 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.754032 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.754050 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.754074 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.754092 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.755347 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"54661ff92d86d446f0561f70be37d97fffc952cd7edc4f3f4e212f70264f4183"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.757168 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"3a743dc1993b7ae50186d5ba219f5dc206b18cdde6c8c1f56f6cb58951c8515f"} Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.758855 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:03 crc kubenswrapper[4885]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 08 19:33:03 crc kubenswrapper[4885]: apiVersion: v1 Mar 08 19:33:03 crc kubenswrapper[4885]: clusters: Mar 08 19:33:03 crc kubenswrapper[4885]: - cluster: Mar 08 19:33:03 crc kubenswrapper[4885]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 08 19:33:03 crc kubenswrapper[4885]: server: https://api-int.crc.testing:6443 Mar 08 19:33:03 crc kubenswrapper[4885]: name: default-cluster Mar 08 19:33:03 crc kubenswrapper[4885]: contexts: Mar 08 19:33:03 crc kubenswrapper[4885]: - context: Mar 08 19:33:03 crc kubenswrapper[4885]: cluster: default-cluster Mar 08 19:33:03 crc kubenswrapper[4885]: namespace: default Mar 08 19:33:03 crc kubenswrapper[4885]: user: default-auth Mar 08 19:33:03 crc kubenswrapper[4885]: name: default-context Mar 08 19:33:03 crc kubenswrapper[4885]: current-context: default-context Mar 08 19:33:03 crc kubenswrapper[4885]: kind: Config Mar 08 19:33:03 crc kubenswrapper[4885]: preferences: {} Mar 08 19:33:03 crc kubenswrapper[4885]: users: Mar 08 19:33:03 crc kubenswrapper[4885]: - name: default-auth Mar 08 19:33:03 crc kubenswrapper[4885]: user: Mar 08 19:33:03 crc kubenswrapper[4885]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 08 19:33:03 crc kubenswrapper[4885]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 08 19:33:03 crc kubenswrapper[4885]: EOF Mar 08 19:33:03 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5mlvt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34): CreateContainerConfigError: services have not yet been read at least once, 
cannot construct envvars Mar 08 19:33:03 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.760074 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.760081 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njr92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.760516 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"91535920cc74a837cdd04ae33741adccd66ba5dbbe0b8079fc1da551b4b2ffc8"} Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.762650 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njr92,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.762887 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerStarted","Data":"c7b581b664bb832fbfee0daba19f0962b65d3323b906586299c1c0a80cddcb36"} Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.763053 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.764020 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.764219 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.765091 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2zkpk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-25vxd_openshift-multus(ac600107-0c97-4ec8-89f6-598b40c166ee): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.765429 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerStarted","Data":"b756091d164726bb7b3357e4fa113e037011faab93456cc52a0ef3704483935e"} Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.766168 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-25vxd" podUID="ac600107-0c97-4ec8-89f6-598b40c166ee" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.767315 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:03 crc kubenswrapper[4885]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 08 19:33:03 crc kubenswrapper[4885]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 08 19:33:03 crc kubenswrapper[4885]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7pllt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,Re
cursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-ff7b4_openshift-multus(9ac72c25-d3e6-4dda-8444-6cd4442af7e4): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:03 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.767322 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-57qch" event={"ID":"0cfac2d6-6888-4b2d-982e-826f583396e8","Type":"ContainerStarted","Data":"1ec3521f50a626c48f9b835a5ec3251059ddfd747295fa9e337ac8388359dc47"} Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.768416 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-ff7b4" podUID="9ac72c25-d3e6-4dda-8444-6cd4442af7e4" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.769121 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" event={"ID":"b33b5271-bda3-41ca-81a3-d47fff657c27","Type":"ContainerStarted","Data":"f3227465f55346847f5550884a45b9b226294df4f4196094a95b4362ce78201b"} Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.769175 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:03 crc kubenswrapper[4885]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 08 19:33:03 crc kubenswrapper[4885]: while [ true ]; Mar 08 19:33:03 crc kubenswrapper[4885]: do Mar 08 19:33:03 crc kubenswrapper[4885]: for f in $(ls /tmp/serviceca); do Mar 08 19:33:03 crc kubenswrapper[4885]: echo $f Mar 08 19:33:03 crc kubenswrapper[4885]: ca_file_path="/tmp/serviceca/${f}" Mar 08 19:33:03 crc kubenswrapper[4885]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 08 19:33:03 crc kubenswrapper[4885]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 08 19:33:03 crc kubenswrapper[4885]: if [ -e "${reg_dir_path}" ]; then Mar 08 19:33:03 crc kubenswrapper[4885]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 08 19:33:03 crc kubenswrapper[4885]: else Mar 08 19:33:03 crc kubenswrapper[4885]: mkdir $reg_dir_path Mar 08 19:33:03 crc kubenswrapper[4885]: cp $ca_file_path $reg_dir_path/ca.crt Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: done Mar 08 19:33:03 crc kubenswrapper[4885]: for d in $(ls /etc/docker/certs.d); do Mar 08 19:33:03 crc kubenswrapper[4885]: echo $d Mar 08 19:33:03 crc kubenswrapper[4885]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 08 19:33:03 crc kubenswrapper[4885]: reg_conf_path="/tmp/serviceca/${dp}" Mar 08 19:33:03 crc kubenswrapper[4885]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 08 19:33:03 crc kubenswrapper[4885]: rm -rf /etc/docker/certs.d/$d Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: done Mar 08 19:33:03 crc kubenswrapper[4885]: sleep 60 & wait ${!} Mar 08 19:33:03 crc kubenswrapper[4885]: done Mar 08 19:33:03 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r95ct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-57qch_openshift-image-registry(0cfac2d6-6888-4b2d-982e-826f583396e8): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:03 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.770564 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-57qch" podUID="0cfac2d6-6888-4b2d-982e-826f583396e8" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.770762 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:03 crc kubenswrapper[4885]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 08 19:33:03 crc kubenswrapper[4885]: set -euo pipefail Mar 08 19:33:03 crc kubenswrapper[4885]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 08 19:33:03 crc kubenswrapper[4885]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 08 19:33:03 crc kubenswrapper[4885]: # As the secret mount is optional we must wait for the files to be present. Mar 08 19:33:03 crc kubenswrapper[4885]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 08 19:33:03 crc kubenswrapper[4885]: TS=$(date +%s) Mar 08 19:33:03 crc kubenswrapper[4885]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 08 19:33:03 crc kubenswrapper[4885]: HAS_LOGGED_INFO=0 Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: log_missing_certs(){ Mar 08 19:33:03 crc kubenswrapper[4885]: CUR_TS=$(date +%s) Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 08 19:33:03 crc kubenswrapper[4885]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 08 19:33:03 crc kubenswrapper[4885]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 08 19:33:03 crc kubenswrapper[4885]: HAS_LOGGED_INFO=1 Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: } Mar 08 19:33:03 crc kubenswrapper[4885]: while [[ ! -f "${TLS_PK}" || ! -f "${TLS_CERT}" ]] ; do Mar 08 19:33:03 crc kubenswrapper[4885]: log_missing_certs Mar 08 19:33:03 crc kubenswrapper[4885]: sleep 5 Mar 08 19:33:03 crc kubenswrapper[4885]: done Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 08 19:33:03 crc kubenswrapper[4885]: exec /usr/bin/kube-rbac-proxy \ Mar 08 19:33:03 crc kubenswrapper[4885]: --logtostderr \ Mar 08 19:33:03 crc kubenswrapper[4885]: --secure-listen-address=:9108 \ Mar 08 19:33:03 crc kubenswrapper[4885]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 08 19:33:03 crc kubenswrapper[4885]: --upstream=http://127.0.0.1:29108/ \ Mar 08 19:33:03 crc kubenswrapper[4885]: --tls-private-key-file=${TLS_PK} \ Mar 08 19:33:03 crc kubenswrapper[4885]: --tls-cert-file=${TLS_CERT} Mar 08 19:33:03 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2bpng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-t2brt_openshift-ovn-kubernetes(b33b5271-bda3-41ca-81a3-d47fff657c27): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:03 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.773813 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:33:03 crc kubenswrapper[4885]: container 
&Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ -f "/env/_master" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: set -o allexport Mar 08 19:33:03 crc kubenswrapper[4885]: source "/env/_master" Mar 08 19:33:03 crc kubenswrapper[4885]: set +o allexport Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v4_join_subnet_opt= Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ "" != "" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v6_join_subnet_opt= Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ "" != "" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v4_transit_switch_subnet_opt= Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ "" != "" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v6_transit_switch_subnet_opt= Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ "" != "" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: dns_name_resolver_enabled_flag= Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ "false" == "true" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: persistent_ips_enabled_flag= Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ "true" == "true" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: # This is needed so that converting clusters from GA to TP Mar 08 19:33:03 crc kubenswrapper[4885]: # will rollout control plane pods as well Mar 08 19:33:03 crc kubenswrapper[4885]: network_segmentation_enabled_flag= Mar 08 19:33:03 crc kubenswrapper[4885]: multi_network_enabled_flag= Mar 08 19:33:03 crc kubenswrapper[4885]: if [[ "true" == "true" ]]; then Mar 08 19:33:03 crc kubenswrapper[4885]: multi_network_enabled_flag="--enable-multi-network" Mar 08 19:33:03 crc kubenswrapper[4885]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 08 19:33:03 crc kubenswrapper[4885]: fi Mar 08 19:33:03 crc kubenswrapper[4885]: Mar 08 19:33:03 crc kubenswrapper[4885]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 08 19:33:03 crc kubenswrapper[4885]: exec /usr/bin/ovnkube \ Mar 08 19:33:03 crc kubenswrapper[4885]: --enable-interconnect \ Mar 08 19:33:03 crc kubenswrapper[4885]: --init-cluster-manager "${K8S_NODE}" \ Mar 08 19:33:03 crc kubenswrapper[4885]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 
08 19:33:03 crc kubenswrapper[4885]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 08 19:33:03 crc kubenswrapper[4885]: --metrics-bind-address "127.0.0.1:29108" \ Mar 08 19:33:03 crc kubenswrapper[4885]: --metrics-enable-pprof \ Mar 08 19:33:03 crc kubenswrapper[4885]: --metrics-enable-config-duration \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${ovn_v4_join_subnet_opt} \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${ovn_v6_join_subnet_opt} \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${dns_name_resolver_enabled_flag} \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${persistent_ips_enabled_flag} \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${multi_network_enabled_flag} \ Mar 08 19:33:03 crc kubenswrapper[4885]: ${network_segmentation_enabled_flag} Mar 08 19:33:03 crc kubenswrapper[4885]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2bpng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-t2brt_openshift-ovn-kubernetes(b33b5271-bda3-41ca-81a3-d47fff657c27): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 08 19:33:03 crc kubenswrapper[4885]: > logger="UnhandledError" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.775010 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" podUID="b33b5271-bda3-41ca-81a3-d47fff657c27" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.777882 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.793624 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.808795 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.823493 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.837884 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.850542 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.858133 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.858186 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.858246 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.858277 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.858332 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.859254 4885 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.867682 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.881179 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.897817 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.924254 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete 
status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77
3257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.940387 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.959802 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.961975 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.962025 4885 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.962042 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.962069 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.962089 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:03Z","lastTransitionTime":"2026-03-08T19:33:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.973248 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection 
refused" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.975007 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.975166 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.975458 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.975597 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:05.975560948 +0000 UTC m=+87.371615011 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.975720 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.975763 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.975786 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.976003 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:05.975865386 +0000 UTC m=+87.371919439 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.976161 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.976232 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.976374 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.976428 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:05.976411762 +0000 UTC m=+87.372465815 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.977103 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.977159 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.977192 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:03 crc kubenswrapper[4885]: E0308 19:33:03.977280 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:05.977254544 +0000 UTC m=+87.373308827 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:03 crc kubenswrapper[4885]: I0308 19:33:03.989392 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.007082 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.019597 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.030803 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.049296 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.065525 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.065594 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.065623 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.065663 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.065691 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.067579 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.077634 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:04 crc kubenswrapper[4885]: E0308 19:33:04.077957 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:33:06.077871331 +0000 UTC m=+87.473925384 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.078168 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:04 crc kubenswrapper[4885]: E0308 19:33:04.078351 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:04 crc kubenswrapper[4885]: E0308 19:33:04.078439 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:06.078422966 +0000 UTC m=+87.474477029 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.099776 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.114359 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.137078 4885 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.155916 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.170327 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.170407 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.170426 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.170475 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.170498 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.174292 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.194680 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.205804 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.220411 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.230848 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.274308 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.274369 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.274382 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.274402 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.274416 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.368476 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:04 crc kubenswrapper[4885]: E0308 19:33:04.368633 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.368646 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:04 crc kubenswrapper[4885]: E0308 19:33:04.368772 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.368642 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:04 crc kubenswrapper[4885]: E0308 19:33:04.369297 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.368621 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:04 crc kubenswrapper[4885]: E0308 19:33:04.370010 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.377779 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.377841 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.377861 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.377885 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.377904 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.391438 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.482063 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.482128 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.482145 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.482172 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.482191 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.586185 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.586267 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.586285 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.586318 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.586337 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.690460 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.690529 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.690548 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.690575 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.690592 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.794524 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.794637 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.794656 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.794681 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.794701 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.899184 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.899268 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.899295 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.899328 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:04 crc kubenswrapper[4885]: I0308 19:33:04.899351 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:04Z","lastTransitionTime":"2026-03-08T19:33:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.002640 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.002734 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.002754 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.002785 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.002804 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.106745 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.106820 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.106842 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.106869 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.106888 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.210689 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.210768 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.210785 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.210832 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.210852 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.313826 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.313911 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.313976 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.314013 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.314036 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.417194 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.417305 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.417325 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.417348 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.417365 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.520576 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.520644 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.520661 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.520688 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.520706 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.623851 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.623964 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.623991 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.624021 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.624043 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.727639 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.727712 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.727736 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.727768 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.727791 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.831014 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.831114 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.831138 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.831167 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.831188 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.934589 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.934651 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.934669 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.934693 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:05 crc kubenswrapper[4885]: I0308 19:33:05.934710 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:05Z","lastTransitionTime":"2026-03-08T19:33:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.002229 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.002342 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.002382 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.002415 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002488 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002530 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002538 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002572 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002573 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002649 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002703 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002724 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002608 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:10.002587194 +0000 UTC m=+91.398641247 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002887 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:10.002802939 +0000 UTC m=+91.398856992 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.002992 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:10.002912892 +0000 UTC m=+91.398967025 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.003074 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:10.003020025 +0000 UTC m=+91.399074088 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.038150 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.038262 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.038286 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.038309 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.038363 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.103136 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.103435 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:33:10.103365415 +0000 UTC m=+91.499419468 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.103543 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.103701 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.103788 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:10.103767455 +0000 UTC m=+91.499821518 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.141239 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.141303 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.141320 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.141345 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.141363 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.244841 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.244909 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.244961 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.244997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.245015 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.349715 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.349783 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.349802 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.349830 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.349855 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.367131 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.367143 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.367207 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.367297 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.367441 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.367646 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.367898 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:06 crc kubenswrapper[4885]: E0308 19:33:06.368146 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.455039 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.455108 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.455134 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.455177 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.455196 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.558442 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.558597 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.558907 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.558994 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.559016 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.663562 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.663631 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.663652 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.663685 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.663707 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.767782 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.767858 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.767882 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.767912 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.767977 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.871371 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.871434 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.871450 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.871474 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.871491 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.974788 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.974870 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.974890 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.974916 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:06 crc kubenswrapper[4885]: I0308 19:33:06.974962 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:06Z","lastTransitionTime":"2026-03-08T19:33:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.078722 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.078800 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.078823 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.078858 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.078878 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.182096 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.182163 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.182182 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.182208 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.182226 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.286067 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.286135 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.286153 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.286176 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.286194 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.384353 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.384357 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 19:33:07 crc kubenswrapper[4885]: E0308 19:33:07.384604 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.388467 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.388513 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.388529 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.388549 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.388566 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.499285 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.499349 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.499371 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.499396 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.499410 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.604544 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.604596 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.604606 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.604622 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.604632 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.708807 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.708895 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.708915 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.708976 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.708998 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.783305 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e" Mar 08 19:33:07 crc kubenswrapper[4885]: E0308 19:33:07.783543 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.812853 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.812903 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.812937 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.812956 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.812967 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.916162 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.916222 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.916234 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.916254 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:07 crc kubenswrapper[4885]: I0308 19:33:07.916266 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:07Z","lastTransitionTime":"2026-03-08T19:33:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.019588 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.019655 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.019671 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.019697 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.019721 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.123705 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.123825 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.123897 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.123980 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.124016 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.227115 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.227290 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.227304 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.227324 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.227338 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.331175 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.331252 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.331271 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.331301 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.331324 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.367403 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.367467 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.367413 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.367631 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:08 crc kubenswrapper[4885]: E0308 19:33:08.367803 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:08 crc kubenswrapper[4885]: E0308 19:33:08.368243 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:08 crc kubenswrapper[4885]: E0308 19:33:08.368499 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:08 crc kubenswrapper[4885]: E0308 19:33:08.368591 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.432171 4885 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.434311 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.434363 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.434380 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.434406 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.434425 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.536458 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.536510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.536521 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.536538 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.536549 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.639091 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.639162 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.639173 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.639193 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.639206 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.742744 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.742823 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.742841 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.742867 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.742887 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.846379 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.846448 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.846466 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.846492 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.846509 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.949851 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.949952 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.949972 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.949998 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:08 crc kubenswrapper[4885]: I0308 19:33:08.950018 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:08Z","lastTransitionTime":"2026-03-08T19:33:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.052384 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.052448 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.052470 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.052502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.052522 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.155802 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.155861 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.155875 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.155897 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.155911 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.258696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.259078 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.259985 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.260148 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.260258 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.369439 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.369499 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.369518 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.369575 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.369593 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.383329 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.404068 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.417827 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services 
have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.432416 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.444700 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.472723 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.473605 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.473699 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: 
I0308 19:33:09.473715 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.473736 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.473750 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.489083 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.507303 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.520370 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.548514 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877
441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.565574 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.576847 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.577541 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.577629 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.577650 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.577688 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.577709 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.585837 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.593311 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.605105 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.615418 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready 
status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.681245 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.681349 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.681369 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.681415 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.681433 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.784119 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.784198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.784223 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.784256 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.784276 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.887172 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.887232 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.887246 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.887269 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.887284 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.991292 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.992081 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.992128 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.992156 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:09 crc kubenswrapper[4885]: I0308 19:33:09.992174 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:09Z","lastTransitionTime":"2026-03-08T19:33:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.022863 4885 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.054821 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.054969 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.055030 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.055100 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055180 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055234 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055317 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055364 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055392 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055335 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 
nodeName:}" failed. No retries permitted until 2026-03-08 19:33:18.055295218 +0000 UTC m=+99.451349281 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055528 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:18.055495743 +0000 UTC m=+99.451549976 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055565 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:18.055553864 +0000 UTC m=+99.451608157 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055564 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055615 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055641 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.055751 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:18.055723809 +0000 UTC m=+99.451777862 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.095474 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.095596 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.095621 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.095653 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.095672 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.156373 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.156633 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:33:18.156588473 +0000 UTC m=+99.552642546 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.156976 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.157178 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.157274 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:18.157252581 +0000 UTC m=+99.553306694 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.198134 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.198220 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.198238 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.198268 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.198291 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.301015 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.301118 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.301143 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.301174 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.301196 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.368020 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.368091 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.368056 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.368059 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.368288 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.368393 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.368491 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.368575 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.409344 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.409418 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.409434 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.409457 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.409477 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.513116 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.513189 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.513208 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.513236 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.513256 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.616478 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.616549 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.616576 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.616613 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.616636 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.720011 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.720069 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.720085 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.720109 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.720126 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.823295 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.823379 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.823398 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.823430 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.823452 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.926465 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.926566 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.926588 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.926620 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.926643 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.928902 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.928985 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.929004 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.929036 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.929058 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.942948 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.948080 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.948135 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.948153 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.948182 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.948205 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.963316 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.968279 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.968358 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.968378 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.968409 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.968430 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:10 crc kubenswrapper[4885]: E0308 19:33:10.980074 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.984792 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.984849 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.984872 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.984905 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:10 crc kubenswrapper[4885]: I0308 19:33:10.984961 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:10Z","lastTransitionTime":"2026-03-08T19:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: E0308 19:33:11.001815 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.007150 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.007202 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.007220 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.007243 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.007261 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: E0308 19:33:11.023436 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:11 crc kubenswrapper[4885]: E0308 19:33:11.023696 4885 
kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.030220 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.030276 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.030304 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.030331 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.030353 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.133442 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.133479 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.133499 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.133521 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.133550 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.236036 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.236090 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.236108 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.236131 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.236149 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.339652 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.339711 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.339728 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.339754 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.339772 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.443753 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.443811 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.443828 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.443851 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.443869 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.547546 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.547619 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.547671 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.547695 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.547711 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.651386 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.651503 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.651528 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.651564 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.651587 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.755278 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.755352 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.755372 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.755402 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.755428 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.858682 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.858743 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.858763 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.858805 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.858824 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.962128 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.962171 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.962181 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.962198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:11 crc kubenswrapper[4885]: I0308 19:33:11.962210 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:11Z","lastTransitionTime":"2026-03-08T19:33:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.064619 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.064696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.064718 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.064748 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.064774 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.168001 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.168082 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.168104 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.168139 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.168162 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.273821 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.273903 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.273938 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.273969 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.273994 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.367214 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.367272 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.367439 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:12 crc kubenswrapper[4885]: E0308 19:33:12.367443 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.367493 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:12 crc kubenswrapper[4885]: E0308 19:33:12.367653 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:12 crc kubenswrapper[4885]: E0308 19:33:12.367775 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:12 crc kubenswrapper[4885]: E0308 19:33:12.367865 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.379174 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.379230 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.379248 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.379273 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.379294 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.482486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.482557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.482578 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.482608 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.482632 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.586506 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.586598 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.586630 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.586665 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.586686 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.690135 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.690202 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.690216 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.690238 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.690658 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.795028 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.795583 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.795733 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.795888 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.796143 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.899498 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.899567 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.899584 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.899611 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:12 crc kubenswrapper[4885]: I0308 19:33:12.899631 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:12Z","lastTransitionTime":"2026-03-08T19:33:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.003008 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.003121 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.003148 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.003190 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.003220 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.107103 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.107164 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.107183 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.107207 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.107225 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.210511 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.210582 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.210604 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.210630 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.210647 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.314088 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.314512 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.314696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.314829 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.315007 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.421355 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.421408 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.421425 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.421451 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.421473 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.525295 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.525335 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.525348 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.525369 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.525383 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.629377 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.629426 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.629446 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.629469 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.629484 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.733382 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.733451 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.733473 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.733501 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.733527 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.804704 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-w5lms" event={"ID":"bc890659-71a7-4024-bae6-e1e1ef563f17","Type":"ContainerStarted","Data":"b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.821528 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.832835 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.838151 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.838211 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.838229 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.838258 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.838283 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.861979 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.880051 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.894426 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.910045 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.921245 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.937789 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.941362 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.941411 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.941431 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.941462 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.941482 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:13Z","lastTransitionTime":"2026-03-08T19:33:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.960048 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:13 crc kubenswrapper[4885]: I0308 19:33:13.988324 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:14 crc 
kubenswrapper[4885]: I0308 19:33:14.003863 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.024369 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.034403 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.046616 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.046687 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.046708 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.046738 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.046757 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.057824 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.074313 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.089909 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.150093 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.150178 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.150198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.150230 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.150249 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.254337 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.254420 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.254439 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.254468 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.254488 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.358401 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.358449 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.358460 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.358483 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.358498 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.367799 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.368177 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.368273 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.368301 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:14 crc kubenswrapper[4885]: E0308 19:33:14.368179 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:14 crc kubenswrapper[4885]: E0308 19:33:14.368536 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:14 crc kubenswrapper[4885]: E0308 19:33:14.368647 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:14 crc kubenswrapper[4885]: E0308 19:33:14.368983 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.462346 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.462412 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.462433 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.462465 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.462487 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.565781 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.565849 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.565867 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.565894 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.565913 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.669801 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.669876 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.669896 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.669999 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.670030 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.773651 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.773713 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.773735 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.773763 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.773781 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.877209 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.877275 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.877294 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.877321 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.877345 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.980328 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.980755 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.981024 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.981190 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:14 crc kubenswrapper[4885]: I0308 19:33:14.981357 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:14Z","lastTransitionTime":"2026-03-08T19:33:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.084979 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.085048 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.085063 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.085087 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.085102 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.187966 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.188035 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.188052 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.188081 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.188100 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.290954 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.291018 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.291037 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.291063 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.291085 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.396743 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.396799 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.396817 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.396841 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.396859 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.500000 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.500066 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.500087 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.500115 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.500132 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.604326 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.604396 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.604417 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.604451 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.604477 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.707994 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.708555 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.708580 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.708609 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.708635 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.816198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.816279 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.816408 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.816450 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.816481 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.822347 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e" exitCode=0 Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.822419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.826781 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.826842 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.839993 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.848723 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.866364 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.880579 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.894883 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.909218 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.921183 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.921365 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.921478 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.921590 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.921696 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:15Z","lastTransitionTime":"2026-03-08T19:33:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.925809 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.935674 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.949638 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.973638 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers 
with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":
\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-
dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:15 crc kubenswrapper[4885]: I0308 19:33:15.985781 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.015062 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.022587 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.024396 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.024439 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.024453 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.024473 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.024487 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.030848 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.040327 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.057209 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.073357 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.089274 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.105104 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.127747 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.127833 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.127858 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.127891 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.127915 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.131795 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5
279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.148238 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.171479 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.199823 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.228183 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.230641 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.230694 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.230714 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.230739 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.230756 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.247162 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.263343 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.283567 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, 
cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.300689 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.331126 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.334269 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.334315 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.334331 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.334354 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.334369 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.350371 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.367177 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:16 crc kubenswrapper[4885]: E0308 19:33:16.367355 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.367764 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.367778 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:16 crc kubenswrapper[4885]: E0308 19:33:16.367990 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:16 crc kubenswrapper[4885]: E0308 19:33:16.368089 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.368165 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.368419 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:16 crc kubenswrapper[4885]: E0308 19:33:16.368539 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.383179 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.440802 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.440844 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.440857 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.440877 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.440891 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.543376 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.543425 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.543436 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.543459 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.543477 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.646438 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.646479 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.646490 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.646509 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.646522 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.749305 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.749372 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.749391 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.749421 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.749440 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.832712 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.835341 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" event={"ID":"b33b5271-bda3-41ca-81a3-d47fff657c27","Type":"ContainerStarted","Data":"f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.835428 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" event={"ID":"b33b5271-bda3-41ca-81a3-d47fff657c27","Type":"ContainerStarted","Data":"817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.841173 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.841241 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.841262 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.841273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.841285 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.841296 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.853064 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.853146 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.853165 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.853188 
4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.853207 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.856052 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.877531 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z 
is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.891726 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.914936 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.932213 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.950063 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.956721 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.956779 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.956798 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.956825 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.956847 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:16Z","lastTransitionTime":"2026-03-08T19:33:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.968158 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:16 crc kubenswrapper[4885]: I0308 19:33:16.990294 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:16Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.006154 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.023024 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.056734 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.059720 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.059786 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.059807 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.059835 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.059854 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.080594 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.101212 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.117976 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, 
cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.139091 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.157323 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.162820 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.162878 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.162895 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.162948 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.162971 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.180843 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.195818 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.218610 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.236134 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.255589 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.266110 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.266333 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.266419 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.266523 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.266613 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.277986 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.313210 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.331122 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.360215 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.369249 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.369289 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.369298 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.369313 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.369323 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.406007 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.421440 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.442804 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec966
8527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.459198 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.472069 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.472134 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.472151 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.472175 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.472190 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.473392 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.487358 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.497867 4885 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.574911 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.574983 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc 
kubenswrapper[4885]: I0308 19:33:17.574995 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.575016 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.575030 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.678140 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.678169 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.678177 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.678190 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.678199 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.781334 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.781383 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.781400 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.781429 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.781452 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.847026 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.847095 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.869733 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.884418 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.884478 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.884500 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.884530 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.884552 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.889157 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 
19:33:17.910498 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.939656 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-binco
py\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.965025 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.983359 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.988615 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.988663 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.988683 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.988709 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:17 crc kubenswrapper[4885]: I0308 19:33:17.988727 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:17Z","lastTransitionTime":"2026-03-08T19:33:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.000778 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:17Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.027063 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.047769 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.068828 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.091039 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.091078 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.091087 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.091100 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.091111 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.091626 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.117994 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.133553 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.148740 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.149307 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.149357 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.149419 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.149450 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149538 4885 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149563 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149599 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149620 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149626 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149637 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:34.149617942 +0000 UTC m=+115.545671965 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149648 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149664 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149666 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:34.149654683 +0000 UTC m=+115.545708936 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149631 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149726 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:34.149705005 +0000 UTC m=+115.545759038 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.149757 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:33:34.149741206 +0000 UTC m=+115.545795229 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.171877 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.184889 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.193374 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.193419 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.193432 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.193450 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.193465 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.250219 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.250424 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.250476 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:33:34.250441795 +0000 UTC m=+115.646495828 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.250605 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.250679 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. 
No retries permitted until 2026-03-08 19:33:34.25065743 +0000 UTC m=+115.646711513 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.295621 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.295651 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.295661 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.295674 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.295683 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.367623 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.367767 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.368133 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.368188 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.368433 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.368584 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.368671 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:18 crc kubenswrapper[4885]: E0308 19:33:18.368723 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.397901 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.397976 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.397997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.398018 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.398033 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.501636 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.501707 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.501732 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.501765 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.501789 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.605688 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.605742 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.605759 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.605783 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.605800 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.709659 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.709711 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.709727 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.709751 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.709769 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.812751 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.812811 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.812829 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.812857 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.812876 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.852634 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.857584 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerStarted","Data":"578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.876596 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f69
29f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.893059 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.910758 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.916812 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.916847 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.916858 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.916880 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.916894 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:18Z","lastTransitionTime":"2026-03-08T19:33:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.930840 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.944367 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.963910 4885 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.978971 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:18 crc kubenswrapper[4885]: I0308 19:33:18.990574 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:18Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.004561 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.019047 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.019080 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.019092 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.019110 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.019122 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.024798 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.054247 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.074149 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.097510 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.116471 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.124584 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.124648 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.124662 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.124681 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.124692 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.132384 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.149345 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.164464 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.179003 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.193459 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.213405 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.228839 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.228967 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.228988 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.229012 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.229030 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.229639 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.295638 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.318749 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.329374 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.330837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.330866 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.330876 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.330890 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.330899 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.341501 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.349808 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.363819 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.381339 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.406300 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z 
is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.421275 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.435555 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.435594 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.435606 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.435624 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.435636 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.442442 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.455187 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.470428 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.479568 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.495056 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.504992 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.517992 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.532570 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.537614 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.537646 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.537655 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.537670 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.537679 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.557041 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5
279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.567903 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.577475 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.589735 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.602161 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.621848 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.635639 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.639349 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.639474 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.639561 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.639642 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.639722 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.651205 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.663157 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.673726 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.742887 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.742969 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 
19:33:19.742986 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.743010 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.743027 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.847027 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.847103 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.847127 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.847158 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.847182 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.863103 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-57qch" event={"ID":"0cfac2d6-6888-4b2d-982e-826f583396e8","Type":"ContainerStarted","Data":"d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.868720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.873823 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerStarted","Data":"219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.882670 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.898215 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.920192 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.943252 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.950242 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.950285 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.950296 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.950310 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.950321 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:19Z","lastTransitionTime":"2026-03-08T19:33:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.963305 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.976611 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:19 crc kubenswrapper[4885]: I0308 19:33:19.988228 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.001689 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.013257 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.031278 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z 
is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.043818 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.052656 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.052690 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.052700 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.052717 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.052729 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.061192 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.075125 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.093607 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.112171 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.130620 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.149287 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.154801 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.154834 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.154842 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.154855 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.154863 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.164586 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.179983 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.195147 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.206570 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.230209 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.249948 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.257676 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.257712 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.257720 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.257736 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.257747 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.265340 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.281892 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.295410 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.315219 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.332979 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.360016 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.360049 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.360058 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.360071 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.360081 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.363298 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5
279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.367773 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:20 crc kubenswrapper[4885]: E0308 19:33:20.367863 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.367984 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:20 crc kubenswrapper[4885]: E0308 19:33:20.368027 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.368114 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.368182 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:20 crc kubenswrapper[4885]: E0308 19:33:20.368252 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:20 crc kubenswrapper[4885]: E0308 19:33:20.368414 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.381182 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2
af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.402443 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"}
,{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.416835 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.462740 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.462810 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.462828 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.462855 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.462875 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.566420 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.566477 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.566494 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.566518 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.566537 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.669625 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.669696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.669717 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.669742 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.669762 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.773768 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.773837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.773853 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.773882 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.773901 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.880367 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac600107-0c97-4ec8-89f6-598b40c166ee" containerID="219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152" exitCode=0 Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.880473 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerDied","Data":"219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.876794 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.883604 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.885449 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.885595 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.885631 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.907988 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.925202 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.941431 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.998799 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.998833 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.998845 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.998862 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.998874 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:20Z","lastTransitionTime":"2026-03-08T19:33:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:20 crc kubenswrapper[4885]: I0308 19:33:20.999186 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:20Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.015332 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.031623 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.048762 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.082155 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.107665 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.107793 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.107842 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.107861 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.107888 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.107908 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.122484 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.138907 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.155444 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.171381 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.196747 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.212949 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.212997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.213013 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.213037 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.213053 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.219137 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.245406 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.316665 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.316737 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.316757 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.316790 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.316808 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.326136 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.326178 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.326188 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.326208 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.326221 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: E0308 19:33:21.342013 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.349910 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.349964 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.349977 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.349997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.350011 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: E0308 19:33:21.366103 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.368879 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e" Mar 08 19:33:21 crc kubenswrapper[4885]: E0308 19:33:21.369247 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.373201 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.373258 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.373276 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.373303 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.373326 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: E0308 19:33:21.394046 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.400437 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.400495 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.400514 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.400542 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.400561 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: E0308 19:33:21.421338 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.427187 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.427244 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.427261 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.427288 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.427307 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: E0308 19:33:21.454052 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: E0308 19:33:21.454274 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.456188 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.456228 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.456244 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.456267 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.456285 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.559886 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.559996 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.560015 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.560041 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.560059 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.664144 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.664234 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.664256 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.664291 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.664315 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.767949 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.767986 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.767997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.768013 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.768026 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.870483 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.870538 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.870558 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.870585 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.870604 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.887050 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerStarted","Data":"e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.895079 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.895673 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.895764 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.896013 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.918277 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.938464 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.938568 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.942182 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\
\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.963205 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.974087 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.974172 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.974194 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.974391 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.974409 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:21Z","lastTransitionTime":"2026-03-08T19:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:21 crc kubenswrapper[4885]: I0308 19:33:21.994704 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:21Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.014740 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.029711 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.045885 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.060994 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.078375 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.078446 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 
19:33:22.078469 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.078500 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.078522 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.081320 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.093580 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.123269 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-li
b\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\
\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.140097 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.161706 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.172021 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.181854 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.181902 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.181945 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.181969 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 
19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.181983 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.185613 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.203903 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.226003 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.243714 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.261584 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.284806 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.284837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.284845 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.284877 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.284888 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.291901 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.316239 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.332231 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.357606 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.368129 4885 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.368222 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.368223 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.368227 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:22 crc kubenswrapper[4885]: E0308 19:33:22.368444 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:22 crc kubenswrapper[4885]: E0308 19:33:22.368475 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:22 crc kubenswrapper[4885]: E0308 19:33:22.368659 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:22 crc kubenswrapper[4885]: E0308 19:33:22.368781 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.375731 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.387455 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.387512 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.387529 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.387556 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.387578 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.399990 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.413508 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.430884 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 
19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.452392 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\
\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.467580 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.484721 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.491252 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.491340 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.491359 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.491384 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.491401 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.504711 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.536227 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.594706 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.594783 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.594804 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.594835 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.594857 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.697484 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.697543 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.697562 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.697589 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.697610 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.801431 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.801498 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.801525 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.801557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.801578 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.904360 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.904409 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.904428 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.904455 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.904475 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:22Z","lastTransitionTime":"2026-03-08T19:33:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.905231 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac600107-0c97-4ec8-89f6-598b40c166ee" containerID="e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca" exitCode=0 Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.905300 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerDied","Data":"e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca"} Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.925010 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.938609 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:22 crc kubenswrapper[4885]: I0308 19:33:22.969647 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.004170 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:22Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.007345 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.007384 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.007399 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.007416 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.007429 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.027248 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.060690 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.075707 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 
19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.099263 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":
0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.112395 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.112449 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.112461 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.112483 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.112495 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.115575 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.134578 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.149283 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.169402 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.184078 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.199602 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.212661 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.214784 4885 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.214828 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.214839 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.214857 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.214870 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.222543 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.323904 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.323987 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.323999 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.324019 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.324033 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.426491 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.426536 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.426547 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.426565 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.426577 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.529255 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.529314 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.529345 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.529374 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.529393 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.632289 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.632358 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.632377 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.632404 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.632423 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.740515 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.740973 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.740995 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.741025 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.741044 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.844263 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.844331 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.844343 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.844361 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.844399 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.914860 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac600107-0c97-4ec8-89f6-598b40c166ee" containerID="a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d" exitCode=0 Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.915680 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerDied","Data":"a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.938668 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.947736 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.947774 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.947786 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.947810 4885 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.947824 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:23Z","lastTransitionTime":"2026-03-08T19:33:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.955349 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.978508 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:23 crc kubenswrapper[4885]: I0308 19:33:23.994496 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.014042 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.028548 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.051535 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.051586 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.051603 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.051628 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.051648 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.063355 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.085743 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 
19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.103476 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.135328 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.154258 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.154302 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.154312 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.154330 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.154341 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.167992 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.202568 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.228897 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.241702 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.256814 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.256873 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.256886 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.256910 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.256941 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.257666 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.270363 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.360540 4885 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.360583 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.360592 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.360609 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.360619 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.367963 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.367994 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.368004 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:24 crc kubenswrapper[4885]: E0308 19:33:24.368122 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.368141 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:24 crc kubenswrapper[4885]: E0308 19:33:24.368233 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:24 crc kubenswrapper[4885]: E0308 19:33:24.368453 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:24 crc kubenswrapper[4885]: E0308 19:33:24.368673 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.472894 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.472983 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.473005 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.473031 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.473050 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.576608 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.576713 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.576733 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.576763 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.576783 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.679999 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.680072 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.680093 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.680142 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.680161 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.783767 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.783839 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.783856 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.783883 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.783905 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.886692 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.886768 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.886786 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.886812 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.886828 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.922883 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac600107-0c97-4ec8-89f6-598b40c166ee" containerID="fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb" exitCode=0 Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.922990 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerDied","Data":"fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.926306 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/0.log" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.935764 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176" exitCode=1 Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.935857 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.937067 4885 scope.go:117] "RemoveContainer" containerID="e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.954650 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.978348 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.990676 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.990755 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.990782 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.990813 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.990838 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:24Z","lastTransitionTime":"2026-03-08T19:33:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:24 crc kubenswrapper[4885]: I0308 19:33:24.996998 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:24Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.030719 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.050198 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.066509 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.081865 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.094022 4885 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.094058 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.094068 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.094103 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.094115 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.096035 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.111174 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.125449 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.147053 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.161877 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.179106 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.194177 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.197220 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.197289 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.197307 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.197335 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.197354 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.218795 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\
"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.231552 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 
19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.245718 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.264523 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"message\\\":\\\"or removal\\\\nI0308 19:33:24.210662 6582 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 19:33:24.210667 6582 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 19:33:24.210707 6582 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 19:33:24.210716 6582 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 19:33:24.210725 6582 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 19:33:24.210732 6582 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 19:33:24.210740 6582 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 19:33:24.215990 6582 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 19:33:24.216015 6582 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 19:33:24.216042 6582 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 19:33:24.216043 6582 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 19:33:24.216057 6582 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 19:33:24.216095 6582 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 19:33:24.216114 6582 factory.go:656] Stopping watch factory\\\\nI0308 19:33:24.216127 6582 ovnkube.go:599] Stopped ovnkube\\\\nI0308 19:33:24.216131 6582 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 
19:33:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.277032 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.293859 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.302259 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.302303 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.302315 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.302333 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.302344 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.307379 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.320199 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.336459 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.353575 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.372418 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.387187 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.405361 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.405420 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.405434 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.405457 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.405475 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.415141 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.435231 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.455623 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.472600 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.488980 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.507638 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.508411 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.508457 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.508495 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.508519 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.508534 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.620489 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.620543 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.620557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.620580 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.620599 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.724186 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.724266 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.724278 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.724302 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.724319 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.826705 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.826760 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.826773 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.826798 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.826812 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.929845 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.929913 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.929954 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.929981 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.929999 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:25Z","lastTransitionTime":"2026-03-08T19:33:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.947175 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac600107-0c97-4ec8-89f6-598b40c166ee" containerID="63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0" exitCode=0 Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.947258 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerDied","Data":"63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.950677 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/0.log" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.953996 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9"} Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.954410 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.964561 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.980886 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:25 crc kubenswrapper[4885]: I0308 19:33:25.996899 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:25Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.015613 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.032897 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.032978 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.032997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.033024 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.033043 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.045651 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"message\\\":\\\"or removal\\\\nI0308 19:33:24.210662 6582 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 19:33:24.210667 6582 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 19:33:24.210707 6582 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 19:33:24.210716 6582 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 19:33:24.210725 6582 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 19:33:24.210732 6582 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 19:33:24.210740 6582 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 19:33:24.215990 6582 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 19:33:24.216015 6582 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 19:33:24.216042 6582 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 19:33:24.216043 6582 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 19:33:24.216057 6582 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 19:33:24.216095 6582 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 19:33:24.216114 6582 factory.go:656] Stopping watch factory\\\\nI0308 19:33:24.216127 6582 ovnkube.go:599] Stopped ovnkube\\\\nI0308 19:33:24.216131 6582 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 
19:33:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.068593 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.085137 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.098179 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.112240 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.130798 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.135703 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.135746 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.135755 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.135772 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.135781 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.145476 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.158499 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.174998 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.188456 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.206462 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.219220 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.234755 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.239069 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.239138 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.239159 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.239187 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.239209 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.250883 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.263311 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.278241 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.300102 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.331267 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1
b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"message\\\":\\\"or removal\\\\nI0308 19:33:24.210662 6582 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 19:33:24.210667 6582 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 19:33:24.210707 6582 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 19:33:24.210716 6582 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 19:33:24.210725 6582 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 19:33:24.210732 6582 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 19:33:24.210740 6582 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 19:33:24.215990 6582 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 19:33:24.216015 6582 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 19:33:24.216042 6582 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 19:33:24.216043 6582 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 19:33:24.216057 6582 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 19:33:24.216095 6582 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 19:33:24.216114 6582 factory.go:656] Stopping watch factory\\\\nI0308 19:33:24.216127 6582 ovnkube.go:599] Stopped ovnkube\\\\nI0308 19:33:24.216131 6582 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 
19:33:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.343353 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.343419 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.343437 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.343468 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.343487 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.344206 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.361070 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.367416 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:26 crc kubenswrapper[4885]: E0308 19:33:26.367575 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.367957 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.367992 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:26 crc kubenswrapper[4885]: E0308 19:33:26.368041 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.368169 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:26 crc kubenswrapper[4885]: E0308 19:33:26.368310 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:26 crc kubenswrapper[4885]: E0308 19:33:26.368579 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.375529 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.389586 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.400117 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.414717 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.425233 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.439897 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.445963 4885 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.446028 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.446050 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.446081 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.446101 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.450212 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.483318 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:26Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.549621 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.549692 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.549710 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.549737 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.549756 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.653793 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.653862 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.653883 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.653908 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.653965 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.756653 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.756727 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.756746 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.756772 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.756792 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.860845 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.861017 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.861052 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.861090 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.861118 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.961524 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/1.log" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.963077 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/0.log" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.963734 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.963777 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.963795 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.963821 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.963838 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:26Z","lastTransitionTime":"2026-03-08T19:33:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.967306 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9" exitCode=1 Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.967364 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9"} Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.967433 4885 scope.go:117] "RemoveContainer" containerID="e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.968957 4885 scope.go:117] "RemoveContainer" containerID="49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9" Mar 08 19:33:26 crc kubenswrapper[4885]: E0308 19:33:26.969369 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.983119 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac600107-0c97-4ec8-89f6-598b40c166ee" containerID="309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5" exitCode=0 Mar 08 19:33:26 crc kubenswrapper[4885]: I0308 19:33:26.983201 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" 
event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerDied","Data":"309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.019191 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.048718 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.067491 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.067580 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.067597 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.067663 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.067683 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.068650 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.086002 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.102980 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.121357 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.134804 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.162344 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"message\\\":\\\"or removal\\\\nI0308 19:33:24.210662 6582 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 19:33:24.210667 6582 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 19:33:24.210707 6582 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 19:33:24.210716 6582 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 19:33:24.210725 6582 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 19:33:24.210732 6582 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 19:33:24.210740 6582 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 19:33:24.215990 6582 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 19:33:24.216015 6582 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 19:33:24.216042 6582 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 19:33:24.216043 6582 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 19:33:24.216057 6582 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 19:33:24.216095 6582 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 19:33:24.216114 6582 factory.go:656] Stopping watch factory\\\\nI0308 19:33:24.216127 6582 ovnkube.go:599] Stopped ovnkube\\\\nI0308 19:33:24.216131 6582 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 19:33:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 
19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.170735 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.170818 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.170868 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.170894 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.170909 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.178773 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.202404 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.218405 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.236393 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.253254 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.273330 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.274407 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.274457 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.274471 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.274491 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.274507 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.295851 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.316554 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.336582 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.350657 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.379710 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.379770 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.379788 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.379813 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.379831 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.384046 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e3bdef437e429fa24044ced0c784f106ae2dc102dc03e3ed87f5ab5b69a3f176\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"message\\\":\\\"or removal\\\\nI0308 19:33:24.210662 6582 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0308 19:33:24.210667 6582 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0308 19:33:24.210707 6582 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0308 19:33:24.210716 6582 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0308 19:33:24.210725 6582 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0308 19:33:24.210732 6582 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0308 19:33:24.210740 6582 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0308 19:33:24.215990 6582 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0308 19:33:24.216015 6582 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0308 19:33:24.216042 6582 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0308 19:33:24.216043 6582 handler.go:208] Removed *v1.Node event handler 2\\\\nI0308 19:33:24.216057 6582 handler.go:208] Removed *v1.Node event handler 7\\\\nI0308 19:33:24.216095 6582 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0308 19:33:24.216114 6582 factory.go:656] Stopping watch factory\\\\nI0308 19:33:24.216127 6582 ovnkube.go:599] Stopped ovnkube\\\\nI0308 19:33:24.216131 6582 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0308 
19:33:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.403391 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.426100 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.441291 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.462562 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.483321 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.483888 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.483956 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.483976 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.484003 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.484022 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.504568 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.523949 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.543967 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.576627 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.586870 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.587074 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.587221 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.587438 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.587569 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.603684 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.625950 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.647518 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.665804 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:27Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.691151 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.691224 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 
19:33:27.691251 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.691284 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.691308 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.795223 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.795277 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.795296 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.795321 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.795341 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.898813 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.898869 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.898886 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.898909 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.898951 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:27Z","lastTransitionTime":"2026-03-08T19:33:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:27 crc kubenswrapper[4885]: I0308 19:33:27.995645 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/1.log" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.000860 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.000944 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.001009 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.001107 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.001141 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.004079 4885 scope.go:117] "RemoveContainer" containerID="49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9" Mar 08 19:33:28 crc kubenswrapper[4885]: E0308 19:33:28.004425 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.009238 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" event={"ID":"ac600107-0c97-4ec8-89f6-598b40c166ee","Type":"ContainerStarted","Data":"964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.025667 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.048109 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.066560 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.098344 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.104565 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.104638 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.104693 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.104729 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.104751 4885 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.120239 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.135561 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.153686 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.167989 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.188792 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.202223 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.207429 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.207458 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.207468 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.207486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.207497 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.221847 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.242130 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\
\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.258986 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.281178 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.305355 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.311066 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.311140 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.311161 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.311189 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.311213 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.337150 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1
b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.356658 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.367682 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.367733 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.367733 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:28 crc kubenswrapper[4885]: E0308 19:33:28.367950 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.367994 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:28 crc kubenswrapper[4885]: E0308 19:33:28.368108 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:28 crc kubenswrapper[4885]: E0308 19:33:28.368195 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:28 crc kubenswrapper[4885]: E0308 19:33:28.368236 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.376787 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.395350 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.419793 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.419851 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.419867 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.419892 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.419911 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.437096 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.460312 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.479487 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.496213 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.513155 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.523250 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.523299 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 
19:33:28.523310 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.523331 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.523343 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.534356 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.549521 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.572599 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 
19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.592387 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.608528 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.626563 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.626631 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.626649 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.626676 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.626693 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.629483 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.647028 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.678742 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:28Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.730379 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.730456 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.730481 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.730517 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.730547 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.834620 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.834689 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.834712 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.834744 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.834767 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.938502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.938558 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.938576 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.938601 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:28 crc kubenswrapper[4885]: I0308 19:33:28.938619 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:28Z","lastTransitionTime":"2026-03-08T19:33:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.041989 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.042049 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.042067 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.042090 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.042109 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.145358 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.145442 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.145469 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.145502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.145524 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.248720 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.248788 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.248811 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.248837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.248857 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.352423 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.352480 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.352501 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.352524 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.352543 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.394161 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.411477 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.432673 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.456164 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.456412 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.456552 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.456696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.456840 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.464786 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1
b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.483118 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.507725 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.523590 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.542832 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.562064 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.562111 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.562130 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.562154 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.562174 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.562109 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.584266 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.607139 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.647773 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.665093 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.665213 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.665239 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.665271 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.665293 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.683974 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.706182 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.720810 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.735985 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.768699 4885 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.768763 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.768779 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.768800 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.768814 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.872559 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.872638 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.872658 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.872688 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.872711 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.976281 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.976352 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.976371 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.976400 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:29 crc kubenswrapper[4885]: I0308 19:33:29.976419 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:29Z","lastTransitionTime":"2026-03-08T19:33:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.079812 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.079881 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.079903 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.079960 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.079986 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.183619 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.183727 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.183753 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.183789 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.183816 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.287902 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.288017 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.288046 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.288082 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.288107 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.368078 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.368135 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.368169 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.368135 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:30 crc kubenswrapper[4885]: E0308 19:33:30.368272 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:30 crc kubenswrapper[4885]: E0308 19:33:30.368465 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:30 crc kubenswrapper[4885]: E0308 19:33:30.368572 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:30 crc kubenswrapper[4885]: E0308 19:33:30.368752 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.391198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.391420 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.391605 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.391779 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.391958 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.496055 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.496128 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.496148 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.496176 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.496197 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.598830 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.599282 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.599446 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.599641 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.599783 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.703085 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.704079 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.704221 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.704405 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.704536 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.806908 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.807277 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.807469 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.807618 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.807740 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.911240 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.911296 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.911309 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.911348 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:30 crc kubenswrapper[4885]: I0308 19:33:30.911361 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:30Z","lastTransitionTime":"2026-03-08T19:33:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.014993 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.015057 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.015078 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.015102 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.015121 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.118137 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.118194 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.118211 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.118235 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.118252 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.221992 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.222316 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.222486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.222626 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.222756 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.325866 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.326006 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.326030 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.326064 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.326087 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.429354 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.429440 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.429463 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.429503 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.429527 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.532726 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.532801 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.532819 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.532848 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.532872 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.635629 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.635772 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.635887 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.636016 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.636520 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.661507 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.661561 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.661579 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.661601 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.661619 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: E0308 19:33:31.682649 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:31Z is after 
2025-08-24T17:21:41Z" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.688293 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.688358 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.688380 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.688410 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.688432 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: E0308 19:33:31.712098 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:31Z is after 
2025-08-24T17:21:41Z" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.717490 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.717555 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.717576 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.717605 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.717624 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: E0308 19:33:31.739186 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:31Z is after 
2025-08-24T17:21:41Z" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.744851 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.744967 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.744998 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.745029 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.745058 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: E0308 19:33:31.764455 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:31Z is after 
2025-08-24T17:21:41Z" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.770163 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.770254 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.770280 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.770314 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.770340 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: E0308 19:33:31.798722 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:31Z is after 
2025-08-24T17:21:41Z" Mar 08 19:33:31 crc kubenswrapper[4885]: E0308 19:33:31.798898 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.801557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.801637 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.801654 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.801682 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.801703 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.904893 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.904963 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.904975 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.904992 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:31 crc kubenswrapper[4885]: I0308 19:33:31.905008 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:31Z","lastTransitionTime":"2026-03-08T19:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.007637 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.007688 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.007710 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.007732 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.007748 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.111530 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.111722 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.111734 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.111754 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.111763 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.215384 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.215483 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.215515 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.215636 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.215660 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.319794 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.320084 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.320134 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.320167 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.320186 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.367996 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.368034 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.368009 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.368061 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:32 crc kubenswrapper[4885]: E0308 19:33:32.368481 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:32 crc kubenswrapper[4885]: E0308 19:33:32.368656 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:32 crc kubenswrapper[4885]: E0308 19:33:32.368915 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:32 crc kubenswrapper[4885]: E0308 19:33:32.369049 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.369867 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.423615 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.423673 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.423691 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.423717 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.423735 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.527833 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.527911 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.527973 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.528005 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.528025 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.631122 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.631182 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.631202 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.631239 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.631260 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.737448 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.737502 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.737523 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.737550 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.737573 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.842962 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.843425 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.843443 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.843466 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.843485 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.947560 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.947617 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.947633 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.947660 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:32 crc kubenswrapper[4885]: I0308 19:33:32.947677 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:32Z","lastTransitionTime":"2026-03-08T19:33:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.031759 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.034116 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.035422 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.055238 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.055319 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.055338 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.055364 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.055383 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.063207 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.089810 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.108098 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.126170 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.146504 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.159639 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.159699 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.159719 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.159747 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.159766 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.227420 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1
b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.249229 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.262159 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.262366 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.262433 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.262503 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.262570 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.265779 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.277571 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.300755 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\
\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.318972 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.332419 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.346329 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.360937 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.364730 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.364824 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 
19:33:33.364888 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.364968 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.365050 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.381774 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.396043 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.468446 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.468515 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.468535 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.468563 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.468583 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.572006 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.572087 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.572108 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.572140 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.572159 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.676585 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.676647 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.676676 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.676708 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.676729 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.780006 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.780568 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.780696 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.780805 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.780901 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.884911 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.885200 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.885306 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.885390 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.885452 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.988309 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.988753 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.988887 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.989060 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:33 crc kubenswrapper[4885]: I0308 19:33:33.989241 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:33Z","lastTransitionTime":"2026-03-08T19:33:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.092322 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.092397 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.092415 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.092442 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.092462 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.195324 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.195415 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.195437 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.195465 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.195485 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.206914 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.207011 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.207048 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.207096 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207249 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207301 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207327 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207345 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207381 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:34:06.207347443 +0000 UTC m=+147.603401496 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207388 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207485 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207506 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207418 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:34:06.207396434 +0000 UTC m=+147.603450487 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207621 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:34:06.207585559 +0000 UTC m=+147.603639612 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207670 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.207768 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:34:06.207718862 +0000 UTC m=+147.603772925 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.299133 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.299494 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.299656 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.299826 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.299988 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.307707 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.307977 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:34:06.307906018 +0000 UTC m=+147.703960071 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.308197 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.308444 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.308557 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:34:06.308528334 +0000 UTC m=+147.704582427 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.367772 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.368149 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.368099 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.367869 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.368585 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.368774 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.368907 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:34 crc kubenswrapper[4885]: E0308 19:33:34.369587 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.385203 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.403784 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.403911 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.404012 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.404043 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.404067 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.507394 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.507456 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.507474 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.507501 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.507518 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.610780 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.611276 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.611414 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.611540 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.611685 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.715393 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.715762 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.715915 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.716116 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.716262 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.819889 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.821128 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.821310 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.821447 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.821645 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.925557 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.925622 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.925639 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.925664 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:34 crc kubenswrapper[4885]: I0308 19:33:34.925683 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:34Z","lastTransitionTime":"2026-03-08T19:33:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.028333 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.028393 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.028412 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.028437 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.028454 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.131234 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.131486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.131630 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.131807 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.131981 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.235560 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.235623 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.235646 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.235680 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.235707 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.339015 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.339092 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.339113 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.339142 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.339164 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.442563 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.442788 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.442909 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.443085 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.443215 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.546187 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.546309 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.546338 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.546369 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.546389 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.649584 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.649671 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.649693 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.649715 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.649732 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.753461 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.753527 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.753548 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.753575 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.753593 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.857203 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.857265 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.857284 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.857308 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.857325 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.961356 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.961415 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.961432 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.961456 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:35 crc kubenswrapper[4885]: I0308 19:33:35.961473 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:35Z","lastTransitionTime":"2026-03-08T19:33:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.064637 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.064705 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.064723 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.064751 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.064769 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.167780 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.167852 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.167873 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.167900 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.167950 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.271685 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.271747 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.271765 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.271793 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.271812 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.367871 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.367990 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:36 crc kubenswrapper[4885]: E0308 19:33:36.368594 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.367980 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.368106 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:36 crc kubenswrapper[4885]: E0308 19:33:36.368879 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:36 crc kubenswrapper[4885]: E0308 19:33:36.369074 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:36 crc kubenswrapper[4885]: E0308 19:33:36.369184 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.376769 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.376837 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.376873 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.376905 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.376963 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.480389 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.480472 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.480494 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.480523 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.480541 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.583296 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.583354 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.583380 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.583409 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.583429 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.687255 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.687327 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.687349 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.687380 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.687403 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.790162 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.790511 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.790668 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.790823 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.790990 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.893436 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.893511 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.893535 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.893564 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.893587 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.996313 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.996386 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.996410 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.996437 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:36 crc kubenswrapper[4885]: I0308 19:33:36.996491 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:36Z","lastTransitionTime":"2026-03-08T19:33:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.100372 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.100470 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.100488 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.100514 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.100532 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.208404 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.208466 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.208485 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.208510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.208528 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.311985 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.312049 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.312072 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.312102 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.312124 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.414717 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.414783 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.414800 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.414828 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.414849 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.518715 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.518773 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.518790 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.518814 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.518833 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.621372 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.621430 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.621456 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.621486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.621510 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.724232 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.724289 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.724308 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.724333 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.724350 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.827882 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.827986 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.828009 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.828039 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.828092 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.932182 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.932353 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.932417 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.932446 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:37 crc kubenswrapper[4885]: I0308 19:33:37.932465 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:37Z","lastTransitionTime":"2026-03-08T19:33:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.035545 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.035660 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.035680 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.035706 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.035726 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.139321 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.139390 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.139408 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.139439 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.139458 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.243003 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.243070 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.243088 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.243113 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.243132 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.347244 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.347307 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.347325 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.347353 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.347371 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.367304 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.367368 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.367531 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.367856 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:38 crc kubenswrapper[4885]: E0308 19:33:38.368061 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:38 crc kubenswrapper[4885]: E0308 19:33:38.368245 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:38 crc kubenswrapper[4885]: E0308 19:33:38.368331 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:38 crc kubenswrapper[4885]: E0308 19:33:38.368397 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.451013 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.451556 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.451884 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.452287 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.452543 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.556614 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.556766 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.556789 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.556853 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.556873 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.660754 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.661148 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.661273 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.661435 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.661553 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.764693 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.764777 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.764801 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.764833 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.764852 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.867677 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.867736 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.867754 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.867777 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.867794 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.970911 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.971021 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.971041 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.971068 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:38 crc kubenswrapper[4885]: I0308 19:33:38.971088 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:38Z","lastTransitionTime":"2026-03-08T19:33:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.074553 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.074621 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.074639 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.074665 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.074687 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:39Z","lastTransitionTime":"2026-03-08T19:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.177878 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.178003 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.178030 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.178062 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.178085 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:39Z","lastTransitionTime":"2026-03-08T19:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.281342 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.281414 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.281440 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.281476 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.281499 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:39Z","lastTransitionTime":"2026-03-08T19:33:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:39 crc kubenswrapper[4885]: E0308 19:33:39.381784 4885 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.390570 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.410341 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.436365 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.455658 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.477306 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.495404 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: E0308 19:33:39.503570 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.529407 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.552453 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.569549 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.590320 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.610396 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.626445 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.646701 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.667043 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.698068 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.717756 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:39 crc kubenswrapper[4885]: I0308 19:33:39.742964 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:40 crc kubenswrapper[4885]: I0308 19:33:40.367818 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:40 crc kubenswrapper[4885]: I0308 19:33:40.367840 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:40 crc kubenswrapper[4885]: I0308 19:33:40.367913 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:40 crc kubenswrapper[4885]: I0308 19:33:40.367976 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:40 crc kubenswrapper[4885]: E0308 19:33:40.368222 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:40 crc kubenswrapper[4885]: E0308 19:33:40.368430 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:40 crc kubenswrapper[4885]: E0308 19:33:40.368584 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:40 crc kubenswrapper[4885]: E0308 19:33:40.368737 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:40 crc kubenswrapper[4885]: I0308 19:33:40.369826 4885 scope.go:117] "RemoveContainer" containerID="49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.066304 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/1.log" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.069806 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d"} Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.070828 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.083347 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\
\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.105567 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.128937 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.156768 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.173851 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.188623 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.199767 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.223630 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.237311 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.248585 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.268179 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.279369 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\"
:[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.292130 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.314053 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.335593 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31
b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler 
{0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.350070 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 
19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.365416 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:41Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.990693 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 
19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.990780 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.990803 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.990829 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:41 crc kubenswrapper[4885]: I0308 19:33:41.990848 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:41Z","lastTransitionTime":"2026-03-08T19:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.012557 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.017833 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.017894 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.017913 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.017976 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.018002 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:42Z","lastTransitionTime":"2026-03-08T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.039275 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.044574 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.044634 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
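
Each retry re-sends the same status body, including the node's capacity and allocatable figures: cpu "11800m" of a "12"-core capacity, memory "32404560Ki" of "32865360Ki", and ephemeral-storage given in plain bytes ("76396645454"). A small, hypothetical decoder for those quantity suffixes (illustration only; Kubernetes itself parses these with the resource.Quantity type in k8s.io/apimachinery):

    # Hypothetical decoder for the quantity strings in the status patch above;
    # it only handles the suffixes that actually appear there plus bare numbers.
    SUFFIXES = {"m": 1e-3, "Ki": 1024, "Mi": 1024**2, "Gi": 1024**3}

    def parse_quantity(q: str) -> float:
        for suffix, factor in SUFFIXES.items():
            if q.endswith(suffix):
                return float(q[: -len(suffix)]) * factor
        return float(q)

    print(parse_quantity("11800m"))       # 11.8 allocatable CPU cores
    print(parse_quantity("12"))           # 12-core capacity
    print(parse_quantity("32404560Ki"))   # allocatable memory in bytes (~33.2 GB)
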
event="NodeHasNoDiskPressure" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.044652 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.044677 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.044695 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:42Z","lastTransitionTime":"2026-03-08T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.065557 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.072425 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.072469 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
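
Every one of these patches carries the same Ready=False reason: the container runtime network is not ready because there is no CNI configuration file in /etc/kubernetes/cni/net.d/ yet, which in this log traces back to the crash-looping ovnkube-node pod seen just below. A hypothetical on-node check for whether that directory has been populated (directory path from the message above; the filename patterns are an assumption based on common CNI conventions):

    import glob, os

    # CNI config directory from the NetworkPluginNotReady message above.
    CNI_DIR = "/etc/kubernetes/cni/net.d"

    def cni_configs(directory):
        # CNI plugins conventionally drop *.conf, *.conflist or *.json files here.
        found = []
        for pattern in ("*.conf", "*.conflist", "*.json"):
            found.extend(glob.glob(os.path.join(directory, pattern)))
        return sorted(found)

    configs = cni_configs(CNI_DIR)
    print("CNI configured:" if configs else "no CNI configuration file - node stays NotReady", configs)
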
event="NodeHasNoDiskPressure" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.072488 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.072509 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.072529 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:42Z","lastTransitionTime":"2026-03-08T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.076992 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/2.log" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.077981 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/1.log" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.082714 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d" exitCode=1 Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.082770 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d"} Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.082817 4885 scope.go:117] "RemoveContainer" containerID="49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.084162 4885 scope.go:117] "RemoveContainer" containerID="78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.084487 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.103826 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
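
The ovnkube-controller container has just exited again (exit code 1) and the kubelet refuses to restart it immediately: "back-off 20s restarting failed container". An illustrative sketch of that crash-loop schedule, assuming the usual kubelet defaults of a 10s initial delay that doubles per consecutive failure up to a 5-minute cap (those constants are defaults, not something stated in this log):

    # Illustrative only: CrashLoopBackOff delay roughly doubles per consecutive
    # failure, from an assumed 10s initial value up to a 5-minute ceiling.
    def crashloop_delays(failures, initial=10, cap=300):
        delays, delay = [], initial
        for _ in range(failures):
            delays.append(delay)
            delay = min(delay * 2, cap)
        return delays

    print(crashloop_delays(6))   # [10, 20, 40, 80, 160, 300]; "back-off 20s" is the second step
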
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.108105 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
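
The pod status patches interleaved with the node retries report a synthetic terminated state for containers whose previous instance "could not be located when the pod was deleted": reason ContainerStatusUnknown with exit code 137. By the usual POSIX convention (not something this log states), 137 decodes as 128 plus the terminating signal number:

    import signal

    exit_code = 137                        # from the terminated lastState above
    sig = signal.Signals(exit_code - 128)  # 137 - 128 = 9
    print(f"{exit_code} => 128 + {sig.value} ({sig.name})")   # 137 => 128 + 9 (SIGKILL)
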
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.110744 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.111001 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.111876 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.112116 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.112301 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:42Z","lastTransitionTime":"2026-03-08T19:33:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.130463 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.142727 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.142882 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.162350 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/h
ostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.183025 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 
2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.201123 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.216866 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.246258 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2
bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.265852 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.282786 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.300648 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.321370 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.333328 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.351330 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.367500 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.367610 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.367705 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.367704 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.367622 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.367853 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.368021 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:42 crc kubenswrapper[4885]: E0308 19:33:42.368128 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.371170 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.398423 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31
b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://49afdc26abd8b26ce41e283cf199c60265abdfa1b61cdc848a21804f868a8fe9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"message\\\":\\\"er-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.58:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4607c9b7-15f9-4ba0-86e5-0021ba7e4488}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:26.058609 6856 ovn.go:134] Ensuring zone local for Pod openshift-network-node-identity/network-node-identity-vrzqb in node crc\\\\nI0308 19:33:26.059828 6856 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0308 19:33:26.059841 6856 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0308 19:33:26.059613 6856 services_controller.go:360] Finished syncing service metrics on namespace openshift-console-operator for network=default : 2.135559ms\\\\nF0308 19:33:26.059798 6856 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 
ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.415765 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 
19:33:42 crc kubenswrapper[4885]: I0308 19:33:42.438860 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:42Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.092070 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/2.log" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.098711 4885 scope.go:117] "RemoveContainer" containerID="78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d" Mar 08 19:33:43 crc kubenswrapper[4885]: E0308 19:33:43.099028 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.131394 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\
\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/et
cd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.153405 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.172680 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.191591 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.207447 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.230087 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.252033 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.272258 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.290761 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 
19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.316579 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.332417 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.352196 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.372710 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.411449 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31
b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.430025 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.446436 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:43 crc kubenswrapper[4885]: I0308 19:33:43.467260 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:43Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:44 crc kubenswrapper[4885]: I0308 19:33:44.367333 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:44 crc kubenswrapper[4885]: I0308 19:33:44.367401 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:44 crc kubenswrapper[4885]: I0308 19:33:44.367483 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:44 crc kubenswrapper[4885]: E0308 19:33:44.367477 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:44 crc kubenswrapper[4885]: I0308 19:33:44.367577 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:44 crc kubenswrapper[4885]: E0308 19:33:44.367635 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:44 crc kubenswrapper[4885]: E0308 19:33:44.367717 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:44 crc kubenswrapper[4885]: E0308 19:33:44.367794 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:44 crc kubenswrapper[4885]: E0308 19:33:44.505299 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:33:45 crc kubenswrapper[4885]: I0308 19:33:45.959783 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:33:45 crc kubenswrapper[4885]: I0308 19:33:45.982379 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:45Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.002182 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:45Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.023842 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.045086 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.066999 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.086200 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.131374 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d
1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.158212 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.179619 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.201605 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.224234 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.241410 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.262871 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.284767 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.315568 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.337721 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.362278 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:46Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.367389 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.367443 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.367483 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:46 crc kubenswrapper[4885]: I0308 19:33:46.367391 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:46 crc kubenswrapper[4885]: E0308 19:33:46.367579 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:46 crc kubenswrapper[4885]: E0308 19:33:46.367729 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:46 crc kubenswrapper[4885]: E0308 19:33:46.367820 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:46 crc kubenswrapper[4885]: E0308 19:33:46.368005 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:48 crc kubenswrapper[4885]: I0308 19:33:48.367415 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:48 crc kubenswrapper[4885]: I0308 19:33:48.367485 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:48 crc kubenswrapper[4885]: E0308 19:33:48.367628 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:48 crc kubenswrapper[4885]: I0308 19:33:48.367723 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:48 crc kubenswrapper[4885]: E0308 19:33:48.367788 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:48 crc kubenswrapper[4885]: E0308 19:33:48.368037 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:48 crc kubenswrapper[4885]: I0308 19:33:48.368155 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:48 crc kubenswrapper[4885]: E0308 19:33:48.368307 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.402159 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.426101 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b
099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.444291 4885 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.462837 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.479449 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.498323 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: E0308 19:33:49.506112 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.525562 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.542681 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.573307 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.594553 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.618070 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.633508 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.653653 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.672797 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.693222 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.713156 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:49 crc kubenswrapper[4885]: I0308 19:33:49.732859 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:49Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:50 crc kubenswrapper[4885]: I0308 19:33:50.367125 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:50 crc kubenswrapper[4885]: I0308 19:33:50.367144 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:50 crc kubenswrapper[4885]: I0308 19:33:50.367242 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:50 crc kubenswrapper[4885]: I0308 19:33:50.367284 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:50 crc kubenswrapper[4885]: E0308 19:33:50.367514 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:50 crc kubenswrapper[4885]: E0308 19:33:50.367970 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:50 crc kubenswrapper[4885]: E0308 19:33:50.368233 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:50 crc kubenswrapper[4885]: E0308 19:33:50.368465 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:51 crc kubenswrapper[4885]: I0308 19:33:51.388431 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.269067 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.269142 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.269166 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.269196 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.269245 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:52Z","lastTransitionTime":"2026-03-08T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.290809 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:52Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.296390 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.296486 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.296510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.296533 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.296551 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:52Z","lastTransitionTime":"2026-03-08T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.313540 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:52Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.318667 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.318717 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.318736 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.318757 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.318773 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:52Z","lastTransitionTime":"2026-03-08T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.336731 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:52Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.342512 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.342569 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.342592 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.342623 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.342647 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:52Z","lastTransitionTime":"2026-03-08T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.363584 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:52Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.367053 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.367221 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.367257 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.367282 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.367389 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.367457 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.367516 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.367872 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.369292 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.369334 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.369350 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.369374 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.369392 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:33:52Z","lastTransitionTime":"2026-03-08T19:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:33:52 crc kubenswrapper[4885]: I0308 19:33:52.383150 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.390568 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:52Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:52 crc kubenswrapper[4885]: E0308 19:33:52.390787 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:33:54 crc kubenswrapper[4885]: I0308 19:33:54.367849 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:54 crc kubenswrapper[4885]: E0308 19:33:54.368079 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:54 crc kubenswrapper[4885]: I0308 19:33:54.368174 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:54 crc kubenswrapper[4885]: I0308 19:33:54.368264 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:54 crc kubenswrapper[4885]: I0308 19:33:54.368179 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:54 crc kubenswrapper[4885]: E0308 19:33:54.368381 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:54 crc kubenswrapper[4885]: E0308 19:33:54.368500 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:54 crc kubenswrapper[4885]: E0308 19:33:54.368596 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:54 crc kubenswrapper[4885]: E0308 19:33:54.507867 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:33:56 crc kubenswrapper[4885]: I0308 19:33:56.368014 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:56 crc kubenswrapper[4885]: I0308 19:33:56.368081 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:56 crc kubenswrapper[4885]: I0308 19:33:56.368092 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:56 crc kubenswrapper[4885]: I0308 19:33:56.368115 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:56 crc kubenswrapper[4885]: E0308 19:33:56.368227 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:56 crc kubenswrapper[4885]: E0308 19:33:56.368358 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:56 crc kubenswrapper[4885]: E0308 19:33:56.368579 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:56 crc kubenswrapper[4885]: E0308 19:33:56.368709 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:57 crc kubenswrapper[4885]: I0308 19:33:57.368656 4885 scope.go:117] "RemoveContainer" containerID="78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d" Mar 08 19:33:57 crc kubenswrapper[4885]: E0308 19:33:57.368971 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:33:58 crc kubenswrapper[4885]: I0308 19:33:58.367603 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:33:58 crc kubenswrapper[4885]: I0308 19:33:58.367665 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:33:58 crc kubenswrapper[4885]: I0308 19:33:58.367624 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:33:58 crc kubenswrapper[4885]: E0308 19:33:58.367798 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:33:58 crc kubenswrapper[4885]: I0308 19:33:58.367963 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:33:58 crc kubenswrapper[4885]: E0308 19:33:58.368020 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:33:58 crc kubenswrapper[4885]: E0308 19:33:58.367961 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:33:58 crc kubenswrapper[4885]: E0308 19:33:58.368121 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.384673 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 
2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.404997 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}
],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.420727 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.440321 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.460562 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.492033 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31
b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: E0308 19:33:59.508490 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.511806 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.543914 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reaso
n\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.564807 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.585061 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.606064 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.639791 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.664671 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b
099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.684531 4885 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.703467 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.720002 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.743040 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.763168 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:33:59 crc kubenswrapper[4885]: I0308 19:33:59.780474 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:33:59Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:00 crc kubenswrapper[4885]: I0308 19:34:00.367782 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:00 crc kubenswrapper[4885]: I0308 19:34:00.367866 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:00 crc kubenswrapper[4885]: I0308 19:34:00.367808 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:00 crc kubenswrapper[4885]: I0308 19:34:00.367810 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:00 crc kubenswrapper[4885]: E0308 19:34:00.368054 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:00 crc kubenswrapper[4885]: E0308 19:34:00.368156 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:00 crc kubenswrapper[4885]: E0308 19:34:00.368266 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:00 crc kubenswrapper[4885]: E0308 19:34:00.368345 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.368107 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.368162 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.368165 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.368220 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.368314 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.368539 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.368730 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.368843 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.398462 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.398518 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.398535 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.398556 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.398575 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:02Z","lastTransitionTime":"2026-03-08T19:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.419649 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:02Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.425138 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.425198 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.425215 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.425239 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.425259 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:02Z","lastTransitionTime":"2026-03-08T19:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.445235 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:02Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.450467 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.450517 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.450534 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.450558 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.450574 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:02Z","lastTransitionTime":"2026-03-08T19:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.470757 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:02Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.476035 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.476097 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.476116 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.476141 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.476161 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:02Z","lastTransitionTime":"2026-03-08T19:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.496111 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:02Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.501044 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.501097 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.501115 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.501138 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:02 crc kubenswrapper[4885]: I0308 19:34:02.501154 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:02Z","lastTransitionTime":"2026-03-08T19:34:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.520071 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:02Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:02 crc kubenswrapper[4885]: E0308 19:34:02.520304 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:34:04 crc kubenswrapper[4885]: I0308 19:34:04.368193 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:04 crc kubenswrapper[4885]: I0308 19:34:04.368424 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:04 crc kubenswrapper[4885]: E0308 19:34:04.368469 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:04 crc kubenswrapper[4885]: E0308 19:34:04.368711 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:04 crc kubenswrapper[4885]: I0308 19:34:04.368838 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:04 crc kubenswrapper[4885]: E0308 19:34:04.369002 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:04 crc kubenswrapper[4885]: I0308 19:34:04.369402 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:04 crc kubenswrapper[4885]: E0308 19:34:04.369539 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:04 crc kubenswrapper[4885]: E0308 19:34:04.510578 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
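The repeated "Failed to update status for pod" and "Unable to update node status" entries above and below all trace back to one condition: the kubelet's PATCH calls are rejected because the network-node-identity webhook serving https://127.0.0.1:9743 presents a certificate whose notAfter (2025-08-24T17:21:41Z) is earlier than the node clock (2026-03-08). The surrounding "no CNI configuration file in /etc/kubernetes/cni/net.d/" errors are a separate symptom of ovn-kubernetes not yet being up. A minimal sketch for confirming the certificate window from the node itself follows; it is not part of the log, and it assumes local access to the node where the 127.0.0.1:9743 endpoint taken from the error text is reachable.

    // Minimal sketch (not from the log): dial the webhook endpoint named in the
    // kubelet errors and print the serving certificate's validity window, to
    // confirm the "certificate has expired" failure against the node clock.
    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
        "time"
    )

    func main() {
        // Address taken from the error text in the log; adjust if the webhook
        // listens elsewhere on your node.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
            InsecureSkipVerify: true, // we want to read the cert even though it is expired
        })
        if err != nil {
            log.Fatalf("dial webhook endpoint: %v", err)
        }
        defer conn.Close()

        state := conn.ConnectionState()
        if len(state.PeerCertificates) == 0 {
            log.Fatal("no peer certificate presented")
        }
        cert := state.PeerCertificates[0]
        fmt.Printf("subject:   %s\n", cert.Subject)
        fmt.Printf("notBefore: %s\n", cert.NotBefore.UTC().Format(time.RFC3339))
        fmt.Printf("notAfter:  %s\n", cert.NotAfter.UTC().Format(time.RFC3339))
        fmt.Printf("expired:   %v\n", time.Now().After(cert.NotAfter))
    }

InsecureSkipVerify is deliberate here: the goal is only to read the presented certificate's validity window, not to trust the connection.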
Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.200292 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/0.log" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.200384 4885 generic.go:334] "Generic (PLEG): container finished" podID="9ac72c25-d3e6-4dda-8444-6cd4442af7e4" containerID="578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6" exitCode=1 Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.200433 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerDied","Data":"578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6"} Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.201131 4885 scope.go:117] "RemoveContainer" containerID="578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.223127 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.243014 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.244897 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.244981 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.245017 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.245067 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245286 4885 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245325 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245346 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245411 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:35:10.24539066 +0000 UTC m=+211.641444713 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245685 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245748 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:35:10.245734239 +0000 UTC m=+211.641788292 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245820 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245987 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:35:10.245952044 +0000 UTC m=+211.642006097 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.245990 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.246093 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.246343 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.246458 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:35:10.246429998 +0000 UTC m=+211.642484051 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.269843 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.286732 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.307050 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.327899 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.346047 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.346242 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.346421 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:10.346381007 +0000 UTC m=+211.742435060 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.346648 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.346825 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:35:10.346784187 +0000 UTC m=+211.742838290 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.362149 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnl
y\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"contai
nerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.367213 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.367223 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.367359 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.367553 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.367585 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.367721 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.367969 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:06 crc kubenswrapper[4885]: E0308 19:34:06.368031 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.381832 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257
453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.405541 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e4
0792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{
\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada4
1f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.424179 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.443452 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resour
ces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.465835 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.487046 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.514195 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon 
started\\\\n2026-03-08T19:33:20Z [verbose] Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.535556 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.555047 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.567263 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.591254 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:06 crc kubenswrapper[4885]: I0308 19:34:06.609500 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b
099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:06Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.208662 4885 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/0.log" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.208752 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerStarted","Data":"f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d"} Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.232486 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8
e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.252610 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.270172 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.283312 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.315768 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.338694 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.354234 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.376887 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.395796 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.411708 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.437999 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.458445 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.482617 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31
b2f389a04ce28d2126d1f81d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.500965 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.523489 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.539723 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.561210 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.581698 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:07 crc kubenswrapper[4885]: I0308 19:34:07.604085 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:07Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:08 crc kubenswrapper[4885]: I0308 19:34:08.367872 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:08 crc kubenswrapper[4885]: I0308 19:34:08.368010 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:08 crc kubenswrapper[4885]: E0308 19:34:08.368061 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:08 crc kubenswrapper[4885]: I0308 19:34:08.368121 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:08 crc kubenswrapper[4885]: I0308 19:34:08.368121 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:08 crc kubenswrapper[4885]: E0308 19:34:08.369070 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:08 crc kubenswrapper[4885]: E0308 19:34:08.369228 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:08 crc kubenswrapper[4885]: E0308 19:34:08.369380 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:08 crc kubenswrapper[4885]: I0308 19:34:08.369607 4885 scope.go:117] "RemoveContainer" containerID="78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.220615 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/2.log" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.225088 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000"} Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.225777 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.245593 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.268375 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.289646 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.308069 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.323668 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.361600 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.384223 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b
099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.398883 4885 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.422208 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.437396 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.451307 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.468459 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.489995 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: E0308 19:34:09.511161 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.518819 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226
c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"co
ntainerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.539587 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 
19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.559637 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.575775 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.596758 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.614187 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.639947 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9009227
2e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.662958 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.679900 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.697595 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.712188 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.732472 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.754615 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.771904 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.791453 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.805850 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.821594 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.837564 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.857355 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.875881 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.906508 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe
3874fc218276f06da4960000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"co
ntainerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.926034 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 
19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.944005 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.965418 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:09 crc kubenswrapper[4885]: I0308 19:34:09.984102 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:09Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.232058 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/3.log" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.233475 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/2.log" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.237553 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000" exitCode=1 Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.237638 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000"} Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.237703 4885 scope.go:117] "RemoveContainer" containerID="78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.239042 4885 scope.go:117] "RemoveContainer" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000" Mar 08 19:34:10 crc kubenswrapper[4885]: E0308 19:34:10.239569 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.261996 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.283613 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.301612 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.323267 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.356994 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78c8579596f15f2f651895876bea75dd1d7c3d31b2f389a04ce28d2126d1f81d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:33:41Z\\\",\\\"message\\\":\\\"tocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:33:41.401883 7105 obj_retry.go:409] Going to retry *v1.Pod resource setup for 13 objects: [openshift-image-registry/node-ca-57qch openshift-multus/multus-ff7b4 openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-network-diagnostics/network-check-target-xd92c openshift-machine-config-operator/machine-config-daemon-ttb97 openshift-multus/network-metrics-daemon-jps4r openshift-kube-controller-manager/kube-controller-manager-crc openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-multus/multus-additional-cni-plugins-25vxd openshift-network-operator/iptables-alerter-4ln5h openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt openshift-kube-apiserver/kube-apiserver-crc]\\\\nF0308 19:33:41.401901 7105 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller ini\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:09Z\\\",\\\"message\\\":\\\"p:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service 
k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:34:09.360032 7381 services_controller.go:452] Built service openshift-kube-apiserver/apiserver per-node LB for network=default: []services.LB{}\\\\nF0308 19:34:09.360037 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for node Informer during admin network policy controller initialization, handler {0x1fcb760 0x1fcb440 0x1fcb3e0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:34:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.367963 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.368036 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.368048 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:10 crc kubenswrapper[4885]: E0308 19:34:10.368124 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.368161 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:10 crc kubenswrapper[4885]: E0308 19:34:10.368369 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:10 crc kubenswrapper[4885]: E0308 19:34:10.368469 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:10 crc kubenswrapper[4885]: E0308 19:34:10.368604 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.377594 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.403770 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f881927
73678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.421718 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.440675 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\
\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.457524 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\
"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.478185 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.498041 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.520177 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.540529 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.556663 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.588707 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.610505 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b
099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.628460 4885 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:10 crc kubenswrapper[4885]: I0308 19:34:10.646059 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:10Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.244425 4885 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/3.log" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.249870 4885 scope.go:117] "RemoveContainer" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000" Mar 08 19:34:11 crc kubenswrapper[4885]: E0308 19:34:11.250191 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.270302 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.291603 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.314507 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.334054 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.354428 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.370524 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.402080 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.425484 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b
099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.441297 4885 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.459880 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.480511 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.495723 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.515720 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.535767 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.566900 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:09Z\\\",\\\"message\\\":\\\"p:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:34:09.360032 7381 services_controller.go:452] Built service openshift-kube-apiserver/apiserver per-node LB for network=default: []services.LB{}\\\\nF0308 19:34:09.360037 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for node Informer during admin network policy controller initialization, handler {0x1fcb760 0x1fcb440 0x1fcb3e0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:34:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.584408 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.607285 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.623917 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:11 crc kubenswrapper[4885]: I0308 19:34:11.641472 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:11Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.367528 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.367704 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.368229 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.368434 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.368529 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.368913 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.369014 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.369129 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.809380 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.810549 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.810598 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.810638 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.810664 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:12Z","lastTransitionTime":"2026-03-08T19:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.834379 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:12Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.840445 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.840701 4885 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.840858 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.841053 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.841183 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:12Z","lastTransitionTime":"2026-03-08T19:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.861257 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:12Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.866497 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.866536 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.866558 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.866584 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.866602 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:12Z","lastTransitionTime":"2026-03-08T19:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.886106 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:12Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.891167 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.891244 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.891268 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.891299 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.891321 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:12Z","lastTransitionTime":"2026-03-08T19:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.912295 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:12Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.916813 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.916878 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.916896 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.916957 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:12 crc kubenswrapper[4885]: I0308 19:34:12.916977 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:12Z","lastTransitionTime":"2026-03-08T19:34:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.936865 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:12Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:12 crc kubenswrapper[4885]: E0308 19:34:12.937135 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:34:14 crc kubenswrapper[4885]: I0308 19:34:14.367953 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:14 crc kubenswrapper[4885]: I0308 19:34:14.368008 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:14 crc kubenswrapper[4885]: I0308 19:34:14.367974 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:14 crc kubenswrapper[4885]: I0308 19:34:14.368076 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:14 crc kubenswrapper[4885]: E0308 19:34:14.368214 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:14 crc kubenswrapper[4885]: E0308 19:34:14.368386 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:14 crc kubenswrapper[4885]: E0308 19:34:14.368586 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:14 crc kubenswrapper[4885]: E0308 19:34:14.368760 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:14 crc kubenswrapper[4885]: E0308 19:34:14.512976 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:34:16 crc kubenswrapper[4885]: I0308 19:34:16.367589 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:16 crc kubenswrapper[4885]: I0308 19:34:16.367622 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:16 crc kubenswrapper[4885]: I0308 19:34:16.367695 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:16 crc kubenswrapper[4885]: I0308 19:34:16.367710 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:16 crc kubenswrapper[4885]: E0308 19:34:16.367792 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:16 crc kubenswrapper[4885]: E0308 19:34:16.367948 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:16 crc kubenswrapper[4885]: E0308 19:34:16.368042 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:16 crc kubenswrapper[4885]: E0308 19:34:16.368194 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:18 crc kubenswrapper[4885]: I0308 19:34:18.368030 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:18 crc kubenswrapper[4885]: I0308 19:34:18.368051 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:18 crc kubenswrapper[4885]: I0308 19:34:18.368123 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:18 crc kubenswrapper[4885]: E0308 19:34:18.368472 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:18 crc kubenswrapper[4885]: E0308 19:34:18.368746 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:18 crc kubenswrapper[4885]: I0308 19:34:18.368971 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:18 crc kubenswrapper[4885]: E0308 19:34:18.369046 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:18 crc kubenswrapper[4885]: E0308 19:34:18.369108 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.389693 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.405675 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.424294 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.444684 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.479067 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe
3874fc218276f06da4960000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:09Z\\\",\\\"message\\\":\\\"p:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:34:09.360032 7381 services_controller.go:452] Built service openshift-kube-apiserver/apiserver per-node LB for network=default: []services.LB{}\\\\nF0308 19:34:09.360037 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for node Informer during admin network policy controller initialization, handler {0x1fcb760 0x1fcb440 0x1fcb3e0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:34:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.500187 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: E0308 19:34:19.514044 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.526373 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.542549 4885 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.561555 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.579392 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.598840 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.622422 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.643166 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.660721 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.678544 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.712017 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.733972 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.753052 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:19 crc kubenswrapper[4885]: I0308 19:34:19.775477 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:19Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:20 crc kubenswrapper[4885]: I0308 19:34:20.368071 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:20 crc kubenswrapper[4885]: I0308 19:34:20.368192 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:20 crc kubenswrapper[4885]: I0308 19:34:20.368199 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:20 crc kubenswrapper[4885]: I0308 19:34:20.368390 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:20 crc kubenswrapper[4885]: E0308 19:34:20.368559 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:20 crc kubenswrapper[4885]: E0308 19:34:20.368682 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:20 crc kubenswrapper[4885]: E0308 19:34:20.368809 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:20 crc kubenswrapper[4885]: E0308 19:34:20.369083 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:22 crc kubenswrapper[4885]: I0308 19:34:22.367180 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:22 crc kubenswrapper[4885]: I0308 19:34:22.367180 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:22 crc kubenswrapper[4885]: E0308 19:34:22.367765 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:22 crc kubenswrapper[4885]: I0308 19:34:22.367349 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:22 crc kubenswrapper[4885]: I0308 19:34:22.367330 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:22 crc kubenswrapper[4885]: E0308 19:34:22.368039 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:22 crc kubenswrapper[4885]: E0308 19:34:22.368224 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:22 crc kubenswrapper[4885]: E0308 19:34:22.368392 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.070033 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.070102 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.070126 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.070158 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.070182 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:23Z","lastTransitionTime":"2026-03-08T19:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:23 crc kubenswrapper[4885]: E0308 19:34:23.091392 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.096616 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.096672 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.096692 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.096718 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.096737 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:23Z","lastTransitionTime":"2026-03-08T19:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:23 crc kubenswrapper[4885]: E0308 19:34:23.117793 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.123419 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.123484 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.123510 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.123545 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.123569 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:23Z","lastTransitionTime":"2026-03-08T19:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:23 crc kubenswrapper[4885]: E0308 19:34:23.145283 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.151605 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.151666 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.151682 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.151711 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.151730 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:23Z","lastTransitionTime":"2026-03-08T19:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:23 crc kubenswrapper[4885]: E0308 19:34:23.173745 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.179160 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.179235 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.179252 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.179277 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:23 crc kubenswrapper[4885]: I0308 19:34:23.179293 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:23Z","lastTransitionTime":"2026-03-08T19:34:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:23 crc kubenswrapper[4885]: E0308 19:34:23.211806 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:23Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:23Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:23 crc kubenswrapper[4885]: E0308 19:34:23.212059 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:34:24 crc kubenswrapper[4885]: I0308 19:34:24.367316 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:24 crc kubenswrapper[4885]: I0308 19:34:24.367371 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:24 crc kubenswrapper[4885]: E0308 19:34:24.367500 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:24 crc kubenswrapper[4885]: I0308 19:34:24.367565 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:24 crc kubenswrapper[4885]: E0308 19:34:24.367644 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:24 crc kubenswrapper[4885]: I0308 19:34:24.367686 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:24 crc kubenswrapper[4885]: E0308 19:34:24.368123 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:24 crc kubenswrapper[4885]: E0308 19:34:24.368267 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:24 crc kubenswrapper[4885]: I0308 19:34:24.368778 4885 scope.go:117] "RemoveContainer" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000" Mar 08 19:34:24 crc kubenswrapper[4885]: E0308 19:34:24.369110 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:34:24 crc kubenswrapper[4885]: E0308 19:34:24.516123 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 19:34:26 crc kubenswrapper[4885]: I0308 19:34:26.367160 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:26 crc kubenswrapper[4885]: I0308 19:34:26.367222 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:26 crc kubenswrapper[4885]: I0308 19:34:26.367225 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:26 crc kubenswrapper[4885]: I0308 19:34:26.367160 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:26 crc kubenswrapper[4885]: E0308 19:34:26.367418 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:26 crc kubenswrapper[4885]: E0308 19:34:26.367548 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:26 crc kubenswrapper[4885]: E0308 19:34:26.367623 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:26 crc kubenswrapper[4885]: E0308 19:34:26.367701 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:28 crc kubenswrapper[4885]: I0308 19:34:28.367141 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:28 crc kubenswrapper[4885]: I0308 19:34:28.367205 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:28 crc kubenswrapper[4885]: I0308 19:34:28.367275 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:28 crc kubenswrapper[4885]: I0308 19:34:28.367156 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:28 crc kubenswrapper[4885]: E0308 19:34:28.367399 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:28 crc kubenswrapper[4885]: E0308 19:34:28.367505 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:28 crc kubenswrapper[4885]: E0308 19:34:28.367607 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:28 crc kubenswrapper[4885]: E0308 19:34:28.367693 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.402307 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f93cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84
d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.426546 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.448163 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.468899 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.488913 4885 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.510468 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: E0308 19:34:29.517352 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.545679 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.561624 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.578224 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.597759 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.614462 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.632675 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.649666 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.675343 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe
3874fc218276f06da4960000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:09Z\\\",\\\"message\\\":\\\"p:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:34:09.360032 7381 services_controller.go:452] Built service openshift-kube-apiserver/apiserver per-node LB for network=default: []services.LB{}\\\\nF0308 19:34:09.360037 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for node Informer during admin network policy controller initialization, handler {0x1fcb760 0x1fcb440 0x1fcb3e0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:34:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.690340 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.713257 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.735486 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.757212 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:29 crc kubenswrapper[4885]: I0308 19:34:29.773502 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:29Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:30 crc kubenswrapper[4885]: I0308 19:34:30.367871 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:30 crc kubenswrapper[4885]: I0308 19:34:30.368019 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:30 crc kubenswrapper[4885]: I0308 19:34:30.368109 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:30 crc kubenswrapper[4885]: I0308 19:34:30.368312 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:30 crc kubenswrapper[4885]: E0308 19:34:30.368288 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:30 crc kubenswrapper[4885]: E0308 19:34:30.368441 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:30 crc kubenswrapper[4885]: E0308 19:34:30.368594 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:30 crc kubenswrapper[4885]: E0308 19:34:30.368822 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:32 crc kubenswrapper[4885]: I0308 19:34:32.367487 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:32 crc kubenswrapper[4885]: I0308 19:34:32.367546 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:32 crc kubenswrapper[4885]: I0308 19:34:32.367628 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:32 crc kubenswrapper[4885]: I0308 19:34:32.367735 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:32 crc kubenswrapper[4885]: E0308 19:34:32.367732 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:32 crc kubenswrapper[4885]: E0308 19:34:32.367902 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:32 crc kubenswrapper[4885]: E0308 19:34:32.368062 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:32 crc kubenswrapper[4885]: E0308 19:34:32.368192 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.589883 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.589986 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.590005 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.590032 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.590049 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:33Z","lastTransitionTime":"2026-03-08T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:33 crc kubenswrapper[4885]: E0308 19:34:33.613203 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.619120 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.619174 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.619192 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.619215 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.619233 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:33Z","lastTransitionTime":"2026-03-08T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:33 crc kubenswrapper[4885]: E0308 19:34:33.641334 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.655056 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.655134 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.655155 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.655189 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.655213 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:33Z","lastTransitionTime":"2026-03-08T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:33 crc kubenswrapper[4885]: E0308 19:34:33.678838 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.684841 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.684903 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.684969 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.684997 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.685024 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:33Z","lastTransitionTime":"2026-03-08T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:33 crc kubenswrapper[4885]: E0308 19:34:33.708499 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.714365 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.714439 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.714460 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.714488 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:33 crc kubenswrapper[4885]: I0308 19:34:33.714507 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:33Z","lastTransitionTime":"2026-03-08T19:34:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:33 crc kubenswrapper[4885]: E0308 19:34:33.737203 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:33Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4c2a725e-e9fd-471d-962e-34eaf38ef5ae\\\",\\\"systemUUID\\\":\\\"7aa01b7c-4329-4abc-97e1-626c363cfaee\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:33Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:33 crc kubenswrapper[4885]: E0308 19:34:33.737432 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:34:34 crc kubenswrapper[4885]: I0308 19:34:34.367567 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:34 crc kubenswrapper[4885]: I0308 19:34:34.367644 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:34 crc kubenswrapper[4885]: I0308 19:34:34.368153 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:34 crc kubenswrapper[4885]: E0308 19:34:34.368334 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:34 crc kubenswrapper[4885]: I0308 19:34:34.368396 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:34 crc kubenswrapper[4885]: E0308 19:34:34.368541 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:34 crc kubenswrapper[4885]: E0308 19:34:34.368601 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:34 crc kubenswrapper[4885]: E0308 19:34:34.368686 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:34 crc kubenswrapper[4885]: E0308 19:34:34.519128 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 19:34:35 crc kubenswrapper[4885]: I0308 19:34:35.368914 4885 scope.go:117] "RemoveContainer" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000" Mar 08 19:34:35 crc kubenswrapper[4885]: E0308 19:34:35.369281 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:34:36 crc kubenswrapper[4885]: I0308 19:34:36.367505 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:36 crc kubenswrapper[4885]: I0308 19:34:36.367565 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:36 crc kubenswrapper[4885]: I0308 19:34:36.367630 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:36 crc kubenswrapper[4885]: E0308 19:34:36.367707 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:36 crc kubenswrapper[4885]: I0308 19:34:36.367604 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:36 crc kubenswrapper[4885]: E0308 19:34:36.367849 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:36 crc kubenswrapper[4885]: E0308 19:34:36.368096 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:36 crc kubenswrapper[4885]: E0308 19:34:36.368255 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:38 crc kubenswrapper[4885]: I0308 19:34:38.367122 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:38 crc kubenswrapper[4885]: I0308 19:34:38.367213 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:38 crc kubenswrapper[4885]: E0308 19:34:38.367324 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:38 crc kubenswrapper[4885]: I0308 19:34:38.367380 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:38 crc kubenswrapper[4885]: I0308 19:34:38.367413 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:38 crc kubenswrapper[4885]: E0308 19:34:38.367552 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:38 crc kubenswrapper[4885]: E0308 19:34:38.367956 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:38 crc kubenswrapper[4885]: E0308 19:34:38.368154 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.393215 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.413016 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5e4f1f6cfb7c2169a53f2f593037d5ba05a50a6d3ec4b4fa5cbc698f6f74a6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c34a01fda27ff6410051743012dd0ab233ba4249a63b735641247c8bbb621db2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.436582 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dedec2a4-d864-4f30-8a2d-b3168817ea34\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:09Z\\\",\\\"message\\\":\\\"p:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-route-controller-manager/route-controller-manager]} name:Service_openshift-route-controller-manager/route-controller-manager_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.239:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {18746a4d-8a63-458a-b7e3-8fb89ff95fc0}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0308 19:34:09.360032 7381 services_controller.go:452] Built service openshift-kube-apiserver/apiserver per-node LB for network=default: []services.LB{}\\\\nF0308 19:34:09.360037 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for node Informer during admin network policy controller initialization, handler {0x1fcb760 0x1fcb440 0x1fcb3e0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed c\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:34:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5mlvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bssfh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.449482 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b33b5271-bda3-41ca-81a3-d47fff657c27\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://817201f4cb1771a6596eabe8b5b8cc2d17c496ad60b557e15510425fca4b9f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f860ecfa439e093c313bb98b51d2cb22c45022d96f92e1e81f83a21a8b7baff1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bpng\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-t2brt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.470603 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-25vxd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ac600107-0c97-4ec8-89f6-598b40c166ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://964670ba00a67f96120302b888b176283e4da5479206c29e8c45772c70907804\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://219ff1cfc15b380dd3eb412b9a1a67ae6572eeb9eb507aa77c0ceb57a91fa152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e0d1f10fd46358b93c0f630f0d3814663e300c7b6619e8374b4534fa5ef173ca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a78ee2773e8636b4d2ad7f8db45deff0fd0a5ea284e7fb83aac1b2791c810c3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe421efa191fe251efc5223a13fe508a9ec4f88192773678d7afa989a734feeb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63125f35fa5e22c477386bae8fe8a35d1c70246b0e6ef216b93a907672f83fa0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://309cf124ecd7bb2251884a0310f6b33960b82daac6e24ada41f2cfb052721ad5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:33:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2zkpk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-25vxd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.487275 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0d6c934-6816-4bb9-b965-615136eb944a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fd305b60303968950907b7705e566b3616f2dc67ba8d17024519ca2158494b5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4bf2ac51a9d8a27b0b4af4441db34854d8f84737006cc713f138360e625fb42a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.503865 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"762cbe5e-c6a8-4825-a7db-b2afc3dedba4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c007a13651a14e0aa4788ead8b4f3c4c463eaf7eaa4bca3eba0e5f3b3dcc5f39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd0f12650c69c394cb46ab313c5b210fb4cc7d59e8dd098fdd50c1728479fa98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a92d9d3c6568e480aec258fcb42dad9b4107c2303a1ba4f4b3fbcf485bb79178\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63fb25b7793deacfaeffa20b390efa271d9d6af634a9c771a1b594029f9d3135\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.517294 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-w5lms" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bc890659-71a7-4024-bae6-e1e1ef563f17\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b5fbb02ada03e7372a57d5db05f2b222bc01895eea8d645744c6d330b1ffc3e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d6kv8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-w5lms\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: E0308 19:34:39.520092 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.536897 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.557125 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-ff7b4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ac72c25-d3e6-4dda-8444-6cd4442af7e4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:34:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-08T19:34:05Z\\\",\\\"message\\\":\\\"2026-03-08T19:33:20+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a\\\\n2026-03-08T19:33:20+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_565703d2-7168-4ca3-be3a-3215be34bb1a to /host/opt/cni/bin/\\\\n2026-03-08T19:33:20Z [verbose] multus-daemon started\\\\n2026-03-08T19:33:20Z [verbose] Readiness Indicator file check\\\\n2026-03-08T19:34:05Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:34:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7pllt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"multus-ff7b4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.575157 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.590332 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a93a8db36804092275e145715564c5676f19a335a794ef3c5c170bb0505f0dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-njr92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ttb97\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.605893 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-jps4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f639c4e-64b8-45e9-bf33-c1d8c376b438\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mr8jn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-jps4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.636276 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b0c3bb8f-9230-4618-99eb-ada667486e50\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0decb416293ab0e168aebdf979d22e925b293a137280c1183487c563edb953af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://21200b346675ec470e590055aedd7e482393af72e99aa3c4ec9668527944f286\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96ff9f34797976aa5604427ce95f753f989b4f212b093c432a6702662f0c305f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c986695e49bc306202033299d7aa0dc5a804f9
3cb3f29f8ae33b1ecbc430569\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ce38514e27aa02e90fac455686ab4062c34cb0286bb9367da2c6d992f48c06c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://396b778b306704439878d8c1267bd27f995aa4facffb7586f09515fe56eca975\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b3a6676cc6c1f6929f12655a0c84d2c3041e53d1c26b5e3d68d146dbf21e95e6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41cbddd8d2bfd5da8d10ad7d7877aeaf2380dc2fe0785be9da48905259809533\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.655677 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cf7f4f5dae3e770f80ca054b5b4624b
099012cc0fa05d1c0c901502f94ef1d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-08T19:32:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0308 19:32:51.905054 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0308 19:32:51.905253 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0308 19:32:51.906171 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1634696136/tls.crt::/tmp/serving-cert-1634696136/tls.key\\\\\\\"\\\\nI0308 19:32:52.127380 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0308 19:32:52.129201 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0308 19:32:52.129250 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0308 19:32:52.129301 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0308 19:32:52.129325 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0308 19:32:52.133373 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0308 19:32:52.133413 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133425 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0308 19:32:52.133443 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0308 19:32:52.133452 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0308 19:32:52.133460 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0308 19:32:52.133468 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0308 19:32:52.133384 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0308 19:32:52.136070 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-08T19:32:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:42Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-08T19:31:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.669099 4885 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c8f7e88f55f722657a19f425f4f446c0325e21fcbb1f8ddbd300cb074e39624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.688394 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"07f98b07-bd4c-42b5-ac2c-d750a316d9e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:32:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:31:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72e234a8a3d25e039161c0616cc8e92e364e9bc54149def0887203e2bfe18b6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa09344c4e65eaf28bcd07bcda27b08304bc036aa2a4f36636dc6dc19203465b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://539587095c50cf04a7a91aea29c9af82a58df760490dcc5f12eaa3b11b11f917\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:31:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:31:39Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.709325 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:16Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b24c7d868da795c8441df17efcdb6693684374f818d359d3452bd262242f44d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:39 crc kubenswrapper[4885]: I0308 19:34:39.723209 4885 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-57qch" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0cfac2d6-6888-4b2d-982e-826f583396e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T19:33:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a3383019481fd384582e5f78d03b594be2f7f20735cf6a78fc2ff655943ade\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-08T19:33:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-r95ct\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T19:33:02Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-57qch\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-08T19:34:39Z is after 2025-08-24T17:21:41Z" Mar 08 19:34:40 crc kubenswrapper[4885]: I0308 19:34:40.367403 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:40 crc kubenswrapper[4885]: I0308 19:34:40.367689 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:40 crc kubenswrapper[4885]: I0308 19:34:40.367748 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:40 crc kubenswrapper[4885]: I0308 19:34:40.367689 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:40 crc kubenswrapper[4885]: E0308 19:34:40.367894 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:40 crc kubenswrapper[4885]: E0308 19:34:40.368085 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:40 crc kubenswrapper[4885]: E0308 19:34:40.368689 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:40 crc kubenswrapper[4885]: E0308 19:34:40.368852 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:42 crc kubenswrapper[4885]: I0308 19:34:42.367216 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:42 crc kubenswrapper[4885]: I0308 19:34:42.367444 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:42 crc kubenswrapper[4885]: E0308 19:34:42.367739 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:42 crc kubenswrapper[4885]: I0308 19:34:42.368185 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:42 crc kubenswrapper[4885]: I0308 19:34:42.368208 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:42 crc kubenswrapper[4885]: E0308 19:34:42.368358 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:42 crc kubenswrapper[4885]: E0308 19:34:42.368618 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:42 crc kubenswrapper[4885]: E0308 19:34:42.368705 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.034560 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.034617 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.034637 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.034662 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.034680 4885 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-08T19:34:44Z","lastTransitionTime":"2026-03-08T19:34:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.111313 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh"] Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.111979 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.115645 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.115952 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.116221 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.119127 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.186292 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/07d4b007-1aec-465c-a5db-92efaa4defbe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.186382 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ff7b4" podStartSLOduration=149.186344079 podStartE2EDuration="2m29.186344079s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.165147566 +0000 UTC m=+185.561201629" watchObservedRunningTime="2026-03-08 19:34:44.186344079 +0000 UTC m=+185.582398132" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.186673 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d4b007-1aec-465c-a5db-92efaa4defbe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.186868 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07d4b007-1aec-465c-a5db-92efaa4defbe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.186968 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/07d4b007-1aec-465c-a5db-92efaa4defbe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.187030 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07d4b007-1aec-465c-a5db-92efaa4defbe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: 
\"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.213186 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podStartSLOduration=149.213156729 podStartE2EDuration="2m29.213156729s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.212776949 +0000 UTC m=+185.608830982" watchObservedRunningTime="2026-03-08 19:34:44.213156729 +0000 UTC m=+185.609210782" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.288112 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/07d4b007-1aec-465c-a5db-92efaa4defbe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.288188 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d4b007-1aec-465c-a5db-92efaa4defbe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.288235 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07d4b007-1aec-465c-a5db-92efaa4defbe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.288283 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/07d4b007-1aec-465c-a5db-92efaa4defbe-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.288338 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/07d4b007-1aec-465c-a5db-92efaa4defbe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.288382 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/07d4b007-1aec-465c-a5db-92efaa4defbe-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.288394 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/07d4b007-1aec-465c-a5db-92efaa4defbe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.289560 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07d4b007-1aec-465c-a5db-92efaa4defbe-service-ca\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.296104 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d4b007-1aec-465c-a5db-92efaa4defbe-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.327731 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=100.32770924 podStartE2EDuration="1m40.32770924s" podCreationTimestamp="2026-03-08 19:33:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.296560426 +0000 UTC m=+185.692614509" watchObservedRunningTime="2026-03-08 19:34:44.32770924 +0000 UTC m=+185.723763273" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.327998 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=97.327992738 podStartE2EDuration="1m37.327992738s" podCreationTimestamp="2026-03-08 19:33:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.32735488 +0000 UTC m=+185.723408903" watchObservedRunningTime="2026-03-08 19:34:44.327992738 +0000 UTC m=+185.724046771" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.334733 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07d4b007-1aec-465c-a5db-92efaa4defbe-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-s2klh\" (UID: \"07d4b007-1aec-465c-a5db-92efaa4defbe\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.361592 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=70.361573955 podStartE2EDuration="1m10.361573955s" podCreationTimestamp="2026-03-08 19:33:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.361101953 +0000 UTC m=+185.757155976" watchObservedRunningTime="2026-03-08 19:34:44.361573955 +0000 UTC m=+185.757627988" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.367856 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:44 crc kubenswrapper[4885]: E0308 19:34:44.367999 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.368198 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.368281 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.368345 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:44 crc kubenswrapper[4885]: E0308 19:34:44.368387 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:44 crc kubenswrapper[4885]: E0308 19:34:44.368654 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:44 crc kubenswrapper[4885]: E0308 19:34:44.368832 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.396974 4885 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.402497 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-57qch" podStartSLOduration=149.402470452 podStartE2EDuration="2m29.402470452s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.401114947 +0000 UTC m=+185.797168970" watchObservedRunningTime="2026-03-08 19:34:44.402470452 +0000 UTC m=+185.798524515" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.409223 4885 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.433257 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" Mar 08 19:34:44 crc kubenswrapper[4885]: W0308 19:34:44.451812 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07d4b007_1aec_465c_a5db_92efaa4defbe.slice/crio-819395f9bd1c9edfe8385fdc42ceda224dc8de4c2cbdc2b48a57031866536c6e WatchSource:0}: Error finding container 819395f9bd1c9edfe8385fdc42ceda224dc8de4c2cbdc2b48a57031866536c6e: Status 404 returned error can't find the container with id 819395f9bd1c9edfe8385fdc42ceda224dc8de4c2cbdc2b48a57031866536c6e Mar 08 19:34:44 crc kubenswrapper[4885]: E0308 19:34:44.521345 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.533823 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-t2brt" podStartSLOduration=148.533789861 podStartE2EDuration="2m28.533789861s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.504156418 +0000 UTC m=+185.900210511" watchObservedRunningTime="2026-03-08 19:34:44.533789861 +0000 UTC m=+185.929843924" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.534264 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-25vxd" podStartSLOduration=149.534254823 podStartE2EDuration="2m29.534254823s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.534004457 +0000 UTC m=+185.930058520" watchObservedRunningTime="2026-03-08 19:34:44.534254823 +0000 UTC m=+185.930308886" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.552223 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=52.552183961 podStartE2EDuration="52.552183961s" podCreationTimestamp="2026-03-08 19:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.552020347 +0000 UTC m=+185.948074370" watchObservedRunningTime="2026-03-08 19:34:44.552183961 +0000 UTC m=+185.948238014" Mar 08 19:34:44 crc kubenswrapper[4885]: I0308 19:34:44.589901 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=53.589863025 podStartE2EDuration="53.589863025s" podCreationTimestamp="2026-03-08 19:33:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.572168583 +0000 UTC m=+185.968222646" watchObservedRunningTime="2026-03-08 19:34:44.589863025 +0000 UTC m=+185.985917078" Mar 08 19:34:45 crc kubenswrapper[4885]: I0308 19:34:45.398204 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" event={"ID":"07d4b007-1aec-465c-a5db-92efaa4defbe","Type":"ContainerStarted","Data":"6913dce9a030a733bc10cac09bffc5f9aa0b5fbf88ad608321a140780b687ab6"} Mar 08 19:34:45 crc kubenswrapper[4885]: I0308 19:34:45.398325 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" event={"ID":"07d4b007-1aec-465c-a5db-92efaa4defbe","Type":"ContainerStarted","Data":"819395f9bd1c9edfe8385fdc42ceda224dc8de4c2cbdc2b48a57031866536c6e"} Mar 08 19:34:45 crc kubenswrapper[4885]: I0308 19:34:45.418612 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-w5lms" podStartSLOduration=150.418583285 podStartE2EDuration="2m30.418583285s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:44.587821072 +0000 UTC m=+185.983875125" 
watchObservedRunningTime="2026-03-08 19:34:45.418583285 +0000 UTC m=+186.814637308" Mar 08 19:34:45 crc kubenswrapper[4885]: I0308 19:34:45.420329 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-s2klh" podStartSLOduration=150.4203144 podStartE2EDuration="2m30.4203144s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:34:45.418860142 +0000 UTC m=+186.814914215" watchObservedRunningTime="2026-03-08 19:34:45.4203144 +0000 UTC m=+186.816368453" Mar 08 19:34:46 crc kubenswrapper[4885]: I0308 19:34:46.367383 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:46 crc kubenswrapper[4885]: I0308 19:34:46.367423 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:46 crc kubenswrapper[4885]: E0308 19:34:46.367597 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:46 crc kubenswrapper[4885]: I0308 19:34:46.367671 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:46 crc kubenswrapper[4885]: E0308 19:34:46.367764 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:46 crc kubenswrapper[4885]: I0308 19:34:46.367816 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:46 crc kubenswrapper[4885]: E0308 19:34:46.367902 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:46 crc kubenswrapper[4885]: E0308 19:34:46.368476 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:47 crc kubenswrapper[4885]: I0308 19:34:47.369233 4885 scope.go:117] "RemoveContainer" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000" Mar 08 19:34:47 crc kubenswrapper[4885]: E0308 19:34:47.369692 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bssfh_openshift-ovn-kubernetes(dedec2a4-d864-4f30-8a2d-b3168817ea34)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" Mar 08 19:34:48 crc kubenswrapper[4885]: I0308 19:34:48.367763 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:48 crc kubenswrapper[4885]: I0308 19:34:48.367855 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:48 crc kubenswrapper[4885]: I0308 19:34:48.367856 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:48 crc kubenswrapper[4885]: I0308 19:34:48.367856 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:48 crc kubenswrapper[4885]: E0308 19:34:48.367990 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:48 crc kubenswrapper[4885]: E0308 19:34:48.368091 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:48 crc kubenswrapper[4885]: E0308 19:34:48.368201 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:48 crc kubenswrapper[4885]: E0308 19:34:48.368329 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:49 crc kubenswrapper[4885]: E0308 19:34:49.522375 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:34:50 crc kubenswrapper[4885]: I0308 19:34:50.367576 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:50 crc kubenswrapper[4885]: I0308 19:34:50.367602 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:50 crc kubenswrapper[4885]: E0308 19:34:50.368108 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:50 crc kubenswrapper[4885]: I0308 19:34:50.367651 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:50 crc kubenswrapper[4885]: I0308 19:34:50.367639 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:50 crc kubenswrapper[4885]: E0308 19:34:50.368403 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:50 crc kubenswrapper[4885]: E0308 19:34:50.368750 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:50 crc kubenswrapper[4885]: E0308 19:34:50.368630 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.367367 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.367447 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.367493 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:52 crc kubenswrapper[4885]: E0308 19:34:52.367572 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.367735 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:52 crc kubenswrapper[4885]: E0308 19:34:52.367711 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:52 crc kubenswrapper[4885]: E0308 19:34:52.367856 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:52 crc kubenswrapper[4885]: E0308 19:34:52.368062 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.427745 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/1.log" Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.428791 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/0.log" Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.428893 4885 generic.go:334] "Generic (PLEG): container finished" podID="9ac72c25-d3e6-4dda-8444-6cd4442af7e4" containerID="f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d" exitCode=1 Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.428996 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerDied","Data":"f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d"} Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.429079 4885 scope.go:117] "RemoveContainer" containerID="578618cc910fd0803e40cc9b60dd425893a7cd038b7ecad9a831bfde447eead6" Mar 08 19:34:52 crc kubenswrapper[4885]: I0308 19:34:52.429667 4885 scope.go:117] "RemoveContainer" containerID="f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d" Mar 08 19:34:52 crc kubenswrapper[4885]: E0308 19:34:52.429974 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-ff7b4_openshift-multus(9ac72c25-d3e6-4dda-8444-6cd4442af7e4)\"" pod="openshift-multus/multus-ff7b4" podUID="9ac72c25-d3e6-4dda-8444-6cd4442af7e4" Mar 08 19:34:53 crc kubenswrapper[4885]: I0308 19:34:53.433521 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/1.log" Mar 08 19:34:54 crc kubenswrapper[4885]: I0308 19:34:54.367555 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:54 crc kubenswrapper[4885]: I0308 19:34:54.367601 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:54 crc kubenswrapper[4885]: I0308 19:34:54.367570 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:54 crc kubenswrapper[4885]: I0308 19:34:54.367570 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:54 crc kubenswrapper[4885]: E0308 19:34:54.367724 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:54 crc kubenswrapper[4885]: E0308 19:34:54.367817 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:54 crc kubenswrapper[4885]: E0308 19:34:54.368013 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:54 crc kubenswrapper[4885]: E0308 19:34:54.368153 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:54 crc kubenswrapper[4885]: E0308 19:34:54.523775 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:34:56 crc kubenswrapper[4885]: I0308 19:34:56.367056 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:56 crc kubenswrapper[4885]: I0308 19:34:56.367128 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:56 crc kubenswrapper[4885]: I0308 19:34:56.367153 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:56 crc kubenswrapper[4885]: E0308 19:34:56.367262 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:56 crc kubenswrapper[4885]: E0308 19:34:56.367385 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:56 crc kubenswrapper[4885]: E0308 19:34:56.367466 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:56 crc kubenswrapper[4885]: I0308 19:34:56.367091 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:56 crc kubenswrapper[4885]: E0308 19:34:56.368821 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:58 crc kubenswrapper[4885]: I0308 19:34:58.367649 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:34:58 crc kubenswrapper[4885]: I0308 19:34:58.367700 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:34:58 crc kubenswrapper[4885]: I0308 19:34:58.367682 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:34:58 crc kubenswrapper[4885]: I0308 19:34:58.367653 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:34:58 crc kubenswrapper[4885]: E0308 19:34:58.367877 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:34:58 crc kubenswrapper[4885]: E0308 19:34:58.368072 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:34:58 crc kubenswrapper[4885]: E0308 19:34:58.368308 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:34:58 crc kubenswrapper[4885]: E0308 19:34:58.368405 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:34:59 crc kubenswrapper[4885]: E0308 19:34:59.524694 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:35:00 crc kubenswrapper[4885]: I0308 19:35:00.367053 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:35:00 crc kubenswrapper[4885]: I0308 19:35:00.367100 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:35:00 crc kubenswrapper[4885]: I0308 19:35:00.367095 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:35:00 crc kubenswrapper[4885]: E0308 19:35:00.367275 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:35:00 crc kubenswrapper[4885]: I0308 19:35:00.367305 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:35:00 crc kubenswrapper[4885]: E0308 19:35:00.367465 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:35:00 crc kubenswrapper[4885]: E0308 19:35:00.367627 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:35:00 crc kubenswrapper[4885]: E0308 19:35:00.367798 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:35:02 crc kubenswrapper[4885]: I0308 19:35:02.367505 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:35:02 crc kubenswrapper[4885]: I0308 19:35:02.367580 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:35:02 crc kubenswrapper[4885]: I0308 19:35:02.367541 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:35:02 crc kubenswrapper[4885]: I0308 19:35:02.367693 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:35:02 crc kubenswrapper[4885]: E0308 19:35:02.368037 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:35:02 crc kubenswrapper[4885]: E0308 19:35:02.368646 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:35:02 crc kubenswrapper[4885]: E0308 19:35:02.368770 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:35:02 crc kubenswrapper[4885]: E0308 19:35:02.368906 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:35:02 crc kubenswrapper[4885]: I0308 19:35:02.369473 4885 scope.go:117] "RemoveContainer" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000" Mar 08 19:35:03 crc kubenswrapper[4885]: I0308 19:35:03.478364 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/3.log" Mar 08 19:35:03 crc kubenswrapper[4885]: I0308 19:35:03.483741 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerStarted","Data":"9417467a958808ec76e7baaf3e912528258fa08f33b991a9a656c8f2699dfe08"} Mar 08 19:35:03 crc kubenswrapper[4885]: I0308 19:35:03.484799 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:35:03 crc kubenswrapper[4885]: I0308 19:35:03.570434 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podStartSLOduration=168.570395124 podStartE2EDuration="2m48.570395124s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:03.522614802 +0000 UTC m=+204.918668855" watchObservedRunningTime="2026-03-08 19:35:03.570395124 +0000 UTC m=+204.966449177" Mar 08 19:35:03 crc kubenswrapper[4885]: I0308 19:35:03.571765 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jps4r"] Mar 08 19:35:03 crc kubenswrapper[4885]: I0308 19:35:03.571962 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:35:03 crc kubenswrapper[4885]: E0308 19:35:03.572161 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:35:04 crc kubenswrapper[4885]: I0308 19:35:04.367712 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:35:04 crc kubenswrapper[4885]: I0308 19:35:04.367873 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:35:04 crc kubenswrapper[4885]: E0308 19:35:04.367907 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:35:04 crc kubenswrapper[4885]: I0308 19:35:04.368014 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:35:04 crc kubenswrapper[4885]: E0308 19:35:04.368183 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:35:04 crc kubenswrapper[4885]: E0308 19:35:04.368283 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:35:04 crc kubenswrapper[4885]: E0308 19:35:04.525890 4885 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:35:05 crc kubenswrapper[4885]: I0308 19:35:05.367589 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:35:05 crc kubenswrapper[4885]: E0308 19:35:05.367816 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:35:06 crc kubenswrapper[4885]: I0308 19:35:06.367996 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:35:06 crc kubenswrapper[4885]: I0308 19:35:06.368034 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:35:06 crc kubenswrapper[4885]: I0308 19:35:06.368006 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:35:06 crc kubenswrapper[4885]: E0308 19:35:06.368170 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:35:06 crc kubenswrapper[4885]: E0308 19:35:06.368277 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:35:06 crc kubenswrapper[4885]: E0308 19:35:06.368765 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:35:07 crc kubenswrapper[4885]: I0308 19:35:07.367296 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:35:07 crc kubenswrapper[4885]: I0308 19:35:07.367859 4885 scope.go:117] "RemoveContainer" containerID="f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d" Mar 08 19:35:07 crc kubenswrapper[4885]: E0308 19:35:07.367884 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:35:08 crc kubenswrapper[4885]: I0308 19:35:08.367803 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:35:08 crc kubenswrapper[4885]: I0308 19:35:08.367868 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:35:08 crc kubenswrapper[4885]: E0308 19:35:08.368518 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:35:08 crc kubenswrapper[4885]: I0308 19:35:08.368037 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:35:08 crc kubenswrapper[4885]: E0308 19:35:08.368739 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:35:08 crc kubenswrapper[4885]: E0308 19:35:08.369012 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:35:08 crc kubenswrapper[4885]: I0308 19:35:08.508544 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/1.log" Mar 08 19:35:08 crc kubenswrapper[4885]: I0308 19:35:08.508661 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerStarted","Data":"47b9aa6e943174d2f8819d017007c51f3809d8a8e2d7a64900f1aa71bf065584"} Mar 08 19:35:09 crc kubenswrapper[4885]: I0308 19:35:09.367538 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:35:09 crc kubenswrapper[4885]: E0308 19:35:09.370423 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.297814 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.297888 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.298002 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.298094 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298121 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298185 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:35:10 crc kubenswrapper[4885]: 
E0308 19:35:10.298205 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298211 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298273 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298288 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:37:12.298262533 +0000 UTC m=+333.694316596 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298397 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:37:12.298366025 +0000 UTC m=+333.694420088 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298430 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:37:12.298413187 +0000 UTC m=+333.694467400 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298454 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298473 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298489 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.298558 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:37:12.29853396 +0000 UTC m=+333.694587983 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.368098 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.368146 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.368684 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.372776 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.373379 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.373979 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.374323 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.399110 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.399345 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:37:12.399305181 +0000 UTC m=+333.795359234 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:10 crc kubenswrapper[4885]: I0308 19:35:10.399854 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.400116 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:35:10 crc kubenswrapper[4885]: E0308 19:35:10.400311 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:37:12.400294068 +0000 UTC m=+333.796348121 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 08 19:35:11 crc kubenswrapper[4885]: I0308 19:35:11.367837 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:35:11 crc kubenswrapper[4885]: I0308 19:35:11.370536 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 19:35:11 crc kubenswrapper[4885]: I0308 19:35:11.371132 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.798523 4885 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.849148 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6lcrf"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.849815 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.850408 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cpx85"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.851147 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.854476 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.855702 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.857996 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.858783 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.861592 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.863370 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.866421 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.867517 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.869553 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.867527 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.873605 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.874440 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.883144 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.884179 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.884246 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.884387 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.884461 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.884701 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.884875 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.885078 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.885259 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.885627 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.887895 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.889677 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.890182 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.890424 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.890644 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.890682 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.890708 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.891091 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.892349 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.893282 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.893404 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.894238 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q8q8m"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.894482 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.894784 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-hsdmw"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.895307 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.895400 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.895829 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.896451 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.907591 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.907841 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.908071 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.908432 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2tz9t"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.908531 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.908631 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.908727 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.908817 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.908908 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.909097 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.910407 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.910690 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.911253 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.911496 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.911905 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.912367 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.912470 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.912682 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.912904 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.913052 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.912995 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.913637 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.913975 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.914245 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.917792 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.919130 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.924412 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.924529 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.925064 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4xs78"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.925344 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-t2b7w"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.925701 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-t2b7w" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.926155 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.926816 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.934622 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.935382 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fq2fp"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.935749 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.936130 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.936487 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.936861 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.937177 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.937536 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.937789 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.937958 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.937994 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.944079 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.944229 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.944259 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.944388 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.944675 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.944799 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.944913 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.945044 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.945145 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.945245 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.945563 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.945787 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.945998 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.946114 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.946497 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.947503 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.947667 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.947818 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.947824 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.948261 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.948946 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.949399 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.949440 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.949528 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.949563 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.949664 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.949759 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.950572 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.950588 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.950736 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.953120 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.953847 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c7583a8-a980-4ab2-a594-bf55ec72c91c-serving-cert\") pod \"controller-manager-879f6c89f-6lcrf\" 
(UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.953906 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwcst\" (UniqueName: \"kubernetes.io/projected/175c50f5-857d-4697-bcde-2ce47f2edfc5-kube-api-access-gwcst\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.953947 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-client-ca\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.953974 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/175c50f5-857d-4697-bcde-2ce47f2edfc5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.954005 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.954046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d98wm\" (UniqueName: \"kubernetes.io/projected/4c7583a8-a980-4ab2-a594-bf55ec72c91c-kube-api-access-d98wm\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.954068 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175c50f5-857d-4697-bcde-2ce47f2edfc5-config\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.954104 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-config\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.954137 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/175c50f5-857d-4697-bcde-2ce47f2edfc5-images\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.961580 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-lvfcn"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.969056 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.970157 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.975830 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.976354 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q9q8c"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.980227 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.982530 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.983184 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.983732 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.984251 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.985100 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.985685 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.986459 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.986689 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm"] Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.990293 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.991575 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.991767 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.991797 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.991877 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.992051 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.992388 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.992591 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 08 19:35:14 crc kubenswrapper[4885]: I0308 19:35:14.993165 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.000049 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.002984 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.004730 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2bp7t"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.010375 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.010584 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.010942 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.011266 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.011466 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bzvjp"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.012020 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.012241 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.012448 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.016114 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.017178 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.019385 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.019460 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hqtdl"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.020424 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.034248 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.035577 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.035633 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.036208 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.036579 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.036712 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.036847 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.037118 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.037156 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.037337 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.037603 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6lcrf"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.037619 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.037662 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.038102 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.038179 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.041853 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5ghwr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.044788 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.045229 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.045689 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.051996 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ldvgz"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.072774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.072836 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbpxs\" (UniqueName: \"kubernetes.io/projected/96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1-kube-api-access-gbpxs\") pod \"cluster-samples-operator-665b6dd947-wbbxm\" (UID: \"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.072864 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.072886 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-service-ca-bundle\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.072907 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-client-ca\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 
19:35:15.072943 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-447lv\" (UniqueName: \"kubernetes.io/projected/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-kube-api-access-447lv\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.072968 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9878f6fd-fc1f-4980-a687-84478d0b92c1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.072987 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkm6d\" (UniqueName: \"kubernetes.io/projected/9878f6fd-fc1f-4980-a687-84478d0b92c1-kube-api-access-pkm6d\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073004 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b4zh\" (UniqueName: \"kubernetes.io/projected/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-kube-api-access-2b4zh\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073038 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-trusted-ca-bundle\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073061 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-etcd-client\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073080 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-config\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75c67b6f-14bc-4d96-a6b6-ae020ace5353-config\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073126 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d98wm\" (UniqueName: \"kubernetes.io/projected/4c7583a8-a980-4ab2-a594-bf55ec72c91c-kube-api-access-d98wm\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073148 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f5b425d2-db8e-45f3-a141-8ac7bd678491-node-pullsecrets\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073169 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175c50f5-857d-4697-bcde-2ce47f2edfc5-config\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073189 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-serving-cert\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073210 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5jp7\" (UniqueName: \"kubernetes.io/projected/75c67b6f-14bc-4d96-a6b6-ae020ace5353-kube-api-access-l5jp7\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073235 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrkrt\" (UniqueName: \"kubernetes.io/projected/5a244e04-1aec-4355-89c5-794667b5969f-kube-api-access-jrkrt\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073255 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-config\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073276 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wbbxm\" (UID: \"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073297 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-trusted-ca\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073321 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-config\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073342 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5a244e04-1aec-4355-89c5-794667b5969f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073372 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/175c50f5-857d-4697-bcde-2ce47f2edfc5-images\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073401 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-oauth-config\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073425 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kc2g\" (UniqueName: \"kubernetes.io/projected/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-kube-api-access-8kc2g\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073446 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwcst\" (UniqueName: \"kubernetes.io/projected/175c50f5-857d-4697-bcde-2ce47f2edfc5-kube-api-access-gwcst\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073468 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073484 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df78e76d-5024-4d31-a0b9-17d0d6c6c258-serving-cert\") pod 
\"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073504 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc92s\" (UniqueName: \"kubernetes.io/projected/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-kube-api-access-hc92s\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073523 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-config\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073543 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8rh9\" (UniqueName: \"kubernetes.io/projected/df78e76d-5024-4d31-a0b9-17d0d6c6c258-kube-api-access-w8rh9\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073574 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-etcd-client\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073593 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-serving-cert\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073615 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073631 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvxds\" (UniqueName: \"kubernetes.io/projected/f5b425d2-db8e-45f3-a141-8ac7bd678491-kube-api-access-xvxds\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073651 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-encryption-config\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073671 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a244e04-1aec-4355-89c5-794667b5969f-serving-cert\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073691 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f5b425d2-db8e-45f3-a141-8ac7bd678491-audit-dir\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073708 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073729 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-oauth-serving-cert\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073750 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/75c67b6f-14bc-4d96-a6b6-ae020ace5353-machine-approver-tls\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073769 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-trusted-ca\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073788 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-serving-cert\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073809 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-audit-policies\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073833 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxk9p\" (UniqueName: \"kubernetes.io/projected/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-kube-api-access-lxk9p\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073852 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-serving-cert\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073873 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9878f6fd-fc1f-4980-a687-84478d0b92c1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073890 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-image-import-ca\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073910 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf8hq\" (UniqueName: \"kubernetes.io/projected/5ac2fbf9-c9bb-4ef8-988f-4407e688ad54-kube-api-access-xf8hq\") pod \"downloads-7954f5f757-t2b7w\" (UID: \"5ac2fbf9-c9bb-4ef8-988f-4407e688ad54\") " pod="openshift-console/downloads-7954f5f757-t2b7w" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073945 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-encryption-config\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073963 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-config\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.073992 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074012 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-serving-cert\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074030 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c7583a8-a980-4ab2-a594-bf55ec72c91c-serving-cert\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-audit\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074062 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-etcd-serving-ca\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074083 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-audit-dir\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074119 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-config\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074135 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-service-ca\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074153 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75c67b6f-14bc-4d96-a6b6-ae020ace5353-auth-proxy-config\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074183 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-client-ca\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074202 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/175c50f5-857d-4697-bcde-2ce47f2edfc5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.074222 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-metrics-tls\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.075739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.076524 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175c50f5-857d-4697-bcde-2ce47f2edfc5-config\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.077465 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-config\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.078249 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.078638 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.079288 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.080294 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-client-ca\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.080414 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.081234 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.082969 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/175c50f5-857d-4697-bcde-2ce47f2edfc5-images\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.096351 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.096973 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/175c50f5-857d-4697-bcde-2ce47f2edfc5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.102283 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.104061 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.104132 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549974-jjqkh"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106583 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106611 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q8q8m"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106624 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106635 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hsdmw"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106645 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2tz9t"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106656 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106667 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.106761 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549974-jjqkh" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.107405 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-s5pkw"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.108237 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.108984 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.110312 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q9q8c"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.111276 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.114268 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bzvjp"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.114341 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.117079 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.117108 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fq2fp"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.117329 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.118349 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c7583a8-a980-4ab2-a594-bf55ec72c91c-serving-cert\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.118379 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tw9z2"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.119819 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-bl88k"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.119959 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.120545 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.120906 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.122141 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s5pkw"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.123262 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hqtdl"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.124567 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.125554 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.126813 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.127835 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.129086 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.130378 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.131512 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cpx85"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.132498 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.132673 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-t2b7w"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.134035 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.135111 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.136560 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4xs78"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.137692 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2bp7t"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.138752 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tw9z2"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.139769 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ldvgz"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 
19:35:15.140784 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549974-jjqkh"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.141755 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.142866 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.144230 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.145489 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5ghwr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.146576 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.147529 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jbwsr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.148151 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jbwsr" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.148529 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jbwsr"] Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.153073 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.173132 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174689 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df78e76d-5024-4d31-a0b9-17d0d6c6c258-serving-cert\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174727 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174776 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc92s\" (UniqueName: \"kubernetes.io/projected/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-kube-api-access-hc92s\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174799 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8rh9\" (UniqueName: \"kubernetes.io/projected/df78e76d-5024-4d31-a0b9-17d0d6c6c258-kube-api-access-w8rh9\") pod 
\"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174821 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-config\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174839 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-etcd-client\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174857 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-serving-cert\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174886 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174908 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvxds\" (UniqueName: \"kubernetes.io/projected/f5b425d2-db8e-45f3-a141-8ac7bd678491-kube-api-access-xvxds\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174940 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-encryption-config\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174965 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a244e04-1aec-4355-89c5-794667b5969f-serving-cert\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174982 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f5b425d2-db8e-45f3-a141-8ac7bd678491-audit-dir\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.174998 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175017 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-oauth-serving-cert\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175038 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/75c67b6f-14bc-4d96-a6b6-ae020ace5353-machine-approver-tls\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175054 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-trusted-ca\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175074 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-serving-cert\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175094 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-audit-policies\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175112 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-serving-cert\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175131 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxk9p\" (UniqueName: \"kubernetes.io/projected/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-kube-api-access-lxk9p\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175149 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-image-import-ca\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " 
pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175170 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9878f6fd-fc1f-4980-a687-84478d0b92c1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175188 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf8hq\" (UniqueName: \"kubernetes.io/projected/5ac2fbf9-c9bb-4ef8-988f-4407e688ad54-kube-api-access-xf8hq\") pod \"downloads-7954f5f757-t2b7w\" (UID: \"5ac2fbf9-c9bb-4ef8-988f-4407e688ad54\") " pod="openshift-console/downloads-7954f5f757-t2b7w" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175216 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-encryption-config\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175242 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-config\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175260 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175461 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-serving-cert\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175485 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-audit\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175508 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-etcd-serving-ca\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175525 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-audit-dir\") pod \"apiserver-7bbb656c7d-spqp8\" 
(UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175542 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-config\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175558 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-service-ca\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175574 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75c67b6f-14bc-4d96-a6b6-ae020ace5353-auth-proxy-config\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175596 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-metrics-tls\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175615 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175633 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbpxs\" (UniqueName: \"kubernetes.io/projected/96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1-kube-api-access-gbpxs\") pod \"cluster-samples-operator-665b6dd947-wbbxm\" (UID: \"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175652 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-service-ca-bundle\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175668 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-client-ca\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175685 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-447lv\" (UniqueName: \"kubernetes.io/projected/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-kube-api-access-447lv\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175701 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9878f6fd-fc1f-4980-a687-84478d0b92c1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175718 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkm6d\" (UniqueName: \"kubernetes.io/projected/9878f6fd-fc1f-4980-a687-84478d0b92c1-kube-api-access-pkm6d\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175740 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b4zh\" (UniqueName: \"kubernetes.io/projected/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-kube-api-access-2b4zh\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175765 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75c67b6f-14bc-4d96-a6b6-ae020ace5353-config\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175784 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-trusted-ca-bundle\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-etcd-client\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175826 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-config\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175851 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f5b425d2-db8e-45f3-a141-8ac7bd678491-node-pullsecrets\") pod 
\"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175869 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-serving-cert\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175888 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5jp7\" (UniqueName: \"kubernetes.io/projected/75c67b6f-14bc-4d96-a6b6-ae020ace5353-kube-api-access-l5jp7\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175910 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrkrt\" (UniqueName: \"kubernetes.io/projected/5a244e04-1aec-4355-89c5-794667b5969f-kube-api-access-jrkrt\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175946 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-trusted-ca\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175966 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-config\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175991 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5a244e04-1aec-4355-89c5-794667b5969f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.176021 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wbbxm\" (UID: \"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.176061 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-oauth-config\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 
19:35:15.176082 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kc2g\" (UniqueName: \"kubernetes.io/projected/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-kube-api-access-8kc2g\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.176617 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.176960 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/75c67b6f-14bc-4d96-a6b6-ae020ace5353-auth-proxy-config\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.177405 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-config\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.178195 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.178226 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-audit-policies\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.178293 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f5b425d2-db8e-45f3-a141-8ac7bd678491-node-pullsecrets\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.178549 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-trusted-ca\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.178648 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-audit\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" 
Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.178665 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-etcd-client\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.179085 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-config\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.179179 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-etcd-serving-ca\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.179238 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-audit-dir\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.179670 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-config\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.175991 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-trusted-ca-bundle\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.179818 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-config\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.180794 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-service-ca\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.180804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/5a244e04-1aec-4355-89c5-794667b5969f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.180852 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-metrics-tls\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.181178 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df78e76d-5024-4d31-a0b9-17d0d6c6c258-serving-cert\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.181207 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f5b425d2-db8e-45f3-a141-8ac7bd678491-audit-dir\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.182396 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/75c67b6f-14bc-4d96-a6b6-ae020ace5353-machine-approver-tls\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.182465 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df78e76d-5024-4d31-a0b9-17d0d6c6c258-service-ca-bundle\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.182544 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-config\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.182686 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-oauth-serving-cert\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.182868 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75c67b6f-14bc-4d96-a6b6-ae020ace5353-config\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.183075 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-serving-cert\") 
pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.183391 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-trusted-ca-bundle\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.183565 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-client-ca\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.183758 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.183981 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-serving-cert\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.183994 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-encryption-config\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.184271 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-encryption-config\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.184682 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f5b425d2-db8e-45f3-a141-8ac7bd678491-image-import-ca\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.184815 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-serving-cert\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.185269 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-serving-cert\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.192388 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-serving-cert\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.192823 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f5b425d2-db8e-45f3-a141-8ac7bd678491-etcd-client\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.193734 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-wbbxm\" (UID: \"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.194432 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9878f6fd-fc1f-4980-a687-84478d0b92c1-config\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.196147 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.198016 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-oauth-config\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.198175 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-trusted-ca\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.198378 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5a244e04-1aec-4355-89c5-794667b5969f-serving-cert\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.202267 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9878f6fd-fc1f-4980-a687-84478d0b92c1-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: 
\"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.212569 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.232718 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.253381 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.292673 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.313227 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.332698 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.353717 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.374138 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.393115 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.416667 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.432610 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.451987 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.474117 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.492481 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.525510 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.533046 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.552905 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.572975 4885 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.602249 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.612830 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.632857 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.653491 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.672997 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.694310 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.712382 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.733812 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.754449 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.773890 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.793682 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.813293 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.833436 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.853346 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.873359 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.913210 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.933127 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.954655 4885 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.972996 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.986991 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce8edea-f754-4ee2-a475-2022f99ed7f9-config\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987033 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987092 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987136 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb554\" (UniqueName: \"kubernetes.io/projected/f2a6bad6-cd1e-4e38-88fe-d531ea458683-kube-api-access-jb554\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987158 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a6bad6-cd1e-4e38-88fe-d531ea458683-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987186 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2a6bad6-cd1e-4e38-88fe-d531ea458683-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987217 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-certificates\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987236 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfrhm\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-kube-api-access-mfrhm\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: E0308 19:35:15.987489 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.487478732 +0000 UTC m=+217.883532755 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987479 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pld28\" (UniqueName: \"kubernetes.io/projected/4928f728-c20b-4d8e-83f3-786cf90cf3e6-kube-api-access-pld28\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987555 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987572 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ce8edea-f754-4ee2-a475-2022f99ed7f9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987590 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4928f728-c20b-4d8e-83f3-786cf90cf3e6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987610 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-tls\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987631 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4928f728-c20b-4d8e-83f3-786cf90cf3e6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987660 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-trusted-ca\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987693 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-bound-sa-token\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987714 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4928f728-c20b-4d8e-83f3-786cf90cf3e6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.987735 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ce8edea-f754-4ee2-a475-2022f99ed7f9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:15 crc kubenswrapper[4885]: I0308 19:35:15.994968 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.013629 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.034316 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.051610 4885 request.go:700] Waited for 1.014463904s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/secrets?fieldSelector=metadata.name%3Dkube-scheduler-operator-serving-cert&limit=500&resourceVersion=0 Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.053972 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.075100 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 19:35:16 crc 
kubenswrapper[4885]: I0308 19:35:16.088817 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.089063 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.589019804 +0000 UTC m=+217.985073867 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089137 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089184 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089272 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbfbc2e1-ae98-4c40-a739-877e7296f16a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zvl2r\" (UID: \"fbfbc2e1-ae98-4c40-a739-877e7296f16a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089305 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089337 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f932056-01e3-43aa-a91a-7f33d20445ba-config-volume\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 
19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089415 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089466 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089514 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-tls\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089559 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4928f728-c20b-4d8e-83f3-786cf90cf3e6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089598 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-mountpoint-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089664 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjdvb\" (UniqueName: \"kubernetes.io/projected/d008be41-8eac-496a-9c3d-083014dc402c-kube-api-access-sjdvb\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.089832 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-srv-cert\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090016 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-trusted-ca\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090114 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-92jf4\" (UniqueName: \"kubernetes.io/projected/89561acc-f596-4f61-95b9-0cbc686a0b47-kube-api-access-92jf4\") pod \"migrator-59844c95c7-qnq6k\" (UID: \"89561acc-f596-4f61-95b9-0cbc686a0b47\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090271 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090461 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqgg4\" (UniqueName: \"kubernetes.io/projected/fbfbc2e1-ae98-4c40-a739-877e7296f16a-kube-api-access-lqgg4\") pod \"package-server-manager-789f6589d5-zvl2r\" (UID: \"fbfbc2e1-ae98-4c40-a739-877e7296f16a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090538 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fc52227b-0572-4fed-a5c1-e86521a20e58-signing-key\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090697 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-bound-sa-token\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090785 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-client\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090864 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/753974fb-c7b2-4e2b-a62d-22544f357c9b-profile-collector-cert\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.090957 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrsm5\" (UniqueName: \"kubernetes.io/projected/790c2bc5-e8b1-4943-affd-360042eb1a79-kube-api-access-rrsm5\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.091047 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7ce8edea-f754-4ee2-a475-2022f99ed7f9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.091101 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-socket-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.091160 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp85f\" (UniqueName: \"kubernetes.io/projected/6f932056-01e3-43aa-a91a-7f33d20445ba-kube-api-access-pp85f\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.091212 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/494bb437-45dd-48e3-b932-9c3645e493ef-tmpfs\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.092148 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce8edea-f754-4ee2-a475-2022f99ed7f9-config\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.092426 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a554818d-91a7-48e1-a5a7-5808a5240f3e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.092651 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.092823 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtctx\" (UniqueName: \"kubernetes.io/projected/60da1edb-8474-4368-a6ae-0bb2b1b7b845-kube-api-access-qtctx\") pod \"auto-csr-approver-29549974-jjqkh\" (UID: \"60da1edb-8474-4368-a6ae-0bb2b1b7b845\") " pod="openshift-infra/auto-csr-approver-29549974-jjqkh" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.093160 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/494bb437-45dd-48e3-b932-9c3645e493ef-webhook-cert\") pod 
\"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.093192 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.593166465 +0000 UTC m=+217.989220528 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.093264 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-trusted-ca\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.093623 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v24rf\" (UniqueName: \"kubernetes.io/projected/3c8bd61f-4965-4410-9ec7-b858a4529287-kube-api-access-v24rf\") pod \"multus-admission-controller-857f4d67dd-hqtdl\" (UID: \"3c8bd61f-4965-4410-9ec7-b858a4529287\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.093881 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f932056-01e3-43aa-a91a-7f33d20445ba-metrics-tls\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094100 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-config\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.093984 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.093614 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ce8edea-f754-4ee2-a475-2022f99ed7f9-config\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094259 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q64h9\" (UniqueName: \"kubernetes.io/projected/ca80eb80-6964-436a-bf66-0c5fe9b7e641-kube-api-access-q64h9\") pod \"machine-config-server-bl88k\" (UID: 
\"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094444 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rvsd\" (UniqueName: \"kubernetes.io/projected/0a7420ef-f20d-4d48-a619-627327de2063-kube-api-access-9rvsd\") pod \"ingress-canary-jbwsr\" (UID: \"0a7420ef-f20d-4d48-a619-627327de2063\") " pod="openshift-ingress-canary/ingress-canary-jbwsr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094494 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094531 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ca80eb80-6964-436a-bf66-0c5fe9b7e641-certs\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094538 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ce8edea-f754-4ee2-a475-2022f99ed7f9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094606 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094644 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ca246d9-b15a-4163-87dc-84b8bc916c4d-proxy-tls\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094679 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a554818d-91a7-48e1-a5a7-5808a5240f3e-config\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094718 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6drnf\" (UniqueName: \"kubernetes.io/projected/753974fb-c7b2-4e2b-a62d-22544f357c9b-kube-api-access-6drnf\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: 
\"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094750 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495803ea-175c-4ad0-ac77-0598ce8213c1-metrics-tls\") pod \"dns-operator-744455d44c-bzvjp\" (UID: \"495803ea-175c-4ad0-ac77-0598ce8213c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094805 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5664b98a-83b1-433d-8449-04a982f77fff-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094838 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6ca246d9-b15a-4163-87dc-84b8bc916c4d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094899 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a6bad6-cd1e-4e38-88fe-d531ea458683-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.094983 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.095778 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.097308 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.095899 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-tls\") pod 
\"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.096031 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2a6bad6-cd1e-4e38-88fe-d531ea458683-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.097409 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-csi-data-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.097591 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a93ee425-a2b2-492c-bafc-2443d2fde2d4-serving-cert\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.097636 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s7wt\" (UniqueName: \"kubernetes.io/projected/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-kube-api-access-4s7wt\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.097720 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-certificates\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098007 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfrhm\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-kube-api-access-mfrhm\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098230 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqf5f\" (UniqueName: \"kubernetes.io/projected/83de4c2d-767a-4635-8748-486dd45683a1-kube-api-access-pqf5f\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098429 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-config\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098520 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ca80eb80-6964-436a-bf66-0c5fe9b7e641-node-bootstrap-token\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098603 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwqcf\" (UniqueName: \"kubernetes.io/projected/494bb437-45dd-48e3-b932-9c3645e493ef-kube-api-access-fwqcf\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098688 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pld28\" (UniqueName: \"kubernetes.io/projected/4928f728-c20b-4d8e-83f3-786cf90cf3e6-kube-api-access-pld28\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098795 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-metrics-certs\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098877 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngwcn\" (UniqueName: \"kubernetes.io/projected/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-kube-api-access-ngwcn\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.098957 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1329795d-a8f9-4896-adba-23c2c0da9261-config-volume\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.099054 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jplnv\" (UniqueName: \"kubernetes.io/projected/a93ee425-a2b2-492c-bafc-2443d2fde2d4-kube-api-access-jplnv\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.099168 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: 
\"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.099244 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.099673 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101088 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-certificates\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101439 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ce8edea-f754-4ee2-a475-2022f99ed7f9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101724 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a7420ef-f20d-4d48-a619-627327de2063-cert\") pod \"ingress-canary-jbwsr\" (UID: \"0a7420ef-f20d-4d48-a619-627327de2063\") " pod="openshift-ingress-canary/ingress-canary-jbwsr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101763 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4928f728-c20b-4d8e-83f3-786cf90cf3e6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101802 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngzsj\" (UniqueName: \"kubernetes.io/projected/e58e5e9a-de88-4209-8100-e9d4e415e68d-kube-api-access-ngzsj\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101838 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101873 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-registration-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.101998 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/790c2bc5-e8b1-4943-affd-360042eb1a79-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.102061 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-default-certificate\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.102114 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5664b98a-83b1-433d-8449-04a982f77fff-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.102152 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-stats-auth\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.102186 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-plugins-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.102726 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4928f728-c20b-4d8e-83f3-786cf90cf3e6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.102970 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jjd6\" (UniqueName: \"kubernetes.io/projected/fc52227b-0572-4fed-a5c1-e86521a20e58-kube-api-access-9jjd6\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.103026 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.103160 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-service-ca\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.103433 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb2k8\" (UniqueName: \"kubernetes.io/projected/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-kube-api-access-hb2k8\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.103623 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np254\" (UniqueName: \"kubernetes.io/projected/fe3a8c81-8c1d-4b38-9cae-813fb749fd43-kube-api-access-np254\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rwt\" (UID: \"fe3a8c81-8c1d-4b38-9cae-813fb749fd43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.103879 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-audit-policies\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104074 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d008be41-8eac-496a-9c3d-083014dc402c-audit-dir\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104134 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/790c2bc5-e8b1-4943-affd-360042eb1a79-images\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104361 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rjfm\" (UniqueName: \"kubernetes.io/projected/495803ea-175c-4ad0-ac77-0598ce8213c1-kube-api-access-7rjfm\") pod \"dns-operator-744455d44c-bzvjp\" (UID: 
\"495803ea-175c-4ad0-ac77-0598ce8213c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104603 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c8bd61f-4965-4410-9ec7-b858a4529287-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hqtdl\" (UID: \"3c8bd61f-4965-4410-9ec7-b858a4529287\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-serving-cert\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104808 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fc52227b-0572-4fed-a5c1-e86521a20e58-signing-cabundle\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104846 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/494bb437-45dd-48e3-b932-9c3645e493ef-apiservice-cert\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104888 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.104950 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105006 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g597v\" (UniqueName: \"kubernetes.io/projected/6ca246d9-b15a-4163-87dc-84b8bc916c4d-kube-api-access-g597v\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105063 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-ca\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105136 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb554\" (UniqueName: \"kubernetes.io/projected/f2a6bad6-cd1e-4e38-88fe-d531ea458683-kube-api-access-jb554\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105173 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105282 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/790c2bc5-e8b1-4943-affd-360042eb1a79-proxy-tls\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105531 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1329795d-a8f9-4896-adba-23c2c0da9261-secret-volume\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105590 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a554818d-91a7-48e1-a5a7-5808a5240f3e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105761 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e58e5e9a-de88-4209-8100-e9d4e415e68d-service-ca-bundle\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105845 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5msmd\" (UniqueName: \"kubernetes.io/projected/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-kube-api-access-5msmd\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.105912 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4fv6\" (UniqueName: \"kubernetes.io/projected/1329795d-a8f9-4896-adba-23c2c0da9261-kube-api-access-k4fv6\") pod \"collect-profiles-29549970-lfczv\" (UID: 
\"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.106038 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2a6bad6-cd1e-4e38-88fe-d531ea458683-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.106094 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5664b98a-83b1-433d-8449-04a982f77fff-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.106137 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/753974fb-c7b2-4e2b-a62d-22544f357c9b-srv-cert\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.106191 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe3a8c81-8c1d-4b38-9cae-813fb749fd43-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rwt\" (UID: \"fe3a8c81-8c1d-4b38-9cae-813fb749fd43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.106368 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4928f728-c20b-4d8e-83f3-786cf90cf3e6-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.107068 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4928f728-c20b-4d8e-83f3-786cf90cf3e6-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.110721 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2a6bad6-cd1e-4e38-88fe-d531ea458683-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.111698 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.114337 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.134543 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.153006 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.174180 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.193636 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207065 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207243 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jplnv\" (UniqueName: \"kubernetes.io/projected/a93ee425-a2b2-492c-bafc-2443d2fde2d4-kube-api-access-jplnv\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207281 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.207324 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.707286285 +0000 UTC m=+218.103340348 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207425 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a7420ef-f20d-4d48-a619-627327de2063-cert\") pod \"ingress-canary-jbwsr\" (UID: \"0a7420ef-f20d-4d48-a619-627327de2063\") " pod="openshift-ingress-canary/ingress-canary-jbwsr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207477 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207518 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207556 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngzsj\" (UniqueName: \"kubernetes.io/projected/e58e5e9a-de88-4209-8100-e9d4e415e68d-kube-api-access-ngzsj\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207593 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-registration-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207630 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207677 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/790c2bc5-e8b1-4943-affd-360042eb1a79-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207717 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-default-certificate\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207751 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5664b98a-83b1-433d-8449-04a982f77fff-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207786 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-stats-auth\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207823 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jjd6\" (UniqueName: \"kubernetes.io/projected/fc52227b-0572-4fed-a5c1-e86521a20e58-kube-api-access-9jjd6\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207857 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-plugins-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207955 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.207996 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np254\" (UniqueName: \"kubernetes.io/projected/fe3a8c81-8c1d-4b38-9cae-813fb749fd43-kube-api-access-np254\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rwt\" (UID: \"fe3a8c81-8c1d-4b38-9cae-813fb749fd43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208056 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-service-ca\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208095 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb2k8\" (UniqueName: \"kubernetes.io/projected/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-kube-api-access-hb2k8\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208158 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-audit-policies\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208204 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d008be41-8eac-496a-9c3d-083014dc402c-audit-dir\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208269 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/790c2bc5-e8b1-4943-affd-360042eb1a79-images\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208305 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rjfm\" (UniqueName: \"kubernetes.io/projected/495803ea-175c-4ad0-ac77-0598ce8213c1-kube-api-access-7rjfm\") pod \"dns-operator-744455d44c-bzvjp\" (UID: \"495803ea-175c-4ad0-ac77-0598ce8213c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208374 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c8bd61f-4965-4410-9ec7-b858a4529287-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-hqtdl\" (UID: \"3c8bd61f-4965-4410-9ec7-b858a4529287\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208440 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fc52227b-0572-4fed-a5c1-e86521a20e58-signing-cabundle\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208474 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/494bb437-45dd-48e3-b932-9c3645e493ef-apiservice-cert\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208510 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-serving-cert\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:16 crc 
kubenswrapper[4885]: I0308 19:35:16.208548 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208583 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g597v\" (UniqueName: \"kubernetes.io/projected/6ca246d9-b15a-4163-87dc-84b8bc916c4d-kube-api-access-g597v\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208628 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-ca\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208695 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208752 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a554818d-91a7-48e1-a5a7-5808a5240f3e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208788 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/790c2bc5-e8b1-4943-affd-360042eb1a79-proxy-tls\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209185 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209569 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-registration-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209690 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1329795d-a8f9-4896-adba-23c2c0da9261-secret-volume\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209753 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4fv6\" (UniqueName: \"kubernetes.io/projected/1329795d-a8f9-4896-adba-23c2c0da9261-kube-api-access-k4fv6\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209794 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e58e5e9a-de88-4209-8100-e9d4e415e68d-service-ca-bundle\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209834 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5msmd\" (UniqueName: \"kubernetes.io/projected/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-kube-api-access-5msmd\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209873 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5664b98a-83b1-433d-8449-04a982f77fff-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209907 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/753974fb-c7b2-4e2b-a62d-22544f357c9b-srv-cert\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.209980 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe3a8c81-8c1d-4b38-9cae-813fb749fd43-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rwt\" (UID: \"fe3a8c81-8c1d-4b38-9cae-813fb749fd43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210031 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210066 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210122 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbfbc2e1-ae98-4c40-a739-877e7296f16a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zvl2r\" (UID: \"fbfbc2e1-ae98-4c40-a739-877e7296f16a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210160 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210195 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210231 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f932056-01e3-43aa-a91a-7f33d20445ba-config-volume\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210269 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-mountpoint-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210333 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjdvb\" (UniqueName: \"kubernetes.io/projected/d008be41-8eac-496a-9c3d-083014dc402c-kube-api-access-sjdvb\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210377 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-srv-cert\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210414 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92jf4\" (UniqueName: \"kubernetes.io/projected/89561acc-f596-4f61-95b9-0cbc686a0b47-kube-api-access-92jf4\") pod \"migrator-59844c95c7-qnq6k\" (UID: 
\"89561acc-f596-4f61-95b9-0cbc686a0b47\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210452 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqgg4\" (UniqueName: \"kubernetes.io/projected/fbfbc2e1-ae98-4c40-a739-877e7296f16a-kube-api-access-lqgg4\") pod \"package-server-manager-789f6589d5-zvl2r\" (UID: \"fbfbc2e1-ae98-4c40-a739-877e7296f16a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210514 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fc52227b-0572-4fed-a5c1-e86521a20e58-signing-key\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210552 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-client\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210602 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/753974fb-c7b2-4e2b-a62d-22544f357c9b-profile-collector-cert\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210657 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-socket-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210696 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrsm5\" (UniqueName: \"kubernetes.io/projected/790c2bc5-e8b1-4943-affd-360042eb1a79-kube-api-access-rrsm5\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210735 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp85f\" (UniqueName: \"kubernetes.io/projected/6f932056-01e3-43aa-a91a-7f33d20445ba-kube-api-access-pp85f\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210770 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/494bb437-45dd-48e3-b932-9c3645e493ef-tmpfs\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210806 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/a554818d-91a7-48e1-a5a7-5808a5240f3e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210844 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtctx\" (UniqueName: \"kubernetes.io/projected/60da1edb-8474-4368-a6ae-0bb2b1b7b845-kube-api-access-qtctx\") pod \"auto-csr-approver-29549974-jjqkh\" (UID: \"60da1edb-8474-4368-a6ae-0bb2b1b7b845\") " pod="openshift-infra/auto-csr-approver-29549974-jjqkh" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210898 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.210965 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v24rf\" (UniqueName: \"kubernetes.io/projected/3c8bd61f-4965-4410-9ec7-b858a4529287-kube-api-access-v24rf\") pod \"multus-admission-controller-857f4d67dd-hqtdl\" (UID: \"3c8bd61f-4965-4410-9ec7-b858a4529287\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211006 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/494bb437-45dd-48e3-b932-9c3645e493ef-webhook-cert\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211024 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211056 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f932056-01e3-43aa-a91a-7f33d20445ba-metrics-tls\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211039 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.208628 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/790c2bc5-e8b1-4943-affd-360042eb1a79-auth-proxy-config\") pod 
\"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211135 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e58e5e9a-de88-4209-8100-e9d4e415e68d-service-ca-bundle\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211211 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-plugins-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211106 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-config\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211289 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q64h9\" (UniqueName: \"kubernetes.io/projected/ca80eb80-6964-436a-bf66-0c5fe9b7e641-kube-api-access-q64h9\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211318 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ca80eb80-6964-436a-bf66-0c5fe9b7e641-certs\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211355 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rvsd\" (UniqueName: \"kubernetes.io/projected/0a7420ef-f20d-4d48-a619-627327de2063-kube-api-access-9rvsd\") pod \"ingress-canary-jbwsr\" (UID: \"0a7420ef-f20d-4d48-a619-627327de2063\") " pod="openshift-ingress-canary/ingress-canary-jbwsr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211391 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211420 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211445 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ca246d9-b15a-4163-87dc-84b8bc916c4d-proxy-tls\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211467 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a554818d-91a7-48e1-a5a7-5808a5240f3e-config\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5664b98a-83b1-433d-8449-04a982f77fff-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211509 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6drnf\" (UniqueName: \"kubernetes.io/projected/753974fb-c7b2-4e2b-a62d-22544f357c9b-kube-api-access-6drnf\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211530 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495803ea-175c-4ad0-ac77-0598ce8213c1-metrics-tls\") pod \"dns-operator-744455d44c-bzvjp\" (UID: \"495803ea-175c-4ad0-ac77-0598ce8213c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.211557 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6ca246d9-b15a-4163-87dc-84b8bc916c4d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.212264 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-ca\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.212300 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.212464 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-config\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.212840 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-service-ca\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.212965 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d008be41-8eac-496a-9c3d-083014dc402c-audit-dir\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.213128 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-mountpoint-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.213405 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-socket-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.213879 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1329795d-a8f9-4896-adba-23c2c0da9261-secret-volume\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214089 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214160 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.214096 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.714070726 +0000 UTC m=+218.110124789 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214422 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-csi-data-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214494 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a93ee425-a2b2-492c-bafc-2443d2fde2d4-serving-cert\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214547 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214598 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214675 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqf5f\" (UniqueName: \"kubernetes.io/projected/83de4c2d-767a-4635-8748-486dd45683a1-kube-api-access-pqf5f\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214716 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s7wt\" (UniqueName: \"kubernetes.io/projected/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-kube-api-access-4s7wt\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214750 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-config\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214789 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fwqcf\" (UniqueName: \"kubernetes.io/projected/494bb437-45dd-48e3-b932-9c3645e493ef-kube-api-access-fwqcf\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214822 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ca80eb80-6964-436a-bf66-0c5fe9b7e641-node-bootstrap-token\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214865 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-metrics-certs\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214907 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngwcn\" (UniqueName: \"kubernetes.io/projected/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-kube-api-access-ngwcn\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.214991 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1329795d-a8f9-4896-adba-23c2c0da9261-config-volume\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.215202 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/494bb437-45dd-48e3-b932-9c3645e493ef-tmpfs\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.215264 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-audit-policies\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.215300 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-csi-data-dir\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.217096 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5664b98a-83b1-433d-8449-04a982f77fff-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.219393 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.219711 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6ca246d9-b15a-4163-87dc-84b8bc916c4d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.220790 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.220987 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.221240 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.222139 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-default-certificate\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.222324 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a554818d-91a7-48e1-a5a7-5808a5240f3e-config\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.222835 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.223563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.224083 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a93ee425-a2b2-492c-bafc-2443d2fde2d4-serving-cert\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.224427 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/fe3a8c81-8c1d-4b38-9cae-813fb749fd43-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rwt\" (UID: \"fe3a8c81-8c1d-4b38-9cae-813fb749fd43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.224576 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/495803ea-175c-4ad0-ac77-0598ce8213c1-metrics-tls\") pod \"dns-operator-744455d44c-bzvjp\" (UID: \"495803ea-175c-4ad0-ac77-0598ce8213c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.224676 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.224852 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-srv-cert\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.225071 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.226194 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.226216 
4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a93ee425-a2b2-492c-bafc-2443d2fde2d4-etcd-client\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.226399 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-metrics-certs\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.226622 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a554818d-91a7-48e1-a5a7-5808a5240f3e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.227212 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.227653 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e58e5e9a-de88-4209-8100-e9d4e415e68d-stats-auth\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.228642 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ca246d9-b15a-4163-87dc-84b8bc916c4d-proxy-tls\") pod \"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.229067 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5664b98a-83b1-433d-8449-04a982f77fff-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.229595 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/753974fb-c7b2-4e2b-a62d-22544f357c9b-profile-collector-cert\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.231176 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3c8bd61f-4965-4410-9ec7-b858a4529287-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-hqtdl\" (UID: \"3c8bd61f-4965-4410-9ec7-b858a4529287\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.233631 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.241709 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/494bb437-45dd-48e3-b932-9c3645e493ef-webhook-cert\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.246064 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/494bb437-45dd-48e3-b932-9c3645e493ef-apiservice-cert\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.253201 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.260361 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/790c2bc5-e8b1-4943-affd-360042eb1a79-images\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.272889 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.293428 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.307346 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/790c2bc5-e8b1-4943-affd-360042eb1a79-proxy-tls\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.313570 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.316200 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.317429 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 19:35:16.817392846 +0000 UTC m=+218.213446879 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.317568 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.318143 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.818118545 +0000 UTC m=+218.214172608 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.326836 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/fbfbc2e1-ae98-4c40-a739-877e7296f16a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zvl2r\" (UID: \"fbfbc2e1-ae98-4c40-a739-877e7296f16a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.350734 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d98wm\" (UniqueName: \"kubernetes.io/projected/4c7583a8-a980-4ab2-a594-bf55ec72c91c-kube-api-access-d98wm\") pod \"controller-manager-879f6c89f-6lcrf\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.372571 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.373098 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwcst\" (UniqueName: \"kubernetes.io/projected/175c50f5-857d-4697-bcde-2ce47f2edfc5-kube-api-access-gwcst\") pod \"machine-api-operator-5694c8668f-cpx85\" (UID: \"175c50f5-857d-4697-bcde-2ce47f2edfc5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.392812 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 08 19:35:16 crc 
kubenswrapper[4885]: I0308 19:35:16.395349 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.405192 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-serving-cert\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.406980 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.414038 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.419467 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.419834 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.919803021 +0000 UTC m=+218.315857104 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.420175 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.421073 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:16.921036484 +0000 UTC m=+218.317090587 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.433130 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.440126 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-config\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.454711 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.473257 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.490818 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fc52227b-0572-4fed-a5c1-e86521a20e58-signing-key\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.494015 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.513217 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.524317 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.524558 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.024516858 +0000 UTC m=+218.420570921 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.525465 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.526013 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.025996288 +0000 UTC m=+218.422050351 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.538083 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.544982 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fc52227b-0572-4fed-a5c1-e86521a20e58-signing-cabundle\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.554087 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.566109 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/753974fb-c7b2-4e2b-a62d-22544f357c9b-srv-cert\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.573754 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.594521 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.612885 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.627446 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.628657 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.128484585 +0000 UTC m=+218.524538648 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.629176 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.630101 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.130078007 +0000 UTC m=+218.526132060 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.634404 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.641262 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.662706 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.671008 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.673329 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.693253 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.700606 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-cpx85"] Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.701335 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1329795d-a8f9-4896-adba-23c2c0da9261-config-volume\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:16 crc kubenswrapper[4885]: W0308 19:35:16.703159 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod175c50f5_857d_4697_bcde_2ce47f2edfc5.slice/crio-03a3f4e5495e5af6d028f06da63e60eb1a2a3f162d95317269693c9bf7b89b7c WatchSource:0}: Error finding container 03a3f4e5495e5af6d028f06da63e60eb1a2a3f162d95317269693c9bf7b89b7c: Status 404 returned error can't find the container with id 03a3f4e5495e5af6d028f06da63e60eb1a2a3f162d95317269693c9bf7b89b7c Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.704500 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6lcrf"] Mar 08 19:35:16 crc kubenswrapper[4885]: W0308 19:35:16.712284 4885 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c7583a8_a980_4ab2_a594_bf55ec72c91c.slice/crio-4a34d899e46ec2504a4c18a48ad21ba4015069c3cce16e7b4c57e8e3b93a6b39 WatchSource:0}: Error finding container 4a34d899e46ec2504a4c18a48ad21ba4015069c3cce16e7b4c57e8e3b93a6b39: Status 404 returned error can't find the container with id 4a34d899e46ec2504a4c18a48ad21ba4015069c3cce16e7b4c57e8e3b93a6b39 Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.713540 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.734432 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.734695 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.234642261 +0000 UTC m=+218.630696284 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.735798 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.738082 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.238057703 +0000 UTC m=+218.634111796 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.738121 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.753216 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.773724 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.784116 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f932056-01e3-43aa-a91a-7f33d20445ba-config-volume\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.793150 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.813694 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.825016 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6f932056-01e3-43aa-a91a-7f33d20445ba-metrics-tls\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.833875 4885 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.841574 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.841717 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.341683851 +0000 UTC m=+218.737737874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.842471 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.842836 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.34282802 +0000 UTC m=+218.738882043 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.854103 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.872801 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.894166 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.913118 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.926074 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ca80eb80-6964-436a-bf66-0c5fe9b7e641-node-bootstrap-token\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.933466 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.939331 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ca80eb80-6964-436a-bf66-0c5fe9b7e641-certs\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.944081 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.944267 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.44424822 +0000 UTC m=+218.840302243 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.944513 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:16 crc kubenswrapper[4885]: E0308 19:35:16.945011 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.445000899 +0000 UTC m=+218.841054922 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.953463 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.972872 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.984591 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a7420ef-f20d-4d48-a619-627327de2063-cert\") pod \"ingress-canary-jbwsr\" (UID: \"0a7420ef-f20d-4d48-a619-627327de2063\") " pod="openshift-ingress-canary/ingress-canary-jbwsr" Mar 08 19:35:16 crc kubenswrapper[4885]: I0308 19:35:16.993508 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.013553 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.047201 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.047365 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.547328092 +0000 UTC m=+218.943382155 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.048322 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.049260 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
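The repeated UnmountVolume.TearDown and MountVolume.MountDevice failures above share one root cause: at this point in startup the kubelet has no registered CSI driver named kubevirt.io.hostpath-provisioner, so every operation on pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 is re-queued with the logged 500ms backoff. The driver's node plugin pod (hostpath-provisioner/csi-hostpathplugin-tw9z2) is itself only being set up further down in these entries, so the errors clear once it registers. As a minimal sketch (not part of the log; the kubeconfig path is a placeholder), registration can be checked from the API side with client-go: CSIDriver objects show the drivers installed in the cluster, and the CSINode object for node "crc" shows which node plugins have actually registered with this kubelet.

    package main

    import (
    	"context"
    	"fmt"

    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	// Placeholder kubeconfig path; adjust for the environment at hand.
    	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
    	if err != nil {
    		panic(err)
    	}
    	cs, err := kubernetes.NewForConfig(cfg)
    	if err != nil {
    		panic(err)
    	}

    	// CSIDriver objects: drivers installed in the cluster.
    	drivers, err := cs.StorageV1().CSIDrivers().List(context.TODO(), metav1.ListOptions{})
    	if err != nil {
    		panic(err)
    	}
    	for _, d := range drivers.Items {
    		fmt.Println("CSIDriver:", d.Name)
    	}

    	// CSINode "crc": drivers whose node plugins have registered with this kubelet.
    	csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
    	if err != nil {
    		panic(err)
    	}
    	for _, d := range csiNode.Spec.Drivers {
    		fmt.Println("registered on node:", d.Name)
    	}
    }

Until kubevirt.io.hostpath-provisioner appears in that CSINode drivers list, the mount and unmount attempts above keep failing and retrying exactly as logged.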
No retries permitted until 2026-03-08 19:35:17.549170362 +0000 UTC m=+218.945224395 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.058295 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8rh9\" (UniqueName: \"kubernetes.io/projected/df78e76d-5024-4d31-a0b9-17d0d6c6c258-kube-api-access-w8rh9\") pod \"authentication-operator-69f744f599-q8q8m\" (UID: \"df78e76d-5024-4d31-a0b9-17d0d6c6c258\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.071860 4885 request.go:700] Waited for 1.895591995s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-oauth-apiserver/serviceaccounts/oauth-apiserver-sa/token Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.077080 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc92s\" (UniqueName: \"kubernetes.io/projected/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-kube-api-access-hc92s\") pod \"console-f9d7485db-hsdmw\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.097904 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kc2g\" (UniqueName: \"kubernetes.io/projected/ace2a8fd-20b4-40b6-a2ce-3e34454b3c71-kube-api-access-8kc2g\") pod \"apiserver-7bbb656c7d-spqp8\" (UID: \"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.109358 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvxds\" (UniqueName: \"kubernetes.io/projected/f5b425d2-db8e-45f3-a141-8ac7bd678491-kube-api-access-xvxds\") pod \"apiserver-76f77b778f-fq2fp\" (UID: \"f5b425d2-db8e-45f3-a141-8ac7bd678491\") " pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.119840 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.131042 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxk9p\" (UniqueName: \"kubernetes.io/projected/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-kube-api-access-lxk9p\") pod \"route-controller-manager-6576b87f9c-bg5wl\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.137239 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.152383 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.152620 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.652581845 +0000 UTC m=+219.048635898 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.153193 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.153812 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.653790477 +0000 UTC m=+219.049844530 (durationBeforeRetry 500ms). 
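The "Waited for 1.895591995s due to client-side throttling, not priority and fairness" entry above is emitted by client-go's client-side rate limiter (request.go), not by API Priority and Fairness on the server: during startup the kubelet issues a burst of token and secret requests and its own client briefly queues them. For illustration only, the knobs involved are the QPS and Burst fields of a client-go rest.Config; the values below are arbitrary examples, not the kubelet's actual settings.

    package main

    import (
    	"fmt"

    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	// Placeholder kubeconfig path.
    	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
    	if err != nil {
    		panic(err)
    	}
    	// cfg is a *rest.Config; its QPS and Burst fields drive the default
    	// client-side rate limiter that prints "Waited for ... due to
    	// client-side throttling" when requests have to queue.
    	cfg.QPS = 50    // example value only
    	cfg.Burst = 100 // example value only
    	fmt.Printf("client rate limits: qps=%v burst=%v\n", cfg.QPS, cfg.Burst)
    }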
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.178838 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5jp7\" (UniqueName: \"kubernetes.io/projected/75c67b6f-14bc-4d96-a6b6-ae020ace5353-kube-api-access-l5jp7\") pod \"machine-approver-56656f9798-f6tw8\" (UID: \"75c67b6f-14bc-4d96-a6b6-ae020ace5353\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.187878 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrkrt\" (UniqueName: \"kubernetes.io/projected/5a244e04-1aec-4355-89c5-794667b5969f-kube-api-access-jrkrt\") pod \"openshift-config-operator-7777fb866f-ftkzn\" (UID: \"5a244e04-1aec-4355-89c5-794667b5969f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.206883 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf8hq\" (UniqueName: \"kubernetes.io/projected/5ac2fbf9-c9bb-4ef8-988f-4407e688ad54-kube-api-access-xf8hq\") pod \"downloads-7954f5f757-t2b7w\" (UID: \"5ac2fbf9-c9bb-4ef8-988f-4407e688ad54\") " pod="openshift-console/downloads-7954f5f757-t2b7w" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.210624 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-t2b7w" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.213855 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbpxs\" (UniqueName: \"kubernetes.io/projected/96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1-kube-api-access-gbpxs\") pod \"cluster-samples-operator-665b6dd947-wbbxm\" (UID: \"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.231388 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkm6d\" (UniqueName: \"kubernetes.io/projected/9878f6fd-fc1f-4980-a687-84478d0b92c1-kube-api-access-pkm6d\") pod \"openshift-apiserver-operator-796bbdcf4f-czwph\" (UID: \"9878f6fd-fc1f-4980-a687-84478d0b92c1\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.241829 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.254557 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-447lv\" (UniqueName: \"kubernetes.io/projected/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-kube-api-access-447lv\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.256304 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.257066 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.757043844 +0000 UTC m=+219.153097877 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.276260 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46d0f7c6-3622-4e8a-885a-8f85ac63c36f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-mxxgt\" (UID: \"46d0f7c6-3622-4e8a-885a-8f85ac63c36f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.302277 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b4zh\" (UniqueName: \"kubernetes.io/projected/6396835c-4d1e-4b5d-a6f1-4f8003f073e9-kube-api-access-2b4zh\") pod \"console-operator-58897d9998-2tz9t\" (UID: \"6396835c-4d1e-4b5d-a6f1-4f8003f073e9\") " pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.316775 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.330702 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.331735 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4928f728-c20b-4d8e-83f3-786cf90cf3e6-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.349041 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-bound-sa-token\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.350691 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.360266 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.360704 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.860690293 +0000 UTC m=+219.256744326 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.367988 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfrhm\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-kube-api-access-mfrhm\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.371902 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.398304 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pld28\" (UniqueName: \"kubernetes.io/projected/4928f728-c20b-4d8e-83f3-786cf90cf3e6-kube-api-access-pld28\") pod \"cluster-image-registry-operator-dc59b4c8b-c6xkj\" (UID: \"4928f728-c20b-4d8e-83f3-786cf90cf3e6\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.411090 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-q8q8m"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.412077 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.413889 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7ce8edea-f754-4ee2-a475-2022f99ed7f9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-kr78b\" (UID: \"7ce8edea-f754-4ee2-a475-2022f99ed7f9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.427211 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb554\" (UniqueName: \"kubernetes.io/projected/f2a6bad6-cd1e-4e38-88fe-d531ea458683-kube-api-access-jb554\") pod \"openshift-controller-manager-operator-756b6f6bc6-rsdtz\" (UID: \"f2a6bad6-cd1e-4e38-88fe-d531ea458683\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.448196 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jplnv\" (UniqueName: \"kubernetes.io/projected/a93ee425-a2b2-492c-bafc-2443d2fde2d4-kube-api-access-jplnv\") pod \"etcd-operator-b45778765-q9q8c\" (UID: \"a93ee425-a2b2-492c-bafc-2443d2fde2d4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.460835 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.461013 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.960988012 +0000 UTC m=+219.357042035 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.461173 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.461558 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:17.961548826 +0000 UTC m=+219.357602849 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.471131 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hsdmw"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.475539 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngzsj\" (UniqueName: \"kubernetes.io/projected/e58e5e9a-de88-4209-8100-e9d4e415e68d-kube-api-access-ngzsj\") pod \"router-default-5444994796-lvfcn\" (UID: \"e58e5e9a-de88-4209-8100-e9d4e415e68d\") " pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.484108 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.503101 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rjfm\" (UniqueName: \"kubernetes.io/projected/495803ea-175c-4ad0-ac77-0598ce8213c1-kube-api-access-7rjfm\") pod \"dns-operator-744455d44c-bzvjp\" (UID: \"495803ea-175c-4ad0-ac77-0598ce8213c1\") " pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.503538 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.504667 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-t2b7w"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.514860 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4fv6\" (UniqueName: \"kubernetes.io/projected/1329795d-a8f9-4896-adba-23c2c0da9261-kube-api-access-k4fv6\") pod \"collect-profiles-29549970-lfczv\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:17 crc kubenswrapper[4885]: W0308 19:35:17.518885 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdbb4f97_c9c8_43ef_a4b1_06dea8d6d8b9.slice/crio-3f15d4fa347fa026085bfce2e909833cf87859b7db626e5ef40b0541cc513c59 WatchSource:0}: Error finding container 3f15d4fa347fa026085bfce2e909833cf87859b7db626e5ef40b0541cc513c59: Status 404 returned error can't find the container with id 3f15d4fa347fa026085bfce2e909833cf87859b7db626e5ef40b0541cc513c59 Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.523119 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" Mar 08 19:35:17 crc kubenswrapper[4885]: W0308 19:35:17.534276 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ac2fbf9_c9bb_4ef8_988f_4407e688ad54.slice/crio-1b9f9770f747e4ea9de3b28aa6bfcfef1e38616b82e25b5011012445d9dbb83d WatchSource:0}: Error finding container 1b9f9770f747e4ea9de3b28aa6bfcfef1e38616b82e25b5011012445d9dbb83d: Status 404 returned error can't find the container with id 1b9f9770f747e4ea9de3b28aa6bfcfef1e38616b82e25b5011012445d9dbb83d Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.538463 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.543442 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5664b98a-83b1-433d-8449-04a982f77fff-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-jv6w9\" (UID: \"5664b98a-83b1-433d-8449-04a982f77fff\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.552481 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.553405 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-fq2fp"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.558750 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.560394 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jjd6\" (UniqueName: \"kubernetes.io/projected/fc52227b-0572-4fed-a5c1-e86521a20e58-kube-api-access-9jjd6\") pod \"service-ca-9c57cc56f-5ghwr\" (UID: \"fc52227b-0572-4fed-a5c1-e86521a20e58\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.562523 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.563563 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.563592 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.063571531 +0000 UTC m=+219.459625554 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.563807 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" event={"ID":"75c67b6f-14bc-4d96-a6b6-ae020ace5353","Type":"ContainerStarted","Data":"dca6385e3ff6d75cd66d79ce65f428f2299199dccdcceb22795c5b4d923f14af"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.566866 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" event={"ID":"175c50f5-857d-4697-bcde-2ce47f2edfc5","Type":"ContainerStarted","Data":"7c53dc93eac2dc1186590567f3f6247b507dc5b2be999e78057b8c12727c7ec5"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.566962 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" event={"ID":"175c50f5-857d-4697-bcde-2ce47f2edfc5","Type":"ContainerStarted","Data":"8442bacc49b0d28ad2d61c70a11b459dc87988fb6032f517c9384c95838ab3f8"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.566993 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" event={"ID":"175c50f5-857d-4697-bcde-2ce47f2edfc5","Type":"ContainerStarted","Data":"03a3f4e5495e5af6d028f06da63e60eb1a2a3f162d95317269693c9bf7b89b7c"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.576711 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g597v\" (UniqueName: \"kubernetes.io/projected/6ca246d9-b15a-4163-87dc-84b8bc916c4d-kube-api-access-g597v\") pod 
\"machine-config-controller-84d6567774-4gqrl\" (UID: \"6ca246d9-b15a-4163-87dc-84b8bc916c4d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.577011 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" event={"ID":"4c7583a8-a980-4ab2-a594-bf55ec72c91c","Type":"ContainerStarted","Data":"9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.577051 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" event={"ID":"4c7583a8-a980-4ab2-a594-bf55ec72c91c","Type":"ContainerStarted","Data":"4a34d899e46ec2504a4c18a48ad21ba4015069c3cce16e7b4c57e8e3b93a6b39"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.577225 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.578649 4885 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6lcrf container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.578688 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" podUID="4c7583a8-a980-4ab2-a594-bf55ec72c91c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.581528 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" event={"ID":"df78e76d-5024-4d31-a0b9-17d0d6c6c258","Type":"ContainerStarted","Data":"888bff3f11c36a2990ba40c2b1027e5a0de19de32078e51bf9bb026c6597982a"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.584022 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hsdmw" event={"ID":"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9","Type":"ContainerStarted","Data":"3f15d4fa347fa026085bfce2e909833cf87859b7db626e5ef40b0541cc513c59"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.590431 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92jf4\" (UniqueName: \"kubernetes.io/projected/89561acc-f596-4f61-95b9-0cbc686a0b47-kube-api-access-92jf4\") pod \"migrator-59844c95c7-qnq6k\" (UID: \"89561acc-f596-4f61-95b9-0cbc686a0b47\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.608996 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5msmd\" (UniqueName: \"kubernetes.io/projected/0bc91e4d-f2d9-494a-bca6-4a55cc82823b-kube-api-access-5msmd\") pod \"csi-hostpathplugin-tw9z2\" (UID: \"0bc91e4d-f2d9-494a-bca6-4a55cc82823b\") " pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.616032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t2b7w" 
event={"ID":"5ac2fbf9-c9bb-4ef8-988f-4407e688ad54","Type":"ContainerStarted","Data":"1b9f9770f747e4ea9de3b28aa6bfcfef1e38616b82e25b5011012445d9dbb83d"} Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.618612 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.629844 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqgg4\" (UniqueName: \"kubernetes.io/projected/fbfbc2e1-ae98-4c40-a739-877e7296f16a-kube-api-access-lqgg4\") pod \"package-server-manager-789f6589d5-zvl2r\" (UID: \"fbfbc2e1-ae98-4c40-a739-877e7296f16a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.632999 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.649471 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.651227 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np254\" (UniqueName: \"kubernetes.io/projected/fe3a8c81-8c1d-4b38-9cae-813fb749fd43-kube-api-access-np254\") pod \"control-plane-machine-set-operator-78cbb6b69f-k2rwt\" (UID: \"fe3a8c81-8c1d-4b38-9cae-813fb749fd43\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.657350 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.671538 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a554818d-91a7-48e1-a5a7-5808a5240f3e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vdqjm\" (UID: \"a554818d-91a7-48e1-a5a7-5808a5240f3e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.673241 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.673603 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.173586451 +0000 UTC m=+219.569640474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.693100 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.693598 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.694578 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrsm5\" (UniqueName: \"kubernetes.io/projected/790c2bc5-e8b1-4943-affd-360042eb1a79-kube-api-access-rrsm5\") pod \"machine-config-operator-74547568cd-gqm6w\" (UID: \"790c2bc5-e8b1-4943-affd-360042eb1a79\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.698770 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.706534 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.711552 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb2k8\" (UniqueName: \"kubernetes.io/projected/9930a19e-2aa9-42ec-91fc-16cd50bc2f40-kube-api-access-hb2k8\") pod \"kube-storage-version-migrator-operator-b67b599dd-2p9hf\" (UID: \"9930a19e-2aa9-42ec-91fc-16cd50bc2f40\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.721313 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.734263 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjdvb\" (UniqueName: \"kubernetes.io/projected/d008be41-8eac-496a-9c3d-083014dc402c-kube-api-access-sjdvb\") pod \"oauth-openshift-558db77b4-2bp7t\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.747664 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v24rf\" (UniqueName: \"kubernetes.io/projected/3c8bd61f-4965-4410-9ec7-b858a4529287-kube-api-access-v24rf\") pod \"multus-admission-controller-857f4d67dd-hqtdl\" (UID: \"3c8bd61f-4965-4410-9ec7-b858a4529287\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.756196 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.757100 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-2tz9t"] Mar 08 19:35:17 crc kubenswrapper[4885]: W0308 19:35:17.761624 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode58e5e9a_de88_4209_8100_e9d4e415e68d.slice/crio-287c51e206e17f70938901810509dcd30ad1e8cf6e7f98fb4ab1340276721e91 WatchSource:0}: Error finding container 287c51e206e17f70938901810509dcd30ad1e8cf6e7f98fb4ab1340276721e91: Status 404 returned error can't find the container with id 287c51e206e17f70938901810509dcd30ad1e8cf6e7f98fb4ab1340276721e91 Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.768017 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtctx\" (UniqueName: \"kubernetes.io/projected/60da1edb-8474-4368-a6ae-0bb2b1b7b845-kube-api-access-qtctx\") pod \"auto-csr-approver-29549974-jjqkh\" (UID: \"60da1edb-8474-4368-a6ae-0bb2b1b7b845\") " pod="openshift-infra/auto-csr-approver-29549974-jjqkh" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.776698 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.777101 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.277083136 +0000 UTC m=+219.673137159 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.798696 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp85f\" (UniqueName: \"kubernetes.io/projected/6f932056-01e3-43aa-a91a-7f33d20445ba-kube-api-access-pp85f\") pod \"dns-default-s5pkw\" (UID: \"6f932056-01e3-43aa-a91a-7f33d20445ba\") " pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.821567 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.830284 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q64h9\" (UniqueName: \"kubernetes.io/projected/ca80eb80-6964-436a-bf66-0c5fe9b7e641-kube-api-access-q64h9\") pod \"machine-config-server-bl88k\" (UID: \"ca80eb80-6964-436a-bf66-0c5fe9b7e641\") " pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.831950 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.832637 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.840535 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rvsd\" (UniqueName: \"kubernetes.io/projected/0a7420ef-f20d-4d48-a619-627327de2063-kube-api-access-9rvsd\") pod \"ingress-canary-jbwsr\" (UID: \"0a7420ef-f20d-4d48-a619-627327de2063\") " pod="openshift-ingress-canary/ingress-canary-jbwsr" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.840739 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.851165 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6drnf\" (UniqueName: \"kubernetes.io/projected/753974fb-c7b2-4e2b-a62d-22544f357c9b-kube-api-access-6drnf\") pod \"catalog-operator-68c6474976-h2bp5\" (UID: \"753974fb-c7b2-4e2b-a62d-22544f357c9b\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.869099 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.870003 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqf5f\" (UniqueName: \"kubernetes.io/projected/83de4c2d-767a-4635-8748-486dd45683a1-kube-api-access-pqf5f\") pod \"marketplace-operator-79b997595-ldvgz\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.876055 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.877958 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.878321 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.378310079 +0000 UTC m=+219.774364102 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.905135 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s7wt\" (UniqueName: \"kubernetes.io/projected/6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459-kube-api-access-4s7wt\") pod \"service-ca-operator-777779d784-dmtc7\" (UID: \"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.929664 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngwcn\" (UniqueName: \"kubernetes.io/projected/82ad04de-932b-4ebf-97cf-0a6344ee1a9e-kube-api-access-ngwcn\") pod \"olm-operator-6b444d44fb-m7ttr\" (UID: \"82ad04de-932b-4ebf-97cf-0a6344ee1a9e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.930143 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.931545 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.934965 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwqcf\" (UniqueName: \"kubernetes.io/projected/494bb437-45dd-48e3-b932-9c3645e493ef-kube-api-access-fwqcf\") pod \"packageserver-d55dfcdfc-j7xfr\" (UID: \"494bb437-45dd-48e3-b932-9c3645e493ef\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.937895 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.944418 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.963070 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.968763 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.969686 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.973695 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.978188 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.979464 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.979648 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.479621074 +0000 UTC m=+219.875675097 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.980105 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:17 crc kubenswrapper[4885]: E0308 19:35:17.980416 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.480403815 +0000 UTC m=+219.876457838 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.982749 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q9q8c"] Mar 08 19:35:17 crc kubenswrapper[4885]: I0308 19:35:17.995264 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.001208 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.015176 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.035202 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549974-jjqkh" Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.043188 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.051790 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k"] Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.070251 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-bl88k" Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.074231 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jbwsr" Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.081419 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.081561 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.581540296 +0000 UTC m=+219.977594309 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.081726 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.082111 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.582096762 +0000 UTC m=+219.978150855 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:18 crc kubenswrapper[4885]: W0308 19:35:18.168142 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2a6bad6_cd1e_4e38_88fe_d531ea458683.slice/crio-2b4b4d6c691c9bf385d1aca10675c3a0f6e1dce4c0d75c0a9518dc576f033a3a WatchSource:0}: Error finding container 2b4b4d6c691c9bf385d1aca10675c3a0f6e1dce4c0d75c0a9518dc576f033a3a: Status 404 returned error can't find the container with id 2b4b4d6c691c9bf385d1aca10675c3a0f6e1dce4c0d75c0a9518dc576f033a3a Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.182644 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.182964 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.682812811 +0000 UTC m=+220.078866834 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.183298 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.183621 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.683610153 +0000 UTC m=+220.079664176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.284312 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.284673 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.784658091 +0000 UTC m=+220.180712114 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.385871 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.386224 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.886213144 +0000 UTC m=+220.282267167 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.487713 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.488065 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.988036333 +0000 UTC m=+220.384090346 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.488250 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.488652 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:18.98864458 +0000 UTC m=+220.384698603 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.589098 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.589461 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.089446112 +0000 UTC m=+220.485500135 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.683269 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" event={"ID":"5a244e04-1aec-4355-89c5-794667b5969f","Type":"ContainerStarted","Data":"9be25d7733e293f02f15db81210b4d6937d933a0f0c84184e04a7e5621c56df8"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.700957 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.701374 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.201360012 +0000 UTC m=+220.597414035 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.721073 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9"] Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.750236 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" podStartSLOduration=183.750220312 podStartE2EDuration="3m3.750220312s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:18.749206184 +0000 UTC m=+220.145260207" watchObservedRunningTime="2026-03-08 19:35:18.750220312 +0000 UTC m=+220.146274335" Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.755793 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-bzvjp"] Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.762029 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv"] Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.762054 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl"] Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.789626 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r"] Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.789669 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2bp7t"] Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.795248 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" event={"ID":"9878f6fd-fc1f-4980-a687-84478d0b92c1","Type":"ContainerStarted","Data":"91d1f1a1e8874b5d7a868e5fe1a550f38d03387f0601c1f60d2396a313caae96"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.800268 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" event={"ID":"f5b425d2-db8e-45f3-a141-8ac7bd678491","Type":"ContainerStarted","Data":"6da66b4547f5616c03795376a3643219df504dc7a6844e33d368b86daeb04269"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.801825 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.803940 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.303904191 +0000 UTC m=+220.699958214 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.815285 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hsdmw" event={"ID":"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9","Type":"ContainerStarted","Data":"72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.823325 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" event={"ID":"46d0f7c6-3622-4e8a-885a-8f85ac63c36f","Type":"ContainerStarted","Data":"a75c342d6c7128f5cfc7ba5e493b603a907baec57caa4dec6a6301b3aa095a95"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.849701 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" event={"ID":"df78e76d-5024-4d31-a0b9-17d0d6c6c258","Type":"ContainerStarted","Data":"726d5822e3dcd21bba0f1b655f119ee9a19a61beb8108cae545b921c5f421cd3"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.854646 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" event={"ID":"a93ee425-a2b2-492c-bafc-2443d2fde2d4","Type":"ContainerStarted","Data":"af8ada1c83d377a083cd80c83ba67ca575cd054d01c6637a2e87cb70a37f07ea"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.867099 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" event={"ID":"75c67b6f-14bc-4d96-a6b6-ae020ace5353","Type":"ContainerStarted","Data":"6ebe829d87ca90bab0852d90ee536bd6df6c7241943f58baf5f947d0c1a63cfd"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.875524 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" event={"ID":"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71","Type":"ContainerStarted","Data":"5a1fd3ef96a73a5fc6f9284be43b3b9835d635e847a666b7836e4da30366a98d"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.891303 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-t2b7w" event={"ID":"5ac2fbf9-c9bb-4ef8-988f-4407e688ad54","Type":"ContainerStarted","Data":"5c7f415af0a310e0e5ec815fb8b99f9af5a695746f634a306a170ddae1dc5ab1"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.893218 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-t2b7w" Mar 08 19:35:18 crc kubenswrapper[4885]: W0308 19:35:18.894990 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca80eb80_6964_436a_bf66_0c5fe9b7e641.slice/crio-e36aa307efc224d05755f51ae7a1ba720dd78d7e212ea7c988fb34b75e8fc132 WatchSource:0}: Error finding container e36aa307efc224d05755f51ae7a1ba720dd78d7e212ea7c988fb34b75e8fc132: Status 404 returned error can't find the 
container with id e36aa307efc224d05755f51ae7a1ba720dd78d7e212ea7c988fb34b75e8fc132 Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.896080 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" event={"ID":"f2a6bad6-cd1e-4e38-88fe-d531ea458683","Type":"ContainerStarted","Data":"2b4b4d6c691c9bf385d1aca10675c3a0f6e1dce4c0d75c0a9518dc576f033a3a"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.904870 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:18 crc kubenswrapper[4885]: E0308 19:35:18.907337 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.407322593 +0000 UTC m=+220.803376616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.922387 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" event={"ID":"7ce8edea-f754-4ee2-a475-2022f99ed7f9","Type":"ContainerStarted","Data":"88f02e527e05546477939405952a27f505b7ed5c0e9bb3144acc4432372dbfac"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.932476 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" event={"ID":"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0","Type":"ContainerStarted","Data":"40c776312848989ccb2eddcff1adbedd8af2200ce48ffdc1e4635ef09792719e"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.932806 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.943585 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" event={"ID":"6396835c-4d1e-4b5d-a6f1-4f8003f073e9","Type":"ContainerStarted","Data":"2f12d83b6e520da0ff0a1f7171d278c8ae9e7493aa1a2f58395b447dfa4e0de4"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.944178 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.946374 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lvfcn" event={"ID":"e58e5e9a-de88-4209-8100-e9d4e415e68d","Type":"ContainerStarted","Data":"287c51e206e17f70938901810509dcd30ad1e8cf6e7f98fb4ab1340276721e91"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 
19:35:18.950892 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" event={"ID":"89561acc-f596-4f61-95b9-0cbc686a0b47","Type":"ContainerStarted","Data":"9e732dd141ffc6d56acf5e0362491a88c049308d550bfef78ae76c517eb37aa4"} Mar 08 19:35:18 crc kubenswrapper[4885]: I0308 19:35:18.985653 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-cpx85" podStartSLOduration=182.985635012 podStartE2EDuration="3m2.985635012s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:18.984229235 +0000 UTC m=+220.380283278" watchObservedRunningTime="2026-03-08 19:35:18.985635012 +0000 UTC m=+220.381689035" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.005627 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.007640 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.507618992 +0000 UTC m=+220.903673005 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.054462 4885 patch_prober.go:28] interesting pod/console-operator-58897d9998-2tz9t container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.054509 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" podUID="6396835c-4d1e-4b5d-a6f1-4f8003f073e9" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.054787 4885 patch_prober.go:28] interesting pod/downloads-7954f5f757-t2b7w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.054841 4885 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bg5wl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: 
connection refused" start-of-body= Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.054850 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t2b7w" podUID="5ac2fbf9-c9bb-4ef8-988f-4407e688ad54" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.054863 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" podUID="f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.063588 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.110139 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.113682 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.613659634 +0000 UTC m=+221.009713657 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.214279 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.215322 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.715305349 +0000 UTC m=+221.111359372 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.316584 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.316860 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.816848511 +0000 UTC m=+221.212902534 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.417562 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.417932 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:19.91790127 +0000 UTC m=+221.313955293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.519118 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.519769 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.01975295 +0000 UTC m=+221.415806973 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.541709 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-tw9z2"] Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.554271 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.566815 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj"] Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.591520 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf"] Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.599880 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.599938 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.623959 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.624405 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.124390525 +0000 UTC m=+221.520444538 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.683438 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" podStartSLOduration=183.683405317 podStartE2EDuration="3m3.683405317s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:19.680847268 +0000 UTC m=+221.076901291" watchObservedRunningTime="2026-03-08 19:35:19.683405317 +0000 UTC m=+221.079459340" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.715427 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-t2b7w" podStartSLOduration=184.715407805 podStartE2EDuration="3m4.715407805s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:19.714788809 +0000 UTC m=+221.110842832" watchObservedRunningTime="2026-03-08 19:35:19.715407805 +0000 UTC m=+221.111461828" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.725016 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.725294 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.22528296 +0000 UTC m=+221.621336983 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.728196 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr"] Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.730606 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm"] Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.782089 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-hsdmw" podStartSLOduration=184.782067212 podStartE2EDuration="3m4.782067212s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:19.761027788 +0000 UTC m=+221.157081811" watchObservedRunningTime="2026-03-08 19:35:19.782067212 +0000 UTC m=+221.178121225" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.826528 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.827961 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.32790195 +0000 UTC m=+221.723955973 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.828525 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt"] Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.858440 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" podStartSLOduration=184.858423879 podStartE2EDuration="3m4.858423879s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:19.856087877 +0000 UTC m=+221.252141890" watchObservedRunningTime="2026-03-08 19:35:19.858423879 +0000 UTC m=+221.254477902" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.875299 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5ghwr"] Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.884703 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-lvfcn" podStartSLOduration=183.884688673 podStartE2EDuration="3m3.884688673s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:19.881734474 +0000 UTC m=+221.277788497" watchObservedRunningTime="2026-03-08 19:35:19.884688673 +0000 UTC m=+221.280742696" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.912561 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-q8q8m" podStartSLOduration=184.91254454 podStartE2EDuration="3m4.91254454s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:19.91034708 +0000 UTC m=+221.306401103" watchObservedRunningTime="2026-03-08 19:35:19.91254454 +0000 UTC m=+221.308598563" Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.929269 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:19 crc kubenswrapper[4885]: E0308 19:35:19.929752 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.42973612 +0000 UTC m=+221.825790143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.961510 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w"] Mar 08 19:35:19 crc kubenswrapper[4885]: W0308 19:35:19.967509 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc52227b_0572_4fed_a5c1_e86521a20e58.slice/crio-d6324ba7e136849d35c67b5f0bd01bf24c3a978e92e681f166a89fef91a40b61 WatchSource:0}: Error finding container d6324ba7e136849d35c67b5f0bd01bf24c3a978e92e681f166a89fef91a40b61: Status 404 returned error can't find the container with id d6324ba7e136849d35c67b5f0bd01bf24c3a978e92e681f166a89fef91a40b61 Mar 08 19:35:19 crc kubenswrapper[4885]: I0308 19:35:19.970533 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549974-jjqkh"] Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.010600 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jbwsr"] Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.022203 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" event={"ID":"1329795d-a8f9-4896-adba-23c2c0da9261","Type":"ContainerStarted","Data":"134df6f2646bd48f24f5f5499ba5d32597cef8343a7f0bc84c8f90089d6df4b2"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.030946 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.031308 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.531285723 +0000 UTC m=+221.927339746 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.032230 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" event={"ID":"75c67b6f-14bc-4d96-a6b6-ae020ace5353","Type":"ContainerStarted","Data":"0f8119b0d68e82807f3907440f0ab6a1dd0c951f0bc34c5ea8883d882aa8768e"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.035472 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" event={"ID":"fbfbc2e1-ae98-4c40-a739-877e7296f16a","Type":"ContainerStarted","Data":"24fa952afc1cee799dbe57d90ca6a31ef015131f7bcb28a28e0cae5629e156af"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.044124 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" event={"ID":"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0","Type":"ContainerStarted","Data":"8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.053811 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" event={"ID":"89561acc-f596-4f61-95b9-0cbc686a0b47","Type":"ContainerStarted","Data":"822026861bd7bd3eb3b4abc852e4a8b2ee41fbc101e83faae4ea4adda87b4aba"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.058191 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" event={"ID":"fe3a8c81-8c1d-4b38-9cae-813fb749fd43","Type":"ContainerStarted","Data":"22c1696bfe5d42dda6bd4ec2ff027c09be34e25612f324559a931a48e4fe7037"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.066760 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" event={"ID":"4928f728-c20b-4d8e-83f3-786cf90cf3e6","Type":"ContainerStarted","Data":"5c303026128bc9a23d184cd4393e6b606778e0a778c0e10cfe8f03f268ec44d2"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.084754 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" event={"ID":"7ce8edea-f754-4ee2-a475-2022f99ed7f9","Type":"ContainerStarted","Data":"f66c621893c1804a2ce9dd2c9e30230e8c52596e34c2e42465a9285e6768bcf2"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.086245 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" event={"ID":"9878f6fd-fc1f-4980-a687-84478d0b92c1","Type":"ContainerStarted","Data":"e0cadddc4c54fc080453871aee1452224b0f7cf0ac2e114f1399e3a163a6efb0"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.092151 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" 
event={"ID":"6396835c-4d1e-4b5d-a6f1-4f8003f073e9","Type":"ContainerStarted","Data":"9ca9706ffcf72c3c73f90c5b8b579cbad436f32afa85e40bad562c9ecb6198c7"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.093195 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" event={"ID":"5664b98a-83b1-433d-8449-04a982f77fff","Type":"ContainerStarted","Data":"5f22ebcb294fc868cd9864233184817834561a14db9eecdc3410944896bd4b06"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.094120 4885 patch_prober.go:28] interesting pod/console-operator-58897d9998-2tz9t container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.094169 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" podUID="6396835c-4d1e-4b5d-a6f1-4f8003f073e9" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.095172 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" event={"ID":"494bb437-45dd-48e3-b932-9c3645e493ef","Type":"ContainerStarted","Data":"f659fc375e81a0da3b3a8a5764dac9643b37e01b75000e4baf0c2c826224551b"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.103263 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-hqtdl"] Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.105263 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" event={"ID":"46d0f7c6-3622-4e8a-885a-8f85ac63c36f","Type":"ContainerStarted","Data":"cfbedb223e1048892c1331c9f68c07cd8c7ed4be2b1b10fb1fd66bb675ec0496"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.116056 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr"] Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.126147 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.136869 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.138270 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.63825871 +0000 UTC m=+222.034312733 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.144283 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bl88k" event={"ID":"ca80eb80-6964-436a-bf66-0c5fe9b7e641","Type":"ContainerStarted","Data":"71a55f6115fed54a842d0d8a7fe91a125603a769193f94ec11703612038a1772"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.144318 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-bl88k" event={"ID":"ca80eb80-6964-436a-bf66-0c5fe9b7e641","Type":"ContainerStarted","Data":"e36aa307efc224d05755f51ae7a1ba720dd78d7e212ea7c988fb34b75e8fc132"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.154550 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7"] Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.157596 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-kr78b" podStartSLOduration=184.157580788 podStartE2EDuration="3m4.157580788s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.154887826 +0000 UTC m=+221.550941859" watchObservedRunningTime="2026-03-08 19:35:20.157580788 +0000 UTC m=+221.553634801" Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.164097 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-s5pkw"] Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.186412 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" event={"ID":"6ca246d9-b15a-4163-87dc-84b8bc916c4d","Type":"ContainerStarted","Data":"5984b65b91ed88837ac3d747bb2d9735e94039292fdd6931b4edb014e97f896d"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.186452 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" event={"ID":"6ca246d9-b15a-4163-87dc-84b8bc916c4d","Type":"ContainerStarted","Data":"7e0d027df693db86a17d40da3dc20a54596865a69eda634418b82dad66b0d64b"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.189416 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" podStartSLOduration=184.189406452 podStartE2EDuration="3m4.189406452s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.185527768 +0000 UTC m=+221.581581791" watchObservedRunningTime="2026-03-08 19:35:20.189406452 +0000 UTC m=+221.585460475" Mar 08 19:35:20 crc kubenswrapper[4885]: W0308 19:35:20.200505 4885 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c8bd61f_4965_4410_9ec7_b858a4529287.slice/crio-5b8dbe15e26c016a36c27b2e9fe6adb02153e3419b63e7ccc486e1c23dd52fb8 WatchSource:0}: Error finding container 5b8dbe15e26c016a36c27b2e9fe6adb02153e3419b63e7ccc486e1c23dd52fb8: Status 404 returned error can't find the container with id 5b8dbe15e26c016a36c27b2e9fe6adb02153e3419b63e7ccc486e1c23dd52fb8 Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.204746 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" event={"ID":"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1","Type":"ContainerStarted","Data":"88c0b1af50a552d08be715255e8619226fa4c9e630976040932d73e5d1b633eb"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.228068 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5"] Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.236979 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" event={"ID":"495803ea-175c-4ad0-ac77-0598ce8213c1","Type":"ContainerStarted","Data":"c605a611d56f05303c2dcd605e19bb8a2402ffc1ea2399ec11f51363586ef9b5"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.238147 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-f6tw8" podStartSLOduration=185.238136697 podStartE2EDuration="3m5.238136697s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.237321726 +0000 UTC m=+221.633375749" watchObservedRunningTime="2026-03-08 19:35:20.238136697 +0000 UTC m=+221.634190720" Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.239393 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.240393 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.740371718 +0000 UTC m=+222.136425741 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.276277 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" podStartSLOduration=184.271412039 podStartE2EDuration="3m4.271412039s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.269652713 +0000 UTC m=+221.665706736" watchObservedRunningTime="2026-03-08 19:35:20.271412039 +0000 UTC m=+221.667466062" Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.281856 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ldvgz"] Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.290220 4885 generic.go:334] "Generic (PLEG): container finished" podID="f5b425d2-db8e-45f3-a141-8ac7bd678491" containerID="cbe27be8b7dab43238c6046bdd1ad6b778c046687071c13fbcbbacadf7286eb5" exitCode=0 Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.291793 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" event={"ID":"f5b425d2-db8e-45f3-a141-8ac7bd678491","Type":"ContainerDied","Data":"cbe27be8b7dab43238c6046bdd1ad6b778c046687071c13fbcbbacadf7286eb5"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.297809 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" event={"ID":"f2a6bad6-cd1e-4e38-88fe-d531ea458683","Type":"ContainerStarted","Data":"9600dc055804a3c36d351eb86512c92ddfcc39baafd43cbd5a93f262b40e898e"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.321289 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-czwph" podStartSLOduration=185.321267856 podStartE2EDuration="3m5.321267856s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.320044504 +0000 UTC m=+221.716098527" watchObservedRunningTime="2026-03-08 19:35:20.321267856 +0000 UTC m=+221.717321879" Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.341837 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" event={"ID":"0bc91e4d-f2d9-494a-bca6-4a55cc82823b","Type":"ContainerStarted","Data":"50bf4da5bc456db0d20e51c442b88818b7facd5dec864ab7655b9c4bcb8bd792"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.342118 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.342895 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.842877445 +0000 UTC m=+222.238931468 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.372970 4885 generic.go:334] "Generic (PLEG): container finished" podID="5a244e04-1aec-4355-89c5-794667b5969f" containerID="cd28bccff11894cfc898016d0e9636b64db0338a86899e48f70c2cc6d9936340" exitCode=0 Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.373246 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" event={"ID":"5a244e04-1aec-4355-89c5-794667b5969f","Type":"ContainerDied","Data":"cd28bccff11894cfc898016d0e9636b64db0338a86899e48f70c2cc6d9936340"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.379381 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.381830 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-bl88k" podStartSLOduration=6.381810569 podStartE2EDuration="6.381810569s" podCreationTimestamp="2026-03-08 19:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.360438086 +0000 UTC m=+221.756492109" watchObservedRunningTime="2026-03-08 19:35:20.381810569 +0000 UTC m=+221.777864592" Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.405241 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rsdtz" podStartSLOduration=185.405220847 podStartE2EDuration="3m5.405220847s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.400901461 +0000 UTC m=+221.796955484" watchObservedRunningTime="2026-03-08 19:35:20.405220847 +0000 UTC m=+221.801274870" Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.382257 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" event={"ID":"a554818d-91a7-48e1-a5a7-5808a5240f3e","Type":"ContainerStarted","Data":"5c6b2740317e9c99c70a069e1107a4823751c8d4119f1015bb56c43ea6ad6dba"} Mar 08 19:35:20 crc kubenswrapper[4885]: W0308 19:35:20.399668 4885 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f932056_01e3_43aa_a91a_7f33d20445ba.slice/crio-54af98463d5f942570d4ff3ed6c7eea66416d1158fc848c328eb506d2c95787d WatchSource:0}: Error finding container 54af98463d5f942570d4ff3ed6c7eea66416d1158fc848c328eb506d2c95787d: Status 404 returned error can't find the container with id 54af98463d5f942570d4ff3ed6c7eea66416d1158fc848c328eb506d2c95787d Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.419851 4885 generic.go:334] "Generic (PLEG): container finished" podID="ace2a8fd-20b4-40b6-a2ce-3e34454b3c71" containerID="5726d4714ea936e1fe74b51d8e22ca7342cc7250bda63a0d262e70977c5d6314" exitCode=0 Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.427304 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-lvfcn" event={"ID":"e58e5e9a-de88-4209-8100-e9d4e415e68d","Type":"ContainerStarted","Data":"358af39687fbb3a6287d37b8924ed64ff5df895ca495d93d9dea0acb3df66e4c"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.427415 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" event={"ID":"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71","Type":"ContainerDied","Data":"5726d4714ea936e1fe74b51d8e22ca7342cc7250bda63a0d262e70977c5d6314"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.445464 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.446899 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:20.946884213 +0000 UTC m=+222.342938236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.466546 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" event={"ID":"9930a19e-2aa9-42ec-91fc-16cd50bc2f40","Type":"ContainerStarted","Data":"bd37791fc087646bf00840284fb7bb455597a0b50d52b33de9e0532834430bed"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.495049 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" event={"ID":"d008be41-8eac-496a-9c3d-083014dc402c","Type":"ContainerStarted","Data":"f8be22b6ff0acffcb9ccf46c1c57e368d284120713d460abb8b81012323f6dcf"} Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.495089 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.496812 4885 patch_prober.go:28] interesting pod/downloads-7954f5f757-t2b7w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.496870 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t2b7w" podUID="5ac2fbf9-c9bb-4ef8-988f-4407e688ad54" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.500494 4885 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2bp7t container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.32:6443/healthz\": dial tcp 10.217.0.32:6443: connect: connection refused" start-of-body= Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.500558 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" podUID="d008be41-8eac-496a-9c3d-083014dc402c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.32:6443/healthz\": dial tcp 10.217.0.32:6443: connect: connection refused" Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.548169 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.551553 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-08 19:35:21.051541609 +0000 UTC m=+222.447595632 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.575034 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:20 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:20 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:20 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.575252 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.650484 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.650836 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.15082219 +0000 UTC m=+222.546876213 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.654959 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" podStartSLOduration=185.65490339 podStartE2EDuration="3m5.65490339s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.602050133 +0000 UTC m=+221.998104156" watchObservedRunningTime="2026-03-08 19:35:20.65490339 +0000 UTC m=+222.050957413" Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.655676 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" podStartSLOduration=184.655669561 podStartE2EDuration="3m4.655669561s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:20.651079727 +0000 UTC m=+222.047133750" watchObservedRunningTime="2026-03-08 19:35:20.655669561 +0000 UTC m=+222.051723594" Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.752691 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.753218 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.253205855 +0000 UTC m=+222.649259878 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.853675 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.854070 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.354056118 +0000 UTC m=+222.750110141 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:20 crc kubenswrapper[4885]: I0308 19:35:20.956527 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:20 crc kubenswrapper[4885]: E0308 19:35:20.956845 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.456834094 +0000 UTC m=+222.852888117 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.059361 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.060033 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.560004689 +0000 UTC m=+222.956058712 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.164859 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.165349 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.665335313 +0000 UTC m=+223.061389336 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.268308 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.268479 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.768442307 +0000 UTC m=+223.164496330 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.269514 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.269864 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.769850024 +0000 UTC m=+223.165904047 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.364902 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59112: no serving certificate available for the kubelet" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.383306 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.383843 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.883828 +0000 UTC m=+223.279882013 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.461145 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59116: no serving certificate available for the kubelet" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.484997 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.485321 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:21.98530997 +0000 UTC m=+223.381363993 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.528035 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" event={"ID":"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1","Type":"ContainerStarted","Data":"2afac84c9994094dbcc536c4992e520d150cafe6ff99582d069b7e57a065ee18"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.528078 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" event={"ID":"96e1dff1-5d9e-40ab-bc1b-16d2925bc8b1","Type":"ContainerStarted","Data":"98b356fec6f5a3b07b7f8d550069bc888524e61649dfe9e4f3646cfc3746121b"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.556771 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-wbbxm" podStartSLOduration=186.556756515 podStartE2EDuration="3m6.556756515s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.554793143 +0000 UTC m=+222.950847166" watchObservedRunningTime="2026-03-08 19:35:21.556756515 +0000 UTC m=+222.952810538" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.561692 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:21 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:21 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:21 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.561744 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.574205 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" event={"ID":"89561acc-f596-4f61-95b9-0cbc686a0b47","Type":"ContainerStarted","Data":"ae3259feb4e05dab34376546fa71e134c7db4dd16752a41734b7be4914fcbf6e"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.586247 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.587016 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" 
event={"ID":"3c8bd61f-4965-4410-9ec7-b858a4529287","Type":"ContainerStarted","Data":"a018a113bf2c34ad790622b7e6a21b7df550841ddadd4ac0190d2fb27bfb0729"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.587058 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" event={"ID":"3c8bd61f-4965-4410-9ec7-b858a4529287","Type":"ContainerStarted","Data":"5b8dbe15e26c016a36c27b2e9fe6adb02153e3419b63e7ccc486e1c23dd52fb8"} Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.587161 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.08714484 +0000 UTC m=+223.483198863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.589369 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" event={"ID":"fc52227b-0572-4fed-a5c1-e86521a20e58","Type":"ContainerStarted","Data":"d2c8fcad6617015b58eacd10f15825587255865b024a8bd694205a7bb7310736"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.589406 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" event={"ID":"fc52227b-0572-4fed-a5c1-e86521a20e58","Type":"ContainerStarted","Data":"d6324ba7e136849d35c67b5f0bd01bf24c3a978e92e681f166a89fef91a40b61"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.592760 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" event={"ID":"494bb437-45dd-48e3-b932-9c3645e493ef","Type":"ContainerStarted","Data":"061e204cf4b604d0aff77e0470b12de1bd86fd4d554678c8a8f235c4aec2236b"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.593188 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.605211 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" event={"ID":"495803ea-175c-4ad0-ac77-0598ce8213c1","Type":"ContainerStarted","Data":"0236a696a6278a43e30ab9924a31d19da4cc309a43f9302effc299193e780900"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.611473 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" event={"ID":"a93ee425-a2b2-492c-bafc-2443d2fde2d4","Type":"ContainerStarted","Data":"8a7d4b715db9b88eb0212a5cb14539c74237db982628ac0efce6aff9d1e1558a"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.617227 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59118: no serving certificate available for the kubelet" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.619409 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" event={"ID":"82ad04de-932b-4ebf-97cf-0a6344ee1a9e","Type":"ContainerStarted","Data":"78e9a56d72798428d2805265d6ca05b9c1b61dddabc9930818607bcb302bae04"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.619453 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" event={"ID":"82ad04de-932b-4ebf-97cf-0a6344ee1a9e","Type":"ContainerStarted","Data":"8e098fe83bb84f9477f906d20387594d444eacdb41e849a6af7be74a7aef91f8"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.620308 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.621506 4885 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-m7ttr container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.621565 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" podUID="82ad04de-932b-4ebf-97cf-0a6344ee1a9e" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.632340 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" event={"ID":"790c2bc5-e8b1-4943-affd-360042eb1a79","Type":"ContainerStarted","Data":"7610278a1d339674636069d701d3a641d470b7cda8e0f49ee642589c394b1a09"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.632374 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" event={"ID":"790c2bc5-e8b1-4943-affd-360042eb1a79","Type":"ContainerStarted","Data":"75d8a238176218ff48dca8d9d8d7315e732593335742237740bd86a1389d3f47"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.632384 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" event={"ID":"790c2bc5-e8b1-4943-affd-360042eb1a79","Type":"ContainerStarted","Data":"a82a9753b5a0ab0015b44e21844d5750af26eb5e84e8dedb6e6fa09124c3d003"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.653733 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" event={"ID":"4928f728-c20b-4d8e-83f3-786cf90cf3e6","Type":"ContainerStarted","Data":"251fc07b69057bce26461a677acd5d783780fe5344eef1cc3d7eeec2e562456b"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.659662 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" event={"ID":"d008be41-8eac-496a-9c3d-083014dc402c","Type":"ContainerStarted","Data":"4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.661308 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qnq6k" podStartSLOduration=185.661293468 podStartE2EDuration="3m5.661293468s" 
podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.631663974 +0000 UTC m=+223.027717997" watchObservedRunningTime="2026-03-08 19:35:21.661293468 +0000 UTC m=+223.057347491" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.661760 4885 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-2bp7t container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.32:6443/healthz\": dial tcp 10.217.0.32:6443: connect: connection refused" start-of-body= Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.661813 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" podUID="d008be41-8eac-496a-9c3d-083014dc402c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.32:6443/healthz\": dial tcp 10.217.0.32:6443: connect: connection refused" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.663718 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" podStartSLOduration=185.663713303 podStartE2EDuration="3m5.663713303s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.661169745 +0000 UTC m=+223.057223768" watchObservedRunningTime="2026-03-08 19:35:21.663713303 +0000 UTC m=+223.059767326" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.687847 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.689861 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.189843393 +0000 UTC m=+223.585897416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.700229 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" event={"ID":"a554818d-91a7-48e1-a5a7-5808a5240f3e","Type":"ContainerStarted","Data":"7a4e0d8b5dd04f3c2b9599f08805d9622c4ab1441d28e223e4066a951d90c221"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.736805 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-jv6w9" event={"ID":"5664b98a-83b1-433d-8449-04a982f77fff","Type":"ContainerStarted","Data":"addab8fadc44035b194d48f810c6120eb7ce9d3c5ffdb8a3fb6c3d2b4394803e"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.738188 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" podStartSLOduration=186.738178329 podStartE2EDuration="3m6.738178329s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.694026855 +0000 UTC m=+223.090080878" watchObservedRunningTime="2026-03-08 19:35:21.738178329 +0000 UTC m=+223.134232352" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.738481 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-q9q8c" podStartSLOduration=185.738476227 podStartE2EDuration="3m5.738476227s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.738012995 +0000 UTC m=+223.134067018" watchObservedRunningTime="2026-03-08 19:35:21.738476227 +0000 UTC m=+223.134530250" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.755191 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549974-jjqkh" event={"ID":"60da1edb-8474-4368-a6ae-0bb2b1b7b845","Type":"ContainerStarted","Data":"37aa47577a13b111a43340382f6d3fe85ac7c15ce7d0d5052fe2dc63b6ee4d2c"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.769193 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59126: no serving certificate available for the kubelet" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.779711 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5ghwr" podStartSLOduration=185.779696612 podStartE2EDuration="3m5.779696612s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.779355173 +0000 UTC m=+223.175409196" watchObservedRunningTime="2026-03-08 19:35:21.779696612 +0000 UTC m=+223.175750635" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.788908 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.790268 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.290253315 +0000 UTC m=+223.686307338 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.802021 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-2p9hf" event={"ID":"9930a19e-2aa9-42ec-91fc-16cd50bc2f40","Type":"ContainerStarted","Data":"b68d8855c3d14efa923fb60bf2796a36fab89cd8d899314e2bd6c4567ba8f37c"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.819447 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" podStartSLOduration=185.819429147 podStartE2EDuration="3m5.819429147s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.807298952 +0000 UTC m=+223.203352975" watchObservedRunningTime="2026-03-08 19:35:21.819429147 +0000 UTC m=+223.215483170" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.837902 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" event={"ID":"6ca246d9-b15a-4163-87dc-84b8bc916c4d","Type":"ContainerStarted","Data":"c7312ccdce297bf1cb5832f17d211b176ed0a46e5e0bf56fb40e793e79e8e0bf"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.851106 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-c6xkj" podStartSLOduration=186.851087305 podStartE2EDuration="3m6.851087305s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.849044801 +0000 UTC m=+223.245098824" watchObservedRunningTime="2026-03-08 19:35:21.851087305 +0000 UTC m=+223.247141328" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.858174 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" event={"ID":"fbfbc2e1-ae98-4c40-a739-877e7296f16a","Type":"ContainerStarted","Data":"fc5985b1de4e52486230835005129f42595df4bf6093a086215a241cef3953f4"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.858221 4885 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" event={"ID":"fbfbc2e1-ae98-4c40-a739-877e7296f16a","Type":"ContainerStarted","Data":"3fbd91dc6c3731018331d8714c4d928bf1e958d81c1651a9ed465e3be3a68538"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.858347 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59140: no serving certificate available for the kubelet" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.858789 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.887213 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" event={"ID":"0bc91e4d-f2d9-494a-bca6-4a55cc82823b","Type":"ContainerStarted","Data":"76bf8744bcfc46e603b924837b1d9d82677d928c50703a871352c5035b8bfe75"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.896859 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqm6w" podStartSLOduration=185.896844002 podStartE2EDuration="3m5.896844002s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.896323249 +0000 UTC m=+223.292377272" watchObservedRunningTime="2026-03-08 19:35:21.896844002 +0000 UTC m=+223.292898025" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.897779 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.899164 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vdqjm" podStartSLOduration=185.899155285 podStartE2EDuration="3m5.899155285s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.874952935 +0000 UTC m=+223.271006958" watchObservedRunningTime="2026-03-08 19:35:21.899155285 +0000 UTC m=+223.295209308" Mar 08 19:35:21 crc kubenswrapper[4885]: E0308 19:35:21.899991 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.399974946 +0000 UTC m=+223.796028959 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.912343 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" event={"ID":"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459","Type":"ContainerStarted","Data":"b829eaa1a4b9abcca4a77046f2fb98ecf6de87c37a123b4fd7929bc57ce5e728"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.912391 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" event={"ID":"6d4b6fc2-2e3a-45f6-bf94-64d9aac9e459","Type":"ContainerStarted","Data":"cc1de70dc4290eb9eb710be70cd796443996f1f016634cca43de0b717ab4d300"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.922783 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" event={"ID":"753974fb-c7b2-4e2b-a62d-22544f357c9b","Type":"ContainerStarted","Data":"e57db9bd3698de6976d16ae2777d5f41258da142e2fb951445691d6b5897d94b"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.922827 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" event={"ID":"753974fb-c7b2-4e2b-a62d-22544f357c9b","Type":"ContainerStarted","Data":"a36061103302c2542520327f3c17973f7b1bfc4519647805cca1f03fe28492d4"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.923646 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.924902 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" event={"ID":"1329795d-a8f9-4896-adba-23c2c0da9261","Type":"ContainerStarted","Data":"2dc1a91346ca1a3f953a586589c1cef9384b7780b322dbba277349a4d5f8d041"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.932017 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4gqrl" podStartSLOduration=185.932002345 podStartE2EDuration="3m5.932002345s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.930384241 +0000 UTC m=+223.326438264" watchObservedRunningTime="2026-03-08 19:35:21.932002345 +0000 UTC m=+223.328056368" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.947032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-mxxgt" event={"ID":"46d0f7c6-3622-4e8a-885a-8f85ac63c36f","Type":"ContainerStarted","Data":"3f9619d94104fc326ea6c648cfb32059e55688e8bf9b2d4729e502fa84e1a431"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.954354 4885 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-h2bp5 container/catalog-operator namespace/openshift-operator-lifecycle-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.954422 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" podUID="753974fb-c7b2-4e2b-a62d-22544f357c9b" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.965966 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jbwsr" event={"ID":"0a7420ef-f20d-4d48-a619-627327de2063","Type":"ContainerStarted","Data":"e18311321d3c0014e1426b869312b2990c8d412d6413e3cd283d62871252ec66"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.966008 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jbwsr" event={"ID":"0a7420ef-f20d-4d48-a619-627327de2063","Type":"ContainerStarted","Data":"ee1100e97dbf89d4e7b1ea5eca189e6341bfe3b6b9ce78930e988e8849b1cc6e"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.967873 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" event={"ID":"83de4c2d-767a-4635-8748-486dd45683a1","Type":"ContainerStarted","Data":"598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.967891 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" event={"ID":"83de4c2d-767a-4635-8748-486dd45683a1","Type":"ContainerStarted","Data":"2ea70402b3dbdea12ac7aa07af023bd9134877c1cdbc64f413fbc103681b29c0"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.968473 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.974065 4885 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ldvgz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.974104 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" podUID="83de4c2d-767a-4635-8748-486dd45683a1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.979780 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59146: no serving certificate available for the kubelet" Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.980435 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s5pkw" event={"ID":"6f932056-01e3-43aa-a91a-7f33d20445ba","Type":"ContainerStarted","Data":"54af98463d5f942570d4ff3ed6c7eea66416d1158fc848c328eb506d2c95787d"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.988848 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" 
event={"ID":"fe3a8c81-8c1d-4b38-9cae-813fb749fd43","Type":"ContainerStarted","Data":"408a83629bea15447a284e212f600dc5a3061c22a8484e4eabb1b71b7b670ef5"} Mar 08 19:35:21 crc kubenswrapper[4885]: I0308 19:35:21.989688 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" podStartSLOduration=185.98966224 podStartE2EDuration="3m5.98966224s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:21.980184467 +0000 UTC m=+223.376238490" watchObservedRunningTime="2026-03-08 19:35:21.98966224 +0000 UTC m=+223.385716263" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.000687 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.001828 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.501813796 +0000 UTC m=+223.897867819 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.034982 4885 patch_prober.go:28] interesting pod/downloads-7954f5f757-t2b7w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.035028 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t2b7w" podUID="5ac2fbf9-c9bb-4ef8-988f-4407e688ad54" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.035501 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" event={"ID":"5a244e04-1aec-4355-89c5-794667b5969f","Type":"ContainerStarted","Data":"a7712e58e3657307c61d5c1c66c3ba86b19138bdd56e7c08c023008232e51c80"} Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.036493 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.049134 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" podStartSLOduration=186.049101004 podStartE2EDuration="3m6.049101004s" 
podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:22.048746335 +0000 UTC m=+223.444800348" watchObservedRunningTime="2026-03-08 19:35:22.049101004 +0000 UTC m=+223.445155027" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.051339 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dmtc7" podStartSLOduration=186.051330293 podStartE2EDuration="3m6.051330293s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:22.019735487 +0000 UTC m=+223.415789500" watchObservedRunningTime="2026-03-08 19:35:22.051330293 +0000 UTC m=+223.447384316" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.066055 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-2tz9t" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.070389 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59156: no serving certificate available for the kubelet" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.075075 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jbwsr" podStartSLOduration=7.075059929 podStartE2EDuration="7.075059929s" podCreationTimestamp="2026-03-08 19:35:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:22.073908079 +0000 UTC m=+223.469962102" watchObservedRunningTime="2026-03-08 19:35:22.075059929 +0000 UTC m=+223.471113952" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.125191 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.128112 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" podStartSLOduration=186.128097171 podStartE2EDuration="3m6.128097171s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:22.124431914 +0000 UTC m=+223.520485937" watchObservedRunningTime="2026-03-08 19:35:22.128097171 +0000 UTC m=+223.524151204" Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.132019 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.632002726 +0000 UTC m=+224.028056749 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.156710 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" podStartSLOduration=187.156676748 podStartE2EDuration="3m7.156676748s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:22.155161597 +0000 UTC m=+223.551215620" watchObservedRunningTime="2026-03-08 19:35:22.156676748 +0000 UTC m=+223.552730771" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.200263 4885 ???:1] "http: TLS handshake error from 192.168.126.11:59164: no serving certificate available for the kubelet" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.201088 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" podStartSLOduration=187.201071577 podStartE2EDuration="3m7.201071577s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:22.200630426 +0000 UTC m=+223.596684449" watchObservedRunningTime="2026-03-08 19:35:22.201071577 +0000 UTC m=+223.597125600" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.233421 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.233839 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.733823446 +0000 UTC m=+224.129877459 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.337640 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.338319 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.838307236 +0000 UTC m=+224.234361259 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.439667 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.440066 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:22.940050694 +0000 UTC m=+224.336104717 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.541469 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.541843 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.041827602 +0000 UTC m=+224.437881625 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.556545 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:22 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:22 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:22 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.556601 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.593213 4885 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-j7xfr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.593486 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" podUID="494bb437-45dd-48e3-b932-9c3645e493ef" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.29:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.642506 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.642637 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.142612074 +0000 UTC m=+224.538666097 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.642803 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.643115 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.143100037 +0000 UTC m=+224.539154060 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.743747 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.743937 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.243898219 +0000 UTC m=+224.639952242 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.744385 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.744697 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.24468917 +0000 UTC m=+224.640743193 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.845686 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.845874 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.345851182 +0000 UTC m=+224.741905205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.846005 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.846284 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.346276343 +0000 UTC m=+224.742330366 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.885004 4885 ???:1] "http: TLS handshake error from 192.168.126.11:39194: no serving certificate available for the kubelet" Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.946893 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.947038 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.447011914 +0000 UTC m=+224.843065937 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:22 crc kubenswrapper[4885]: I0308 19:35:22.947095 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:22 crc kubenswrapper[4885]: E0308 19:35:22.947390 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.447382313 +0000 UTC m=+224.843436336 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.032789 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-k2rwt" podStartSLOduration=187.032771062 podStartE2EDuration="3m7.032771062s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:22.295200111 +0000 UTC m=+223.691254134" watchObservedRunningTime="2026-03-08 19:35:23.032771062 +0000 UTC m=+224.428825085" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.032971 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6lcrf"] Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.033178 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" podUID="4c7583a8-a980-4ab2-a594-bf55ec72c91c" containerName="controller-manager" containerID="cri-o://9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652" gracePeriod=30 Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.047692 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.048060 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.548041532 +0000 UTC m=+224.944095555 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.049487 4885 generic.go:334] "Generic (PLEG): container finished" podID="1329795d-a8f9-4896-adba-23c2c0da9261" containerID="2dc1a91346ca1a3f953a586589c1cef9384b7780b322dbba277349a4d5f8d041" exitCode=0 Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.049635 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" event={"ID":"1329795d-a8f9-4896-adba-23c2c0da9261","Type":"ContainerDied","Data":"2dc1a91346ca1a3f953a586589c1cef9384b7780b322dbba277349a4d5f8d041"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.058938 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-bzvjp" event={"ID":"495803ea-175c-4ad0-ac77-0598ce8213c1","Type":"ContainerStarted","Data":"2c361cdfc1cff12d50ee9ee085504ab66dd55d8189793f12bcc353982149dab2"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.066614 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" event={"ID":"0bc91e4d-f2d9-494a-bca6-4a55cc82823b","Type":"ContainerStarted","Data":"ca39148a9626339987a5aa2ed03d067640ce3ea5cfb1c18cbede6b6f556b0d08"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.070423 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" event={"ID":"3c8bd61f-4965-4410-9ec7-b858a4529287","Type":"ContainerStarted","Data":"4c4e05bc564fcc66f0abbb235f48feb99eaafb896ab2420ec03c6029f7b0803c"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.081836 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" event={"ID":"ace2a8fd-20b4-40b6-a2ce-3e34454b3c71","Type":"ContainerStarted","Data":"b41eebef7022481e1ffd80510952eacb0584b1209a68ef76a2ea9db2aa437f4b"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.088486 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s5pkw" event={"ID":"6f932056-01e3-43aa-a91a-7f33d20445ba","Type":"ContainerStarted","Data":"6092b260bdb36ebe73d894ceb7e9ad1d6a208e6765216dfd3aecb3017986764f"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.088524 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-s5pkw" event={"ID":"6f932056-01e3-43aa-a91a-7f33d20445ba","Type":"ContainerStarted","Data":"598a74675b4618713ba1057715801f585ba07f1c27c5278513b4da0da2659659"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.088695 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.089200 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl"] Mar 08 19:35:23 crc 
kubenswrapper[4885]: I0308 19:35:23.113357 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-hqtdl" podStartSLOduration=187.113340653 podStartE2EDuration="3m7.113340653s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:23.113127747 +0000 UTC m=+224.509181770" watchObservedRunningTime="2026-03-08 19:35:23.113340653 +0000 UTC m=+224.509394676" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.117737 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" event={"ID":"f5b425d2-db8e-45f3-a141-8ac7bd678491","Type":"ContainerStarted","Data":"fb6386a3971604b1cda9ebf974a2be03222ed9f37a2e6f751f3c4d327cd559dc"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.117771 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" event={"ID":"f5b425d2-db8e-45f3-a141-8ac7bd678491","Type":"ContainerStarted","Data":"9c5b6434d50072b4bb4603a93d7f28a8de87f7df4b6ec699653d852a93d711a6"} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.123086 4885 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ldvgz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.123143 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" podUID="83de4c2d-767a-4635-8748-486dd45683a1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.138372 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h2bp5" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.143210 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-j7xfr" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.146515 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.151260 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.152302 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.652286907 +0000 UTC m=+225.048340940 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.164467 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-m7ttr" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.176362 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" podStartSLOduration=187.176345241 podStartE2EDuration="3m7.176345241s" podCreationTimestamp="2026-03-08 19:32:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:23.165122941 +0000 UTC m=+224.561176964" watchObservedRunningTime="2026-03-08 19:35:23.176345241 +0000 UTC m=+224.572399254" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.222789 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-s5pkw" podStartSLOduration=9.222776176 podStartE2EDuration="9.222776176s" podCreationTimestamp="2026-03-08 19:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:23.220644789 +0000 UTC m=+224.616698802" watchObservedRunningTime="2026-03-08 19:35:23.222776176 +0000 UTC m=+224.618830199" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.252442 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.256567 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.756546721 +0000 UTC m=+225.152600744 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.356136 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.356492 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.856481601 +0000 UTC m=+225.252535624 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.369910 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" podStartSLOduration=188.36989172 podStartE2EDuration="3m8.36989172s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:23.335060096 +0000 UTC m=+224.731114119" watchObservedRunningTime="2026-03-08 19:35:23.36989172 +0000 UTC m=+224.765945733" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.459422 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.459964 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-08 19:35:23.959949303 +0000 UTC m=+225.356003316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.557371 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:23 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:23 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:23 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.557608 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.561430 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.561804 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:24.061786054 +0000 UTC m=+225.457840077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.648202 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.662884 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.663389 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-08 19:35:24.163359607 +0000 UTC m=+225.559413630 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.721555 4885 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.766319 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-proxy-ca-bundles\") pod \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.766407 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d98wm\" (UniqueName: \"kubernetes.io/projected/4c7583a8-a980-4ab2-a594-bf55ec72c91c-kube-api-access-d98wm\") pod \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.766426 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c7583a8-a980-4ab2-a594-bf55ec72c91c-serving-cert\") pod \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.766451 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-client-ca\") pod \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.766470 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-config\") pod \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\" (UID: \"4c7583a8-a980-4ab2-a594-bf55ec72c91c\") " Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.766619 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:23 crc kubenswrapper[4885]: E0308 19:35:23.766915 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-08 19:35:24.266904843 +0000 UTC m=+225.662958866 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4xs78" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.767641 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4c7583a8-a980-4ab2-a594-bf55ec72c91c" (UID: "4c7583a8-a980-4ab2-a594-bf55ec72c91c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.768745 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-config" (OuterVolumeSpecName: "config") pod "4c7583a8-a980-4ab2-a594-bf55ec72c91c" (UID: "4c7583a8-a980-4ab2-a594-bf55ec72c91c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.769105 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-client-ca" (OuterVolumeSpecName: "client-ca") pod "4c7583a8-a980-4ab2-a594-bf55ec72c91c" (UID: "4c7583a8-a980-4ab2-a594-bf55ec72c91c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.781730 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c7583a8-a980-4ab2-a594-bf55ec72c91c-kube-api-access-d98wm" (OuterVolumeSpecName: "kube-api-access-d98wm") pod "4c7583a8-a980-4ab2-a594-bf55ec72c91c" (UID: "4c7583a8-a980-4ab2-a594-bf55ec72c91c"). InnerVolumeSpecName "kube-api-access-d98wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.782114 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c7583a8-a980-4ab2-a594-bf55ec72c91c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4c7583a8-a980-4ab2-a594-bf55ec72c91c" (UID: "4c7583a8-a980-4ab2-a594-bf55ec72c91c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.798986 4885 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-08T19:35:23.721580357Z","Handler":null,"Name":""} Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.822090 4885 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.822121 4885 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.867827 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.868112 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d98wm\" (UniqueName: \"kubernetes.io/projected/4c7583a8-a980-4ab2-a594-bf55ec72c91c-kube-api-access-d98wm\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.868125 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c7583a8-a980-4ab2-a594-bf55ec72c91c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.868133 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.868142 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.868151 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c7583a8-a980-4ab2-a594-bf55ec72c91c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.874115 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.969197 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.974174 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 19:35:23 crc kubenswrapper[4885]: I0308 19:35:23.974204 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.021356 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4xs78\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.128040 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.147410 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" event={"ID":"0bc91e4d-f2d9-494a-bca6-4a55cc82823b","Type":"ContainerStarted","Data":"dddedbc40f09532d80f83f22ccff53538b994d272f369aa4c46ad8964b06a01e"} Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.147460 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" event={"ID":"0bc91e4d-f2d9-494a-bca6-4a55cc82823b","Type":"ContainerStarted","Data":"8799e4a06d4c939fdc0004857b25547c74206974d9020f02811fdeb4e0bcc37f"} Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.173136 4885 generic.go:334] "Generic (PLEG): container finished" podID="4c7583a8-a980-4ab2-a594-bf55ec72c91c" containerID="9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652" exitCode=0 Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.173202 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" event={"ID":"4c7583a8-a980-4ab2-a594-bf55ec72c91c","Type":"ContainerDied","Data":"9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652"} Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.173250 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" event={"ID":"4c7583a8-a980-4ab2-a594-bf55ec72c91c","Type":"ContainerDied","Data":"4a34d899e46ec2504a4c18a48ad21ba4015069c3cce16e7b4c57e8e3b93a6b39"} Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.173266 4885 scope.go:117] "RemoveContainer" containerID="9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.173266 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6lcrf" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.174159 4885 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ldvgz container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.174193 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" podUID="83de4c2d-767a-4635-8748-486dd45683a1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.174965 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" podUID="f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" containerName="route-controller-manager" containerID="cri-o://8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e" gracePeriod=30 Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.181637 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.200786 4885 scope.go:117] "RemoveContainer" containerID="9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652" Mar 08 19:35:24 crc kubenswrapper[4885]: E0308 19:35:24.202438 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652\": container with ID starting with 9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652 not found: ID does not exist" containerID="9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.202485 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652"} err="failed to get container status \"9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652\": rpc error: code = NotFound desc = could not find container \"9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652\": container with ID starting with 9f41f0fd1a6e53c6fe0a158a23b3771ef52205446766575fa2e8dec0c8939652 not found: ID does not exist" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.214983 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-tw9z2" podStartSLOduration=10.214964854 podStartE2EDuration="10.214964854s" podCreationTimestamp="2026-03-08 19:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:24.188153545 +0000 UTC m=+225.584207568" watchObservedRunningTime="2026-03-08 19:35:24.214964854 +0000 UTC m=+225.611018877" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.236472 4885 ???:1] "http: TLS handshake error from 192.168.126.11:39208: no serving certificate available for the kubelet" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.248883 4885 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6lcrf"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.256853 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6lcrf"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.330728 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gnjnd"] Mar 08 19:35:24 crc kubenswrapper[4885]: E0308 19:35:24.331154 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c7583a8-a980-4ab2-a594-bf55ec72c91c" containerName="controller-manager" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.331170 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c7583a8-a980-4ab2-a594-bf55ec72c91c" containerName="controller-manager" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.331260 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c7583a8-a980-4ab2-a594-bf55ec72c91c" containerName="controller-manager" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.331963 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.339506 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gnjnd"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.339825 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.487486 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cp8v\" (UniqueName: \"kubernetes.io/projected/2a6b85b3-0bb1-4199-983f-615a6c932f09-kube-api-access-8cp8v\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.487525 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-utilities\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.487554 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-catalog-content\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.516418 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xpctw"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.519409 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.524260 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.527382 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xpctw"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.560447 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:24 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:24 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:24 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.560499 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.589749 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-utilities\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.589816 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-catalog-content\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.589936 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cp8v\" (UniqueName: \"kubernetes.io/projected/2a6b85b3-0bb1-4199-983f-615a6c932f09-kube-api-access-8cp8v\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.589957 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-utilities\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.589977 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs66k\" (UniqueName: \"kubernetes.io/projected/7d8fbc68-3714-4fe4-9f62-857c5dc05661-kube-api-access-gs66k\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.589999 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-catalog-content\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.590582 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-catalog-content\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.591468 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-utilities\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.616912 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cp8v\" (UniqueName: \"kubernetes.io/projected/2a6b85b3-0bb1-4199-983f-615a6c932f09-kube-api-access-8cp8v\") pod \"community-operators-gnjnd\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.666410 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4xs78"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.679796 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.693509 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-catalog-content\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.693671 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs66k\" (UniqueName: \"kubernetes.io/projected/7d8fbc68-3714-4fe4-9f62-857c5dc05661-kube-api-access-gs66k\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.693736 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-utilities\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.694219 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-utilities\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.694507 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-catalog-content\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.706199 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t9fn7"] Mar 08 19:35:24 crc kubenswrapper[4885]: E0308 19:35:24.706382 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1329795d-a8f9-4896-adba-23c2c0da9261" containerName="collect-profiles" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.706392 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1329795d-a8f9-4896-adba-23c2c0da9261" containerName="collect-profiles" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.706486 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1329795d-a8f9-4896-adba-23c2c0da9261" containerName="collect-profiles" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.707168 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.715788 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t9fn7"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.738346 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs66k\" (UniqueName: \"kubernetes.io/projected/7d8fbc68-3714-4fe4-9f62-857c5dc05661-kube-api-access-gs66k\") pod \"certified-operators-xpctw\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.777074 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.795155 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1329795d-a8f9-4896-adba-23c2c0da9261-secret-volume\") pod \"1329795d-a8f9-4896-adba-23c2c0da9261\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.796248 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1329795d-a8f9-4896-adba-23c2c0da9261-config-volume\") pod \"1329795d-a8f9-4896-adba-23c2c0da9261\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.796417 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4fv6\" (UniqueName: \"kubernetes.io/projected/1329795d-a8f9-4896-adba-23c2c0da9261-kube-api-access-k4fv6\") pod \"1329795d-a8f9-4896-adba-23c2c0da9261\" (UID: \"1329795d-a8f9-4896-adba-23c2c0da9261\") " Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.796594 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-catalog-content\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.796627 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-utilities\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.796690 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhfws\" (UniqueName: \"kubernetes.io/projected/038004f7-92de-42b0-8951-447dfdaf2f83-kube-api-access-zhfws\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.797226 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1329795d-a8f9-4896-adba-23c2c0da9261-config-volume" (OuterVolumeSpecName: "config-volume") pod "1329795d-a8f9-4896-adba-23c2c0da9261" (UID: "1329795d-a8f9-4896-adba-23c2c0da9261"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.801850 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1329795d-a8f9-4896-adba-23c2c0da9261-kube-api-access-k4fv6" (OuterVolumeSpecName: "kube-api-access-k4fv6") pod "1329795d-a8f9-4896-adba-23c2c0da9261" (UID: "1329795d-a8f9-4896-adba-23c2c0da9261"). InnerVolumeSpecName "kube-api-access-k4fv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.803278 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.804897 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1329795d-a8f9-4896-adba-23c2c0da9261-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1329795d-a8f9-4896-adba-23c2c0da9261" (UID: "1329795d-a8f9-4896-adba-23c2c0da9261"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.849766 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899068 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-client-ca\") pod \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899112 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-serving-cert\") pod \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899180 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxk9p\" (UniqueName: \"kubernetes.io/projected/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-kube-api-access-lxk9p\") pod \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899202 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-config\") pod \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\" (UID: \"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0\") " Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899364 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-utilities\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899408 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhfws\" (UniqueName: \"kubernetes.io/projected/038004f7-92de-42b0-8951-447dfdaf2f83-kube-api-access-zhfws\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899491 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-catalog-content\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899536 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1329795d-a8f9-4896-adba-23c2c0da9261-secret-volume\") on node \"crc\" DevicePath 
\"\"" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899548 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1329795d-a8f9-4896-adba-23c2c0da9261-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.899557 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4fv6\" (UniqueName: \"kubernetes.io/projected/1329795d-a8f9-4896-adba-23c2c0da9261-kube-api-access-k4fv6\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.900027 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-catalog-content\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.900780 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-config" (OuterVolumeSpecName: "config") pod "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" (UID: "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.901015 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-utilities\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.901464 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-client-ca" (OuterVolumeSpecName: "client-ca") pod "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" (UID: "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.905427 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-kube-api-access-lxk9p" (OuterVolumeSpecName: "kube-api-access-lxk9p") pod "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" (UID: "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0"). InnerVolumeSpecName "kube-api-access-lxk9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.907567 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" (UID: "f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.907635 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-59wjr"] Mar 08 19:35:24 crc kubenswrapper[4885]: E0308 19:35:24.907855 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" containerName="route-controller-manager" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.907871 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" containerName="route-controller-manager" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.907993 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" containerName="route-controller-manager" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.908664 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.915820 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-59wjr"] Mar 08 19:35:24 crc kubenswrapper[4885]: I0308 19:35:24.927609 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhfws\" (UniqueName: \"kubernetes.io/projected/038004f7-92de-42b0-8951-447dfdaf2f83-kube-api-access-zhfws\") pod \"community-operators-t9fn7\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.002098 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-utilities\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.002537 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-catalog-content\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.002630 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdjrg\" (UniqueName: \"kubernetes.io/projected/7346fb7f-6125-49c7-a422-cc169bc7e045-kube-api-access-bdjrg\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.002719 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.002733 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.002745 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxk9p\" (UniqueName: 
\"kubernetes.io/projected/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-kube-api-access-lxk9p\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.002758 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.060580 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.105849 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-catalog-content\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.105907 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdjrg\" (UniqueName: \"kubernetes.io/projected/7346fb7f-6125-49c7-a422-cc169bc7e045-kube-api-access-bdjrg\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.105969 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-utilities\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.106352 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-utilities\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.106390 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xpctw"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.106598 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-catalog-content\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.123409 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdjrg\" (UniqueName: \"kubernetes.io/projected/7346fb7f-6125-49c7-a422-cc169bc7e045-kube-api-access-bdjrg\") pod \"certified-operators-59wjr\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: W0308 19:35:25.135201 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d8fbc68_3714_4fe4_9f62_857c5dc05661.slice/crio-9bf4496f530b593c8f965a319860f693e2e64a4e57a6c4d734640ea6410547bb WatchSource:0}: Error finding container 9bf4496f530b593c8f965a319860f693e2e64a4e57a6c4d734640ea6410547bb: Status 404 returned error can't find the 
container with id 9bf4496f530b593c8f965a319860f693e2e64a4e57a6c4d734640ea6410547bb Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.193787 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" event={"ID":"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1","Type":"ContainerStarted","Data":"f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c"} Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.193834 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" event={"ID":"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1","Type":"ContainerStarted","Data":"d5dd902e3ef717231619c64b1b91e79b07a9f0b3233c92d0567cafca72b99c09"} Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.195098 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.203466 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.204241 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.204296 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv" event={"ID":"1329795d-a8f9-4896-adba-23c2c0da9261","Type":"ContainerDied","Data":"134df6f2646bd48f24f5f5499ba5d32597cef8343a7f0bc84c8f90089d6df4b2"} Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.204329 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="134df6f2646bd48f24f5f5499ba5d32597cef8343a7f0bc84c8f90089d6df4b2" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.204424 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.207441 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6dfb477bd6-857d4"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.208102 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.208307 4885 generic.go:334] "Generic (PLEG): container finished" podID="f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" containerID="8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e" exitCode=0 Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.208367 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" event={"ID":"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0","Type":"ContainerDied","Data":"8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e"} Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.208393 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" event={"ID":"f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0","Type":"ContainerDied","Data":"40c776312848989ccb2eddcff1adbedd8af2200ce48ffdc1e4635ef09792719e"} Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.208410 4885 scope.go:117] "RemoveContainer" containerID="8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.208502 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.211885 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.215071 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.215387 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.215953 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.216226 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.216389 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.222169 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.228487 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.229127 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.231333 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpctw" event={"ID":"7d8fbc68-3714-4fe4-9f62-857c5dc05661","Type":"ContainerStarted","Data":"9bf4496f530b593c8f965a319860f693e2e64a4e57a6c4d734640ea6410547bb"} Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.232258 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" podStartSLOduration=190.232241923 podStartE2EDuration="3m10.232241923s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:25.229483419 +0000 UTC m=+226.625537462" watchObservedRunningTime="2026-03-08 19:35:25.232241923 +0000 UTC m=+226.628295946" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.258372 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dfb477bd6-857d4"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.278811 4885 scope.go:117] "RemoveContainer" containerID="8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e" Mar 08 19:35:25 crc kubenswrapper[4885]: E0308 19:35:25.279556 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e\": container with ID starting with 8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e not found: ID does not exist" containerID="8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.279598 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e"} err="failed to get container status \"8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e\": rpc error: code = NotFound desc = could not find container \"8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e\": container with ID starting with 8ce808f68becc326ec3b2e504608271d01f76084fbb018502d2b63dc01b4a25e not found: ID does not exist" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.291124 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gnjnd"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.297189 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t9fn7"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.303206 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.306134 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bg5wl"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307541 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-client-ca\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307600 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-client-ca\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307634 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d700393-14b8-4abe-b77b-b2bfd718f024-serving-cert\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307700 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f8bff1-96c5-4c44-8e09-8f9785072c99-serving-cert\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307747 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7qs7\" (UniqueName: \"kubernetes.io/projected/26f8bff1-96c5-4c44-8e09-8f9785072c99-kube-api-access-q7qs7\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307766 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krn88\" (UniqueName: \"kubernetes.io/projected/7d700393-14b8-4abe-b77b-b2bfd718f024-kube-api-access-krn88\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307829 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-config\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307864 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-proxy-ca-bundles\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.307950 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-config\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.380142 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c7583a8-a980-4ab2-a594-bf55ec72c91c" path="/var/lib/kubelet/pods/4c7583a8-a980-4ab2-a594-bf55ec72c91c/volumes" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.381013 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.381569 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0" path="/var/lib/kubelet/pods/f39cc9ba-d468-4c1f-b461-aaefbb0ef4c0/volumes" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409075 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-client-ca\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409110 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-client-ca\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409131 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d700393-14b8-4abe-b77b-b2bfd718f024-serving-cert\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409158 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f8bff1-96c5-4c44-8e09-8f9785072c99-serving-cert\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409181 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7qs7\" (UniqueName: \"kubernetes.io/projected/26f8bff1-96c5-4c44-8e09-8f9785072c99-kube-api-access-q7qs7\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409197 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krn88\" (UniqueName: \"kubernetes.io/projected/7d700393-14b8-4abe-b77b-b2bfd718f024-kube-api-access-krn88\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " 
pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409217 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-config\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409235 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-proxy-ca-bundles\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.409268 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-config\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.410501 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-client-ca\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.411262 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-config\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.411512 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-config\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.411576 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-proxy-ca-bundles\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.411903 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-client-ca\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.415890 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7d700393-14b8-4abe-b77b-b2bfd718f024-serving-cert\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.416181 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f8bff1-96c5-4c44-8e09-8f9785072c99-serving-cert\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.428361 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krn88\" (UniqueName: \"kubernetes.io/projected/7d700393-14b8-4abe-b77b-b2bfd718f024-kube-api-access-krn88\") pod \"controller-manager-6dfb477bd6-857d4\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.431307 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7qs7\" (UniqueName: \"kubernetes.io/projected/26f8bff1-96c5-4c44-8e09-8f9785072c99-kube-api-access-q7qs7\") pod \"route-controller-manager-5b7fc57755-9b4xh\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.556326 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:25 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:25 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:25 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.556620 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.559316 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.561831 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.671015 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-59wjr"] Mar 08 19:35:25 crc kubenswrapper[4885]: I0308 19:35:25.794418 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh"] Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.091408 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dfb477bd6-857d4"] Mar 08 19:35:26 crc kubenswrapper[4885]: W0308 19:35:26.105789 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d700393_14b8_4abe_b77b_b2bfd718f024.slice/crio-f4b7c93c9aa4f548e3748b7d04e19d0043328272783835d3fd64c609b4f368bf WatchSource:0}: Error finding container f4b7c93c9aa4f548e3748b7d04e19d0043328272783835d3fd64c609b4f368bf: Status 404 returned error can't find the container with id f4b7c93c9aa4f548e3748b7d04e19d0043328272783835d3fd64c609b4f368bf Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.240452 4885 generic.go:334] "Generic (PLEG): container finished" podID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerID="87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54" exitCode=0 Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.240523 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpctw" event={"ID":"7d8fbc68-3714-4fe4-9f62-857c5dc05661","Type":"ContainerDied","Data":"87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.242351 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" event={"ID":"7d700393-14b8-4abe-b77b-b2bfd718f024","Type":"ContainerStarted","Data":"f4b7c93c9aa4f548e3748b7d04e19d0043328272783835d3fd64c609b4f368bf"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.244485 4885 generic.go:334] "Generic (PLEG): container finished" podID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerID="3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd" exitCode=0 Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.244545 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59wjr" event={"ID":"7346fb7f-6125-49c7-a422-cc169bc7e045","Type":"ContainerDied","Data":"3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.244565 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59wjr" event={"ID":"7346fb7f-6125-49c7-a422-cc169bc7e045","Type":"ContainerStarted","Data":"e265209fea1ffc2ce0afb5176cc04ccaf2989324d8da3b88689366337825e2af"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.252071 4885 generic.go:334] "Generic (PLEG): container finished" podID="038004f7-92de-42b0-8951-447dfdaf2f83" containerID="3a239ba546d78b237c7c4654423d382cd43547556be78002c1edfa9e41f28b19" exitCode=0 Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.252156 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9fn7" 
event={"ID":"038004f7-92de-42b0-8951-447dfdaf2f83","Type":"ContainerDied","Data":"3a239ba546d78b237c7c4654423d382cd43547556be78002c1edfa9e41f28b19"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.252199 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9fn7" event={"ID":"038004f7-92de-42b0-8951-447dfdaf2f83","Type":"ContainerStarted","Data":"75e0661a1b23d764b6ba36fa438a7e6bc5398d8d64dd93e640a640cee2d85a8d"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.256168 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" event={"ID":"26f8bff1-96c5-4c44-8e09-8f9785072c99","Type":"ContainerStarted","Data":"c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.256200 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" event={"ID":"26f8bff1-96c5-4c44-8e09-8f9785072c99","Type":"ContainerStarted","Data":"bfeb46af4657b2486ffad869effd2392e9613e3a9f1f17a4e0c902d67a6fb26b"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.256403 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.273304 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" podStartSLOduration=3.27328499 podStartE2EDuration="3.27328499s" podCreationTimestamp="2026-03-08 19:35:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:26.272785506 +0000 UTC m=+227.668839529" watchObservedRunningTime="2026-03-08 19:35:26.27328499 +0000 UTC m=+227.669339023" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.277029 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnjnd" event={"ID":"2a6b85b3-0bb1-4199-983f-615a6c932f09","Type":"ContainerDied","Data":"c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.277243 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerID="c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f" exitCode=0 Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.277319 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnjnd" event={"ID":"2a6b85b3-0bb1-4199-983f-615a6c932f09","Type":"ContainerStarted","Data":"de0e60604d3aa86bafd041642af24e9211dfd9322182b13ae9a6b56c608e4e2c"} Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.474424 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.507242 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-62xgk"] Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.508204 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.510099 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.544503 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-62xgk"] Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.555959 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:26 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:26 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:26 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.556246 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.624503 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st8f9\" (UniqueName: \"kubernetes.io/projected/05666e0b-c4ce-451a-ba67-ddb78866ef54-kube-api-access-st8f9\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.624556 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-utilities\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.624593 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-catalog-content\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.725406 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-utilities\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.725486 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-catalog-content\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.725585 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st8f9\" (UniqueName: 
\"kubernetes.io/projected/05666e0b-c4ce-451a-ba67-ddb78866ef54-kube-api-access-st8f9\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.726206 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-catalog-content\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.726238 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-utilities\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.732751 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.733604 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.736633 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.736941 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.748113 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.768172 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-st8f9\" (UniqueName: \"kubernetes.io/projected/05666e0b-c4ce-451a-ba67-ddb78866ef54-kube-api-access-st8f9\") pod \"redhat-marketplace-62xgk\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.821910 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.827123 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.827167 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.833690 4885 ???:1] "http: TLS handshake error from 192.168.126.11:39224: no serving certificate available for the kubelet" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.904400 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cptvd"] Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.905821 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.917078 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cptvd"] Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.929425 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.929465 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.929534 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:26 crc kubenswrapper[4885]: I0308 19:35:26.971169 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.030830 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-catalog-content\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" 
Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.031016 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn2mq\" (UniqueName: \"kubernetes.io/projected/b30ce2c5-2b53-47aa-8470-394dd0d6256a-kube-api-access-jn2mq\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.031042 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-utilities\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.056306 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.120329 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.120616 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.121764 4885 patch_prober.go:28] interesting pod/console-f9d7485db-hsdmw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.121814 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hsdmw" podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.132121 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn2mq\" (UniqueName: \"kubernetes.io/projected/b30ce2c5-2b53-47aa-8470-394dd0d6256a-kube-api-access-jn2mq\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.132196 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-utilities\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.132246 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-catalog-content\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.132977 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-catalog-content\") pod \"redhat-marketplace-cptvd\" (UID: 
\"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.133309 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-utilities\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.152998 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn2mq\" (UniqueName: \"kubernetes.io/projected/b30ce2c5-2b53-47aa-8470-394dd0d6256a-kube-api-access-jn2mq\") pod \"redhat-marketplace-cptvd\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.211776 4885 patch_prober.go:28] interesting pod/downloads-7954f5f757-t2b7w container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.211830 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-t2b7w" podUID="5ac2fbf9-c9bb-4ef8-988f-4407e688ad54" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.211864 4885 patch_prober.go:28] interesting pod/downloads-7954f5f757-t2b7w container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" start-of-body= Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.211929 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-t2b7w" podUID="5ac2fbf9-c9bb-4ef8-988f-4407e688ad54" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.33:8080/\": dial tcp 10.217.0.33:8080: connect: connection refused" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.223040 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.244355 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.244540 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.253858 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.291912 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" event={"ID":"7d700393-14b8-4abe-b77b-b2bfd718f024","Type":"ContainerStarted","Data":"a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16"} Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.293759 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.298414 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.302345 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-fq2fp" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.312310 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" podStartSLOduration=4.312289752 podStartE2EDuration="4.312289752s" podCreationTimestamp="2026-03-08 19:35:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:27.308202533 +0000 UTC m=+228.704256566" watchObservedRunningTime="2026-03-08 19:35:27.312289752 +0000 UTC m=+228.708343765" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.317090 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.317143 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.331671 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.509748 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-prdq9"] Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.515121 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.521509 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.549956 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prdq9"] Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.560668 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.564548 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:27 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:27 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:27 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.564592 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.646308 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5wpq\" (UniqueName: \"kubernetes.io/projected/56c146b0-3448-4140-8cf0-8d637f7f22a9-kube-api-access-j5wpq\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.646500 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-utilities\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.646614 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-catalog-content\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.747712 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5wpq\" (UniqueName: \"kubernetes.io/projected/56c146b0-3448-4140-8cf0-8d637f7f22a9-kube-api-access-j5wpq\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.747774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-utilities\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.747799 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-catalog-content\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.748309 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-utilities\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.748337 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-catalog-content\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.782496 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5wpq\" (UniqueName: \"kubernetes.io/projected/56c146b0-3448-4140-8cf0-8d637f7f22a9-kube-api-access-j5wpq\") pod \"redhat-operators-prdq9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.803597 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.804219 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.808648 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.811223 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.815743 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.843044 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.849679 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.849747 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.903627 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pqxt7"] Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.904556 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.912276 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pqxt7"] Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.951114 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.951185 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-utilities\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.951228 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.951353 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.951420 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-catalog-content\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.951507 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x76vh\" (UniqueName: \"kubernetes.io/projected/8881ba5e-d9d1-42a9-98af-849e72053757-kube-api-access-x76vh\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:27 crc kubenswrapper[4885]: I0308 19:35:27.983879 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.026791 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.052528 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-utilities\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.052609 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-catalog-content\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.052641 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x76vh\" (UniqueName: \"kubernetes.io/projected/8881ba5e-d9d1-42a9-98af-849e72053757-kube-api-access-x76vh\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.053126 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-utilities\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.053215 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-catalog-content\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.068829 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x76vh\" (UniqueName: \"kubernetes.io/projected/8881ba5e-d9d1-42a9-98af-849e72053757-kube-api-access-x76vh\") pod \"redhat-operators-pqxt7\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.124304 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.223845 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.305892 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-spqp8" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.557808 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:28 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:28 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:28 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.557850 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:28 crc kubenswrapper[4885]: I0308 19:35:28.759981 4885 ???:1] "http: TLS handshake error from 192.168.126.11:39236: no serving certificate available for the kubelet" Mar 08 19:35:29 crc kubenswrapper[4885]: I0308 19:35:29.555381 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:29 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:29 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:29 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:29 crc kubenswrapper[4885]: I0308 19:35:29.555437 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:30 crc kubenswrapper[4885]: I0308 19:35:30.555181 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:30 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:30 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:30 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:30 crc kubenswrapper[4885]: I0308 19:35:30.555257 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:31 crc kubenswrapper[4885]: I0308 19:35:31.556503 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:31 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:31 crc 
kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:31 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:31 crc kubenswrapper[4885]: I0308 19:35:31.556796 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:31 crc kubenswrapper[4885]: I0308 19:35:31.988346 4885 ???:1] "http: TLS handshake error from 192.168.126.11:39246: no serving certificate available for the kubelet" Mar 08 19:35:32 crc kubenswrapper[4885]: I0308 19:35:32.651622 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:32 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:32 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:32 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:32 crc kubenswrapper[4885]: I0308 19:35:32.651695 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:32 crc kubenswrapper[4885]: I0308 19:35:32.829026 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:35:32 crc kubenswrapper[4885]: I0308 19:35:32.829114 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:35:32 crc kubenswrapper[4885]: I0308 19:35:32.861762 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:35:33 crc kubenswrapper[4885]: I0308 19:35:33.051151 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-s5pkw" Mar 08 19:35:33 crc kubenswrapper[4885]: I0308 19:35:33.664120 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:33 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:33 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:33 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:33 crc kubenswrapper[4885]: I0308 19:35:33.664185 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.287590 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pqxt7"] Mar 08 19:35:34 
crc kubenswrapper[4885]: I0308 19:35:34.349571 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqxt7" event={"ID":"8881ba5e-d9d1-42a9-98af-849e72053757","Type":"ContainerStarted","Data":"789329547b46208ee5dc38fb335a56f42b713329e6d11c7a30e5d4042c3f9ea3"} Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.361177 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prdq9"] Mar 08 19:35:34 crc kubenswrapper[4885]: W0308 19:35:34.377578 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-2d0ba41af79384822819519e78d3dc7f4370f70eb9f77bb319b3453eb2e2a641 WatchSource:0}: Error finding container 2d0ba41af79384822819519e78d3dc7f4370f70eb9f77bb319b3453eb2e2a641: Status 404 returned error can't find the container with id 2d0ba41af79384822819519e78d3dc7f4370f70eb9f77bb319b3453eb2e2a641 Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.558419 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:34 crc kubenswrapper[4885]: [-]has-synced failed: reason withheld Mar 08 19:35:34 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:34 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.558463 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.588169 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.594525 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cptvd"] Mar 08 19:35:34 crc kubenswrapper[4885]: W0308 19:35:34.595903 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0d3903bd_e9e1_4d1e_a03b_886542a8f32c.slice/crio-b1bf8ba4cba738dec3a3f34c5ac73eaf8187f787c6b76bfe6996ae90d21c77b4 WatchSource:0}: Error finding container b1bf8ba4cba738dec3a3f34c5ac73eaf8187f787c6b76bfe6996ae90d21c77b4: Status 404 returned error can't find the container with id b1bf8ba4cba738dec3a3f34c5ac73eaf8187f787c6b76bfe6996ae90d21c77b4 Mar 08 19:35:34 crc kubenswrapper[4885]: W0308 19:35:34.599960 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb30ce2c5_2b53_47aa_8470_394dd0d6256a.slice/crio-abebee8a42d402151f81519cf2493ac06e94550afb46b420c76c20ec60907798 WatchSource:0}: Error finding container abebee8a42d402151f81519cf2493ac06e94550afb46b420c76c20ec60907798: Status 404 returned error can't find the container with id abebee8a42d402151f81519cf2493ac06e94550afb46b420c76c20ec60907798 Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.642409 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 08 19:35:34 crc kubenswrapper[4885]: I0308 19:35:34.657754 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-62xgk"] Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.679602 4885 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-ftkzn container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.680088 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" podUID="5a244e04-1aec-4355-89c5-794667b5969f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.680839 4885 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-ftkzn container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.680892 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ftkzn" podUID="5a244e04-1aec-4355-89c5-794667b5969f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.806836 4885 patch_prober.go:28] interesting pod/controller-manager-6dfb477bd6-857d4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.806899 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" podUID="7d700393-14b8-4abe-b77b-b2bfd718f024" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.840543 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.840591 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.840810 4885 patch_prober.go:28] interesting pod/route-controller-manager-5b7fc57755-9b4xh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.49:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.840886 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" podUID="26f8bff1-96c5-4c44-8e09-8f9785072c99" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 19:35:44 crc kubenswrapper[4885]: E0308 19:35:44.845139 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage4258948150/2\": happened during read: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 08 19:35:44 crc kubenswrapper[4885]: E0308 19:35:44.845254 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bdjrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-59wjr_openshift-marketplace(7346fb7f-6125-49c7-a422-cc169bc7e045): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage4258948150/2\": happened during read: context canceled" logger="UnhandledError" Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.846224 4885 patch_prober.go:28] interesting pod/router-default-5444994796-lvfcn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 08 19:35:44 crc kubenswrapper[4885]: [+]has-synced ok Mar 08 19:35:44 crc kubenswrapper[4885]: [+]process-running ok Mar 08 19:35:44 crc kubenswrapper[4885]: healthz check failed Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.846672 4885 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-lvfcn" podUID="e58e5e9a-de88-4209-8100-e9d4e415e68d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:35:44 crc kubenswrapper[4885]: E0308 19:35:44.846434 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage4258948150/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/certified-operators-59wjr" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" Mar 08 19:35:44 crc kubenswrapper[4885]: E0308 19:35:44.846872 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3548377267/3\": happened during read: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 08 19:35:44 crc kubenswrapper[4885]: E0308 19:35:44.847124 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8cp8v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-gnjnd_openshift-marketplace(2a6b85b3-0bb1-4199-983f-615a6c932f09): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3548377267/3\": happened during read: context canceled" logger="UnhandledError" Mar 08 19:35:44 crc kubenswrapper[4885]: E0308 19:35:44.849341 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage3548377267/3\\\": happened during read: context canceled\"" 
pod="openshift-marketplace/community-operators-gnjnd" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.869983 4885 patch_prober.go:28] interesting pod/console-f9d7485db-hsdmw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.870214 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hsdmw" podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 08 19:35:44 crc kubenswrapper[4885]: E0308 19:35:44.876152 4885 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="9.385s" Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.876284 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0d3903bd-e9e1-4d1e-a03b-886542a8f32c","Type":"ContainerStarted","Data":"b1bf8ba4cba738dec3a3f34c5ac73eaf8187f787c6b76bfe6996ae90d21c77b4"} Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.908318 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.908574 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cptvd" event={"ID":"b30ce2c5-2b53-47aa-8470-394dd0d6256a","Type":"ContainerStarted","Data":"abebee8a42d402151f81519cf2493ac06e94550afb46b420c76c20ec60907798"} Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.910760 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-t2b7w" Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.920572 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdq9" event={"ID":"56c146b0-3448-4140-8cf0-8d637f7f22a9","Type":"ContainerStarted","Data":"2d0ba41af79384822819519e78d3dc7f4370f70eb9f77bb319b3453eb2e2a641"} Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.926741 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dfb477bd6-857d4"] Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.926951 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" podUID="7d700393-14b8-4abe-b77b-b2bfd718f024" containerName="controller-manager" containerID="cri-o://a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16" gracePeriod=30 Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.930279 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh"] Mar 08 19:35:44 crc kubenswrapper[4885]: I0308 19:35:44.930421 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" podUID="26f8bff1-96c5-4c44-8e09-8f9785072c99" containerName="route-controller-manager" containerID="cri-o://c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534" gracePeriod=30 Mar 08 19:35:45 crc 
kubenswrapper[4885]: I0308 19:35:45.426110 4885 csr.go:261] certificate signing request csr-27k6w is approved, waiting to be issued Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.436047 4885 csr.go:257] certificate signing request csr-27k6w is issued Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.462253 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.466785 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.557815 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.560320 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-lvfcn" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627428 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-config\") pod \"26f8bff1-96c5-4c44-8e09-8f9785072c99\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627477 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f8bff1-96c5-4c44-8e09-8f9785072c99-serving-cert\") pod \"26f8bff1-96c5-4c44-8e09-8f9785072c99\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627660 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-config\") pod \"7d700393-14b8-4abe-b77b-b2bfd718f024\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627721 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-proxy-ca-bundles\") pod \"7d700393-14b8-4abe-b77b-b2bfd718f024\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627741 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-client-ca\") pod \"26f8bff1-96c5-4c44-8e09-8f9785072c99\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627758 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-client-ca\") pod \"7d700393-14b8-4abe-b77b-b2bfd718f024\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627784 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krn88\" (UniqueName: \"kubernetes.io/projected/7d700393-14b8-4abe-b77b-b2bfd718f024-kube-api-access-krn88\") pod \"7d700393-14b8-4abe-b77b-b2bfd718f024\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " Mar 08 19:35:45 crc 
kubenswrapper[4885]: I0308 19:35:45.627809 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d700393-14b8-4abe-b77b-b2bfd718f024-serving-cert\") pod \"7d700393-14b8-4abe-b77b-b2bfd718f024\" (UID: \"7d700393-14b8-4abe-b77b-b2bfd718f024\") " Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.627839 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7qs7\" (UniqueName: \"kubernetes.io/projected/26f8bff1-96c5-4c44-8e09-8f9785072c99-kube-api-access-q7qs7\") pod \"26f8bff1-96c5-4c44-8e09-8f9785072c99\" (UID: \"26f8bff1-96c5-4c44-8e09-8f9785072c99\") " Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.628824 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7d700393-14b8-4abe-b77b-b2bfd718f024" (UID: "7d700393-14b8-4abe-b77b-b2bfd718f024"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.628839 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-client-ca" (OuterVolumeSpecName: "client-ca") pod "26f8bff1-96c5-4c44-8e09-8f9785072c99" (UID: "26f8bff1-96c5-4c44-8e09-8f9785072c99"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.628880 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-config" (OuterVolumeSpecName: "config") pod "7d700393-14b8-4abe-b77b-b2bfd718f024" (UID: "7d700393-14b8-4abe-b77b-b2bfd718f024"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.629108 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-client-ca" (OuterVolumeSpecName: "client-ca") pod "7d700393-14b8-4abe-b77b-b2bfd718f024" (UID: "7d700393-14b8-4abe-b77b-b2bfd718f024"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.629501 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-config" (OuterVolumeSpecName: "config") pod "26f8bff1-96c5-4c44-8e09-8f9785072c99" (UID: "26f8bff1-96c5-4c44-8e09-8f9785072c99"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.634870 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d700393-14b8-4abe-b77b-b2bfd718f024-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7d700393-14b8-4abe-b77b-b2bfd718f024" (UID: "7d700393-14b8-4abe-b77b-b2bfd718f024"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.637610 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26f8bff1-96c5-4c44-8e09-8f9785072c99-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "26f8bff1-96c5-4c44-8e09-8f9785072c99" (UID: "26f8bff1-96c5-4c44-8e09-8f9785072c99"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.639162 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26f8bff1-96c5-4c44-8e09-8f9785072c99-kube-api-access-q7qs7" (OuterVolumeSpecName: "kube-api-access-q7qs7") pod "26f8bff1-96c5-4c44-8e09-8f9785072c99" (UID: "26f8bff1-96c5-4c44-8e09-8f9785072c99"). InnerVolumeSpecName "kube-api-access-q7qs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.644289 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d700393-14b8-4abe-b77b-b2bfd718f024-kube-api-access-krn88" (OuterVolumeSpecName: "kube-api-access-krn88") pod "7d700393-14b8-4abe-b77b-b2bfd718f024" (UID: "7d700393-14b8-4abe-b77b-b2bfd718f024"). InnerVolumeSpecName "kube-api-access-krn88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730377 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730429 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730451 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730468 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7d700393-14b8-4abe-b77b-b2bfd718f024-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730486 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krn88\" (UniqueName: \"kubernetes.io/projected/7d700393-14b8-4abe-b77b-b2bfd718f024-kube-api-access-krn88\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730502 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d700393-14b8-4abe-b77b-b2bfd718f024-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730519 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7qs7\" (UniqueName: \"kubernetes.io/projected/26f8bff1-96c5-4c44-8e09-8f9785072c99-kube-api-access-q7qs7\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.730536 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/26f8bff1-96c5-4c44-8e09-8f9785072c99-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:45 crc 
kubenswrapper[4885]: I0308 19:35:45.730551 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/26f8bff1-96c5-4c44-8e09-8f9785072c99-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.928664 4885 generic.go:334] "Generic (PLEG): container finished" podID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerID="6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076" exitCode=0 Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.928743 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdq9" event={"ID":"56c146b0-3448-4140-8cf0-8d637f7f22a9","Type":"ContainerDied","Data":"6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076"} Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.931136 4885 generic.go:334] "Generic (PLEG): container finished" podID="e7fdc7e5-2196-4f4f-84d4-dfc82848cb90" containerID="0ee19fdd420f7949dd0bf49daa93e801fc425de54a37700ece1f21c1a92a4055" exitCode=0 Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.931183 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90","Type":"ContainerDied","Data":"0ee19fdd420f7949dd0bf49daa93e801fc425de54a37700ece1f21c1a92a4055"} Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.931203 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90","Type":"ContainerStarted","Data":"464a16507c5cff61fcfdb5c5e82e0d8d5f3181505247b17da9c83a9f55dd19dc"} Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.934088 4885 generic.go:334] "Generic (PLEG): container finished" podID="7d700393-14b8-4abe-b77b-b2bfd718f024" containerID="a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16" exitCode=0 Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.934141 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" event={"ID":"7d700393-14b8-4abe-b77b-b2bfd718f024","Type":"ContainerDied","Data":"a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16"} Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.934158 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" event={"ID":"7d700393-14b8-4abe-b77b-b2bfd718f024","Type":"ContainerDied","Data":"f4b7c93c9aa4f548e3748b7d04e19d0043328272783835d3fd64c609b4f368bf"} Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.934174 4885 scope.go:117] "RemoveContainer" containerID="a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.934324 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dfb477bd6-857d4" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.936895 4885 generic.go:334] "Generic (PLEG): container finished" podID="8881ba5e-d9d1-42a9-98af-849e72053757" containerID="3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2" exitCode=0 Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.937050 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqxt7" event={"ID":"8881ba5e-d9d1-42a9-98af-849e72053757","Type":"ContainerDied","Data":"3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2"} Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.939063 4885 generic.go:334] "Generic (PLEG): container finished" podID="26f8bff1-96c5-4c44-8e09-8f9785072c99" containerID="c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534" exitCode=0 Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.939187 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" event={"ID":"26f8bff1-96c5-4c44-8e09-8f9785072c99","Type":"ContainerDied","Data":"c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534"} Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.939236 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" event={"ID":"26f8bff1-96c5-4c44-8e09-8f9785072c99","Type":"ContainerDied","Data":"bfeb46af4657b2486ffad869effd2392e9613e3a9f1f17a4e0c902d67a6fb26b"} Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.939356 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh" Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.943087 4885 generic.go:334] "Generic (PLEG): container finished" podID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerID="539ba9557c56dcf3cdbab11c5e667581fd8e8b4b7a9df312373694f7ca85489f" exitCode=0 Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.943157 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cptvd" event={"ID":"b30ce2c5-2b53-47aa-8470-394dd0d6256a","Type":"ContainerDied","Data":"539ba9557c56dcf3cdbab11c5e667581fd8e8b4b7a9df312373694f7ca85489f"} Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.945834 4885 generic.go:334] "Generic (PLEG): container finished" podID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerID="b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd" exitCode=0 Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.946044 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62xgk" event={"ID":"05666e0b-c4ce-451a-ba67-ddb78866ef54","Type":"ContainerDied","Data":"b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd"} Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.946094 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62xgk" event={"ID":"05666e0b-c4ce-451a-ba67-ddb78866ef54","Type":"ContainerStarted","Data":"731327ad0ac3fd48c5dcf825c4aabc506f0114149e811eabdfb465d917e7e122"} Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.948324 4885 generic.go:334] "Generic (PLEG): container finished" podID="60da1edb-8474-4368-a6ae-0bb2b1b7b845" 
containerID="67a173c7d23f4e826672d366138f3bcda3d03e275f18ddffd328f814fe2d0924" exitCode=0 Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.948392 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549974-jjqkh" event={"ID":"60da1edb-8474-4368-a6ae-0bb2b1b7b845","Type":"ContainerDied","Data":"67a173c7d23f4e826672d366138f3bcda3d03e275f18ddffd328f814fe2d0924"} Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.952045 4885 generic.go:334] "Generic (PLEG): container finished" podID="0d3903bd-e9e1-4d1e-a03b-886542a8f32c" containerID="982a06c5fb1623eb2d10a88c4dd26658f16bff03da78f1ab525cd051e22aaa20" exitCode=0 Mar 08 19:35:45 crc kubenswrapper[4885]: I0308 19:35:45.952205 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0d3903bd-e9e1-4d1e-a03b-886542a8f32c","Type":"ContainerDied","Data":"982a06c5fb1623eb2d10a88c4dd26658f16bff03da78f1ab525cd051e22aaa20"} Mar 08 19:35:45 crc kubenswrapper[4885]: E0308 19:35:45.956944 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gnjnd" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" Mar 08 19:35:45 crc kubenswrapper[4885]: E0308 19:35:45.958293 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-59wjr" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.024704 4885 scope.go:117] "RemoveContainer" containerID="a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16" Mar 08 19:35:46 crc kubenswrapper[4885]: E0308 19:35:46.025249 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16\": container with ID starting with a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16 not found: ID does not exist" containerID="a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.025307 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16"} err="failed to get container status \"a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16\": rpc error: code = NotFound desc = could not find container \"a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16\": container with ID starting with a2f1f19894bbd7d28c72e4129d9358ba63bafbfb3bc8f30b6f824914620ccd16 not found: ID does not exist" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.025328 4885 scope.go:117] "RemoveContainer" containerID="c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.047676 4885 scope.go:117] "RemoveContainer" containerID="c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534" Mar 08 19:35:46 crc kubenswrapper[4885]: E0308 19:35:46.048154 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534\": container with ID starting with c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534 not found: ID does not exist" containerID="c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.048226 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534"} err="failed to get container status \"c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534\": rpc error: code = NotFound desc = could not find container \"c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534\": container with ID starting with c12967d0b6aa99bbff89379bbb56292b3fe954155f50d98f04cc5bdc2484e534 not found: ID does not exist" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.139053 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh"] Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.142476 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7fc57755-9b4xh"] Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.151704 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dfb477bd6-857d4"] Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.157105 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6dfb477bd6-857d4"] Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.438054 4885 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-16 17:37:17.831812975 +0000 UTC Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.438094 4885 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7534h1m31.393721884s for next certificate rotation Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.494602 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr"] Mar 08 19:35:46 crc kubenswrapper[4885]: E0308 19:35:46.495052 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26f8bff1-96c5-4c44-8e09-8f9785072c99" containerName="route-controller-manager" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.495075 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="26f8bff1-96c5-4c44-8e09-8f9785072c99" containerName="route-controller-manager" Mar 08 19:35:46 crc kubenswrapper[4885]: E0308 19:35:46.495108 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d700393-14b8-4abe-b77b-b2bfd718f024" containerName="controller-manager" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.495122 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d700393-14b8-4abe-b77b-b2bfd718f024" containerName="controller-manager" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.495314 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="26f8bff1-96c5-4c44-8e09-8f9785072c99" containerName="route-controller-manager" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.495336 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d700393-14b8-4abe-b77b-b2bfd718f024" containerName="controller-manager" Mar 08 19:35:46 crc 
kubenswrapper[4885]: I0308 19:35:46.496111 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.499826 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.499875 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.500298 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.500576 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.500862 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.504469 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.513362 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b4ff869f4-px66j"] Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.521038 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.528898 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr"] Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.529160 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.530914 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.531331 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.531971 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.532152 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.532060 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.542601 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b4ff869f4-px66j"] Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.553601 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644533 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-client-ca\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644623 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-config\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644660 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-proxy-ca-bundles\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644687 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-client-ca\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644714 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm58j\" (UniqueName: \"kubernetes.io/projected/0d5c0c11-15e0-47ed-817c-939899828e1e-kube-api-access-cm58j\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644741 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-config\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644772 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab31615e-dc70-40ba-9b47-6a3f119e91d9-serving-cert\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644800 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d5c0c11-15e0-47ed-817c-939899828e1e-serving-cert\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.644835 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c67vv\" (UniqueName: \"kubernetes.io/projected/ab31615e-dc70-40ba-9b47-6a3f119e91d9-kube-api-access-c67vv\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.745573 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-client-ca\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.745691 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-config\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.745741 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-proxy-ca-bundles\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.745777 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-client-ca\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.745830 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm58j\" (UniqueName: \"kubernetes.io/projected/0d5c0c11-15e0-47ed-817c-939899828e1e-kube-api-access-cm58j\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.745875 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-config\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.745984 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab31615e-dc70-40ba-9b47-6a3f119e91d9-serving-cert\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.746035 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0d5c0c11-15e0-47ed-817c-939899828e1e-serving-cert\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.746095 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c67vv\" (UniqueName: \"kubernetes.io/projected/ab31615e-dc70-40ba-9b47-6a3f119e91d9-kube-api-access-c67vv\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.747589 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-client-ca\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.748811 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-client-ca\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.748994 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-config\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.750334 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-proxy-ca-bundles\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.751616 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-config\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.756445 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab31615e-dc70-40ba-9b47-6a3f119e91d9-serving-cert\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.756456 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d5c0c11-15e0-47ed-817c-939899828e1e-serving-cert\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " 
pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.771856 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm58j\" (UniqueName: \"kubernetes.io/projected/0d5c0c11-15e0-47ed-817c-939899828e1e-kube-api-access-cm58j\") pod \"controller-manager-7b4ff869f4-px66j\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.774531 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c67vv\" (UniqueName: \"kubernetes.io/projected/ab31615e-dc70-40ba-9b47-6a3f119e91d9-kube-api-access-c67vv\") pod \"route-controller-manager-57b7c8cb86-7xrpr\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.831673 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:46 crc kubenswrapper[4885]: I0308 19:35:46.852136 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.082989 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b4ff869f4-px66j"] Mar 08 19:35:47 crc kubenswrapper[4885]: W0308 19:35:47.101600 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d5c0c11_15e0_47ed_817c_939899828e1e.slice/crio-aa200f5debabe8fc7be4d573b21b524afcb2289d907cd50fd947f48edbba3041 WatchSource:0}: Error finding container aa200f5debabe8fc7be4d573b21b524afcb2289d907cd50fd947f48edbba3041: Status 404 returned error can't find the container with id aa200f5debabe8fc7be4d573b21b524afcb2289d907cd50fd947f48edbba3041 Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.121186 4885 patch_prober.go:28] interesting pod/console-f9d7485db-hsdmw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.121244 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hsdmw" podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerName="console" probeResult="failure" output="Get \"https://10.217.0.10:8443/health\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.243418 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.249196 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549974-jjqkh" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.252434 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.328490 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr"] Mar 08 19:35:47 crc kubenswrapper[4885]: W0308 19:35:47.336479 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab31615e_dc70_40ba_9b47_6a3f119e91d9.slice/crio-8dc334277d583f00a5dd853f83e99a7d51476aa8a86ed90046e854475f10a759 WatchSource:0}: Error finding container 8dc334277d583f00a5dd853f83e99a7d51476aa8a86ed90046e854475f10a759: Status 404 returned error can't find the container with id 8dc334277d583f00a5dd853f83e99a7d51476aa8a86ed90046e854475f10a759 Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.360754 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kube-api-access\") pod \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.360852 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kube-api-access\") pod \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.360883 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qtctx\" (UniqueName: \"kubernetes.io/projected/60da1edb-8474-4368-a6ae-0bb2b1b7b845-kube-api-access-qtctx\") pod \"60da1edb-8474-4368-a6ae-0bb2b1b7b845\" (UID: \"60da1edb-8474-4368-a6ae-0bb2b1b7b845\") " Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.360939 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kubelet-dir\") pod \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\" (UID: \"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90\") " Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.360983 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kubelet-dir\") pod \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\" (UID: \"0d3903bd-e9e1-4d1e-a03b-886542a8f32c\") " Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.361332 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0d3903bd-e9e1-4d1e-a03b-886542a8f32c" (UID: "0d3903bd-e9e1-4d1e-a03b-886542a8f32c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.361380 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e7fdc7e5-2196-4f4f-84d4-dfc82848cb90" (UID: "e7fdc7e5-2196-4f4f-84d4-dfc82848cb90"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.366750 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7fdc7e5-2196-4f4f-84d4-dfc82848cb90" (UID: "e7fdc7e5-2196-4f4f-84d4-dfc82848cb90"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.367301 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60da1edb-8474-4368-a6ae-0bb2b1b7b845-kube-api-access-qtctx" (OuterVolumeSpecName: "kube-api-access-qtctx") pod "60da1edb-8474-4368-a6ae-0bb2b1b7b845" (UID: "60da1edb-8474-4368-a6ae-0bb2b1b7b845"). InnerVolumeSpecName "kube-api-access-qtctx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.368450 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0d3903bd-e9e1-4d1e-a03b-886542a8f32c" (UID: "0d3903bd-e9e1-4d1e-a03b-886542a8f32c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.376332 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26f8bff1-96c5-4c44-8e09-8f9785072c99" path="/var/lib/kubelet/pods/26f8bff1-96c5-4c44-8e09-8f9785072c99/volumes" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.377153 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d700393-14b8-4abe-b77b-b2bfd718f024" path="/var/lib/kubelet/pods/7d700393-14b8-4abe-b77b-b2bfd718f024/volumes" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.438577 4885 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-12 09:55:51.588079722 +0000 UTC Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.438892 4885 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6686h20m4.149191463s for next certificate rotation Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.463093 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.463128 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.463141 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qtctx\" (UniqueName: \"kubernetes.io/projected/60da1edb-8474-4368-a6ae-0bb2b1b7b845-kube-api-access-qtctx\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.463155 4885 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7fdc7e5-2196-4f4f-84d4-dfc82848cb90-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.463165 4885 reconciler_common.go:293] "Volume detached for volume 
\"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0d3903bd-e9e1-4d1e-a03b-886542a8f32c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.966986 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0d3903bd-e9e1-4d1e-a03b-886542a8f32c","Type":"ContainerDied","Data":"b1bf8ba4cba738dec3a3f34c5ac73eaf8187f787c6b76bfe6996ae90d21c77b4"} Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.967020 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1bf8ba4cba738dec3a3f34c5ac73eaf8187f787c6b76bfe6996ae90d21c77b4" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.967071 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.969444 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" event={"ID":"0d5c0c11-15e0-47ed-817c-939899828e1e","Type":"ContainerStarted","Data":"e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b"} Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.969495 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" event={"ID":"0d5c0c11-15e0-47ed-817c-939899828e1e","Type":"ContainerStarted","Data":"aa200f5debabe8fc7be4d573b21b524afcb2289d907cd50fd947f48edbba3041"} Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.969825 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.971142 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" event={"ID":"ab31615e-dc70-40ba-9b47-6a3f119e91d9","Type":"ContainerStarted","Data":"7e9e48dffa2c1d1b35dc16a09da7078a95e25a71cb56e7dff87781cbf3d61f90"} Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.971302 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" event={"ID":"ab31615e-dc70-40ba-9b47-6a3f119e91d9","Type":"ContainerStarted","Data":"8dc334277d583f00a5dd853f83e99a7d51476aa8a86ed90046e854475f10a759"} Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.971441 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.973366 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.973419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e7fdc7e5-2196-4f4f-84d4-dfc82848cb90","Type":"ContainerDied","Data":"464a16507c5cff61fcfdb5c5e82e0d8d5f3181505247b17da9c83a9f55dd19dc"} Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.973459 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="464a16507c5cff61fcfdb5c5e82e0d8d5f3181505247b17da9c83a9f55dd19dc" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.975307 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549974-jjqkh" event={"ID":"60da1edb-8474-4368-a6ae-0bb2b1b7b845","Type":"ContainerDied","Data":"37aa47577a13b111a43340382f6d3fe85ac7c15ce7d0d5052fe2dc63b6ee4d2c"} Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.975365 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="37aa47577a13b111a43340382f6d3fe85ac7c15ce7d0d5052fe2dc63b6ee4d2c" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.975368 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549974-jjqkh" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.981343 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:35:47 crc kubenswrapper[4885]: I0308 19:35:47.987342 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:35:48 crc kubenswrapper[4885]: I0308 19:35:48.004766 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" podStartSLOduration=4.004738468 podStartE2EDuration="4.004738468s" podCreationTimestamp="2026-03-08 19:35:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:47.9992101 +0000 UTC m=+249.395264203" watchObservedRunningTime="2026-03-08 19:35:48.004738468 +0000 UTC m=+249.400792531" Mar 08 19:35:48 crc kubenswrapper[4885]: I0308 19:35:48.026224 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" podStartSLOduration=4.026209123 podStartE2EDuration="4.026209123s" podCreationTimestamp="2026-03-08 19:35:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:35:48.02459011 +0000 UTC m=+249.420644133" watchObservedRunningTime="2026-03-08 19:35:48.026209123 +0000 UTC m=+249.422263146" Mar 08 19:35:54 crc kubenswrapper[4885]: E0308 19:35:54.494420 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage4151231740/3\": happened during read: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 08 19:35:54 crc kubenswrapper[4885]: E0308 19:35:54.494897 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gs66k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-xpctw_openshift-marketplace(7d8fbc68-3714-4fe4-9f62-857c5dc05661): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage4151231740/3\": happened during read: context canceled" logger="UnhandledError" Mar 08 19:35:54 crc kubenswrapper[4885]: E0308 19:35:54.496150 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage4151231740/3\\\": happened during read: context canceled\"" pod="openshift-marketplace/certified-operators-xpctw" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" Mar 08 19:35:54 crc kubenswrapper[4885]: E0308 19:35:54.519562 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage2691475163/2\": happened during read: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 08 19:35:54 crc kubenswrapper[4885]: E0308 19:35:54.519777 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zhfws,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-t9fn7_openshift-marketplace(038004f7-92de-42b0-8951-447dfdaf2f83): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage2691475163/2\": happened during read: context canceled" logger="UnhandledError" Mar 08 19:35:54 crc kubenswrapper[4885]: E0308 19:35:54.521058 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage2691475163/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/community-operators-t9fn7" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" Mar 08 19:35:55 crc kubenswrapper[4885]: E0308 19:35:55.018874 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-xpctw" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" Mar 08 19:35:55 crc kubenswrapper[4885]: E0308 19:35:55.018943 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-t9fn7" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" Mar 08 19:35:55 crc kubenswrapper[4885]: E0308 19:35:55.216280 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-conmon-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": 
RecentStats: unable to find data in memory cache]" Mar 08 19:35:57 crc kubenswrapper[4885]: I0308 19:35:57.127280 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:57 crc kubenswrapper[4885]: I0308 19:35:57.134451 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:35:57 crc kubenswrapper[4885]: I0308 19:35:57.699166 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zvl2r" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.149189 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549976-nhqg4"] Mar 08 19:36:00 crc kubenswrapper[4885]: E0308 19:36:00.149507 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d3903bd-e9e1-4d1e-a03b-886542a8f32c" containerName="pruner" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.149528 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3903bd-e9e1-4d1e-a03b-886542a8f32c" containerName="pruner" Mar 08 19:36:00 crc kubenswrapper[4885]: E0308 19:36:00.149559 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60da1edb-8474-4368-a6ae-0bb2b1b7b845" containerName="oc" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.149574 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="60da1edb-8474-4368-a6ae-0bb2b1b7b845" containerName="oc" Mar 08 19:36:00 crc kubenswrapper[4885]: E0308 19:36:00.149595 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7fdc7e5-2196-4f4f-84d4-dfc82848cb90" containerName="pruner" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.149609 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7fdc7e5-2196-4f4f-84d4-dfc82848cb90" containerName="pruner" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.149810 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7fdc7e5-2196-4f4f-84d4-dfc82848cb90" containerName="pruner" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.150050 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="60da1edb-8474-4368-a6ae-0bb2b1b7b845" containerName="oc" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.150071 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d3903bd-e9e1-4d1e-a03b-886542a8f32c" containerName="pruner" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.150607 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549976-nhqg4" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.154848 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.154900 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.155050 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.180754 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbjqw\" (UniqueName: \"kubernetes.io/projected/5a29b0f8-8eee-4c05-9bbe-bebb70f16e58-kube-api-access-zbjqw\") pod \"auto-csr-approver-29549976-nhqg4\" (UID: \"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58\") " pod="openshift-infra/auto-csr-approver-29549976-nhqg4" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.184319 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549976-nhqg4"] Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.282051 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbjqw\" (UniqueName: \"kubernetes.io/projected/5a29b0f8-8eee-4c05-9bbe-bebb70f16e58-kube-api-access-zbjqw\") pod \"auto-csr-approver-29549976-nhqg4\" (UID: \"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58\") " pod="openshift-infra/auto-csr-approver-29549976-nhqg4" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.316671 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbjqw\" (UniqueName: \"kubernetes.io/projected/5a29b0f8-8eee-4c05-9bbe-bebb70f16e58-kube-api-access-zbjqw\") pod \"auto-csr-approver-29549976-nhqg4\" (UID: \"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58\") " pod="openshift-infra/auto-csr-approver-29549976-nhqg4" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.487351 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549976-nhqg4" Mar 08 19:36:00 crc kubenswrapper[4885]: I0308 19:36:00.744636 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549976-nhqg4"] Mar 08 19:36:01 crc kubenswrapper[4885]: I0308 19:36:01.052483 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549976-nhqg4" event={"ID":"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58","Type":"ContainerStarted","Data":"d6279e33c9c34db2ad8eae44a300197105d2b923e0ae139b24d2c1b666cdd843"} Mar 08 19:36:02 crc kubenswrapper[4885]: I0308 19:36:02.600093 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b4ff869f4-px66j"] Mar 08 19:36:02 crc kubenswrapper[4885]: I0308 19:36:02.600428 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" podUID="0d5c0c11-15e0-47ed-817c-939899828e1e" containerName="controller-manager" containerID="cri-o://e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b" gracePeriod=30 Mar 08 19:36:02 crc kubenswrapper[4885]: I0308 19:36:02.693970 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr"] Mar 08 19:36:02 crc kubenswrapper[4885]: I0308 19:36:02.694345 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" podUID="ab31615e-dc70-40ba-9b47-6a3f119e91d9" containerName="route-controller-manager" containerID="cri-o://7e9e48dffa2c1d1b35dc16a09da7078a95e25a71cb56e7dff87781cbf3d61f90" gracePeriod=30 Mar 08 19:36:02 crc kubenswrapper[4885]: I0308 19:36:02.819448 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:36:02 crc kubenswrapper[4885]: I0308 19:36:02.819510 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:36:02 crc kubenswrapper[4885]: I0308 19:36:02.962785 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.036199 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d5c0c11-15e0-47ed-817c-939899828e1e-serving-cert\") pod \"0d5c0c11-15e0-47ed-817c-939899828e1e\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.036270 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-client-ca\") pod \"0d5c0c11-15e0-47ed-817c-939899828e1e\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.036288 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm58j\" (UniqueName: \"kubernetes.io/projected/0d5c0c11-15e0-47ed-817c-939899828e1e-kube-api-access-cm58j\") pod \"0d5c0c11-15e0-47ed-817c-939899828e1e\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.036348 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-proxy-ca-bundles\") pod \"0d5c0c11-15e0-47ed-817c-939899828e1e\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.036400 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-config\") pod \"0d5c0c11-15e0-47ed-817c-939899828e1e\" (UID: \"0d5c0c11-15e0-47ed-817c-939899828e1e\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.037498 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0d5c0c11-15e0-47ed-817c-939899828e1e" (UID: "0d5c0c11-15e0-47ed-817c-939899828e1e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.037523 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-client-ca" (OuterVolumeSpecName: "client-ca") pod "0d5c0c11-15e0-47ed-817c-939899828e1e" (UID: "0d5c0c11-15e0-47ed-817c-939899828e1e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.037614 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-config" (OuterVolumeSpecName: "config") pod "0d5c0c11-15e0-47ed-817c-939899828e1e" (UID: "0d5c0c11-15e0-47ed-817c-939899828e1e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.044222 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d5c0c11-15e0-47ed-817c-939899828e1e-kube-api-access-cm58j" (OuterVolumeSpecName: "kube-api-access-cm58j") pod "0d5c0c11-15e0-47ed-817c-939899828e1e" (UID: "0d5c0c11-15e0-47ed-817c-939899828e1e"). InnerVolumeSpecName "kube-api-access-cm58j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.048558 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d5c0c11-15e0-47ed-817c-939899828e1e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0d5c0c11-15e0-47ed-817c-939899828e1e" (UID: "0d5c0c11-15e0-47ed-817c-939899828e1e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.064584 4885 generic.go:334] "Generic (PLEG): container finished" podID="0d5c0c11-15e0-47ed-817c-939899828e1e" containerID="e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b" exitCode=0 Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.064666 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.064785 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" event={"ID":"0d5c0c11-15e0-47ed-817c-939899828e1e","Type":"ContainerDied","Data":"e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b"} Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.064821 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b4ff869f4-px66j" event={"ID":"0d5c0c11-15e0-47ed-817c-939899828e1e","Type":"ContainerDied","Data":"aa200f5debabe8fc7be4d573b21b524afcb2289d907cd50fd947f48edbba3041"} Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.064843 4885 scope.go:117] "RemoveContainer" containerID="e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.067054 4885 generic.go:334] "Generic (PLEG): container finished" podID="ab31615e-dc70-40ba-9b47-6a3f119e91d9" containerID="7e9e48dffa2c1d1b35dc16a09da7078a95e25a71cb56e7dff87781cbf3d61f90" exitCode=0 Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.067095 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" event={"ID":"ab31615e-dc70-40ba-9b47-6a3f119e91d9","Type":"ContainerDied","Data":"7e9e48dffa2c1d1b35dc16a09da7078a95e25a71cb56e7dff87781cbf3d61f90"} Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.067117 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" event={"ID":"ab31615e-dc70-40ba-9b47-6a3f119e91d9","Type":"ContainerDied","Data":"8dc334277d583f00a5dd853f83e99a7d51476aa8a86ed90046e854475f10a759"} Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.067130 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dc334277d583f00a5dd853f83e99a7d51476aa8a86ed90046e854475f10a759" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.078468 4885 scope.go:117] "RemoveContainer" containerID="e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b" Mar 08 19:36:03 crc kubenswrapper[4885]: E0308 19:36:03.078769 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b\": container with ID starting with e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b not found: ID does not exist" 
containerID="e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.078793 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b"} err="failed to get container status \"e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b\": rpc error: code = NotFound desc = could not find container \"e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b\": container with ID starting with e5a89edfe6ebd777eda8a544bb5b8e0a4a601716445553bdbbd1581cc40ede1b not found: ID does not exist" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.079195 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.119265 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b4ff869f4-px66j"] Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.121772 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b4ff869f4-px66j"] Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138133 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab31615e-dc70-40ba-9b47-6a3f119e91d9-serving-cert\") pod \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138187 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c67vv\" (UniqueName: \"kubernetes.io/projected/ab31615e-dc70-40ba-9b47-6a3f119e91d9-kube-api-access-c67vv\") pod \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138238 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-config\") pod \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138345 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-client-ca\") pod \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\" (UID: \"ab31615e-dc70-40ba-9b47-6a3f119e91d9\") " Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138647 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d5c0c11-15e0-47ed-817c-939899828e1e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138659 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138690 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm58j\" (UniqueName: \"kubernetes.io/projected/0d5c0c11-15e0-47ed-817c-939899828e1e-kube-api-access-cm58j\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138713 4885 
reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.138730 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d5c0c11-15e0-47ed-817c-939899828e1e-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.139163 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-config" (OuterVolumeSpecName: "config") pod "ab31615e-dc70-40ba-9b47-6a3f119e91d9" (UID: "ab31615e-dc70-40ba-9b47-6a3f119e91d9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.139157 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-client-ca" (OuterVolumeSpecName: "client-ca") pod "ab31615e-dc70-40ba-9b47-6a3f119e91d9" (UID: "ab31615e-dc70-40ba-9b47-6a3f119e91d9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.140873 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab31615e-dc70-40ba-9b47-6a3f119e91d9-kube-api-access-c67vv" (OuterVolumeSpecName: "kube-api-access-c67vv") pod "ab31615e-dc70-40ba-9b47-6a3f119e91d9" (UID: "ab31615e-dc70-40ba-9b47-6a3f119e91d9"). InnerVolumeSpecName "kube-api-access-c67vv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.140880 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab31615e-dc70-40ba-9b47-6a3f119e91d9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ab31615e-dc70-40ba-9b47-6a3f119e91d9" (UID: "ab31615e-dc70-40ba-9b47-6a3f119e91d9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.239569 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab31615e-dc70-40ba-9b47-6a3f119e91d9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.239629 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c67vv\" (UniqueName: \"kubernetes.io/projected/ab31615e-dc70-40ba-9b47-6a3f119e91d9-kube-api-access-c67vv\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.239665 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.239683 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab31615e-dc70-40ba-9b47-6a3f119e91d9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:03 crc kubenswrapper[4885]: I0308 19:36:03.384207 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d5c0c11-15e0-47ed-817c-939899828e1e" path="/var/lib/kubelet/pods/0d5c0c11-15e0-47ed-817c-939899828e1e/volumes" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.074723 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.102995 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr"] Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.107989 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-57b7c8cb86-7xrpr"] Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.518805 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d6687ccbf-8tvf8"] Mar 08 19:36:04 crc kubenswrapper[4885]: E0308 19:36:04.519152 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d5c0c11-15e0-47ed-817c-939899828e1e" containerName="controller-manager" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.519168 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d5c0c11-15e0-47ed-817c-939899828e1e" containerName="controller-manager" Mar 08 19:36:04 crc kubenswrapper[4885]: E0308 19:36:04.519645 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab31615e-dc70-40ba-9b47-6a3f119e91d9" containerName="route-controller-manager" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.519658 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab31615e-dc70-40ba-9b47-6a3f119e91d9" containerName="route-controller-manager" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.519790 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d5c0c11-15e0-47ed-817c-939899828e1e" containerName="controller-manager" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.519801 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab31615e-dc70-40ba-9b47-6a3f119e91d9" containerName="route-controller-manager" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.522419 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.528402 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.528710 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.529672 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.530853 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25"] Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.531623 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.531713 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.531969 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.533289 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.540963 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.541163 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d6687ccbf-8tvf8"] Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.546119 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25"] Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.580744 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-client-ca\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.580911 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87mqf\" (UniqueName: \"kubernetes.io/projected/97017523-4956-4bde-84a3-5859a2edf389-kube-api-access-87mqf\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.581290 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtmxc\" (UniqueName: \"kubernetes.io/projected/0514086c-91a5-4b80-8920-b2b78d4faba3-kube-api-access-vtmxc\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " 
pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.581337 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-client-ca\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.581400 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-config\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.581422 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-proxy-ca-bundles\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.581474 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-config\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.581499 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0514086c-91a5-4b80-8920-b2b78d4faba3-serving-cert\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.581847 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97017523-4956-4bde-84a3-5859a2edf389-serving-cert\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.583151 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.583532 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.583832 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.583551 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 
19:36:04.584244 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.584472 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684161 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-client-ca\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684270 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87mqf\" (UniqueName: \"kubernetes.io/projected/97017523-4956-4bde-84a3-5859a2edf389-kube-api-access-87mqf\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684337 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtmxc\" (UniqueName: \"kubernetes.io/projected/0514086c-91a5-4b80-8920-b2b78d4faba3-kube-api-access-vtmxc\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684377 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-client-ca\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684414 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-config\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684440 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-proxy-ca-bundles\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684476 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-config\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684508 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0514086c-91a5-4b80-8920-b2b78d4faba3-serving-cert\") pod 
\"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.684571 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97017523-4956-4bde-84a3-5859a2edf389-serving-cert\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.685774 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-proxy-ca-bundles\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.686183 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-client-ca\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.686422 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-config\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.687935 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-client-ca\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.690613 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-config\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.692030 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0514086c-91a5-4b80-8920-b2b78d4faba3-serving-cert\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.692052 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97017523-4956-4bde-84a3-5859a2edf389-serving-cert\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.703551 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87mqf\" (UniqueName: \"kubernetes.io/projected/97017523-4956-4bde-84a3-5859a2edf389-kube-api-access-87mqf\") pod \"controller-manager-d6687ccbf-8tvf8\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.704032 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtmxc\" (UniqueName: \"kubernetes.io/projected/0514086c-91a5-4b80-8920-b2b78d4faba3-kube-api-access-vtmxc\") pod \"route-controller-manager-6c8756cb88-5rd25\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.908264 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:04 crc kubenswrapper[4885]: I0308 19:36:04.922960 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.241407 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d6687ccbf-8tvf8"] Mar 08 19:36:05 crc kubenswrapper[4885]: E0308 19:36:05.331032 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-conmon-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.348085 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25"] Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.383621 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab31615e-dc70-40ba-9b47-6a3f119e91d9" path="/var/lib/kubelet/pods/ab31615e-dc70-40ba-9b47-6a3f119e91d9/volumes" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.586161 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.586868 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.589330 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.589610 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.600044 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.700861 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5016b299-d93d-4bf9-9e51-a10e68da79bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.700930 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5016b299-d93d-4bf9-9e51-a10e68da79bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.802063 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5016b299-d93d-4bf9-9e51-a10e68da79bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.802118 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5016b299-d93d-4bf9-9e51-a10e68da79bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.802166 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5016b299-d93d-4bf9-9e51-a10e68da79bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.818762 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5016b299-d93d-4bf9-9e51-a10e68da79bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:05 crc kubenswrapper[4885]: I0308 19:36:05.905285 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.098786 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" event={"ID":"0514086c-91a5-4b80-8920-b2b78d4faba3","Type":"ContainerStarted","Data":"d780b5eabbfcbe56d150e28f4acfba07ce728041d152bf3d5c0c48b9dc32a456"} Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.098825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" event={"ID":"0514086c-91a5-4b80-8920-b2b78d4faba3","Type":"ContainerStarted","Data":"51d88566446210e3b83617f522140b5e6b40693eb161e53013b9f354546417e9"} Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.099504 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.102507 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" event={"ID":"97017523-4956-4bde-84a3-5859a2edf389","Type":"ContainerStarted","Data":"f930e10771b84878f3ee1e71b5063ef75713f3fd89c0b6b9084dd1d3903c6042"} Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.102546 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" event={"ID":"97017523-4956-4bde-84a3-5859a2edf389","Type":"ContainerStarted","Data":"6e9fa8f3dc6a11417fee7c798a28fbc65a8413f8866bef37973386caf66851c6"} Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.103102 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.107297 4885 generic.go:334] "Generic (PLEG): container finished" podID="5a29b0f8-8eee-4c05-9bbe-bebb70f16e58" containerID="0685a4906cd8df57a6fc2f16599ba5b339b14b8ee4e2f165c183f54473c7f2ff" exitCode=0 Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.107336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549976-nhqg4" event={"ID":"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58","Type":"ContainerDied","Data":"0685a4906cd8df57a6fc2f16599ba5b339b14b8ee4e2f165c183f54473c7f2ff"} Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.115815 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" podStartSLOduration=4.115798223 podStartE2EDuration="4.115798223s" podCreationTimestamp="2026-03-08 19:36:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:36:06.112959026 +0000 UTC m=+267.509013049" watchObservedRunningTime="2026-03-08 19:36:06.115798223 +0000 UTC m=+267.511852246" Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.122669 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.133198 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.138029 
4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" podStartSLOduration=4.138016768 podStartE2EDuration="4.138016768s" podCreationTimestamp="2026-03-08 19:36:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:36:06.133390834 +0000 UTC m=+267.529444857" watchObservedRunningTime="2026-03-08 19:36:06.138016768 +0000 UTC m=+267.534070791" Mar 08 19:36:06 crc kubenswrapper[4885]: I0308 19:36:06.216533 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 08 19:36:06 crc kubenswrapper[4885]: W0308 19:36:06.274638 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5016b299_d93d_4bf9_9e51_a10e68da79bc.slice/crio-a263fbc4aa22cd2336112ae2b8d4601e8a2428667d3f47683314a7fcfe77edd6 WatchSource:0}: Error finding container a263fbc4aa22cd2336112ae2b8d4601e8a2428667d3f47683314a7fcfe77edd6: Status 404 returned error can't find the container with id a263fbc4aa22cd2336112ae2b8d4601e8a2428667d3f47683314a7fcfe77edd6 Mar 08 19:36:07 crc kubenswrapper[4885]: I0308 19:36:07.117750 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5016b299-d93d-4bf9-9e51-a10e68da79bc","Type":"ContainerStarted","Data":"7ebd4befa1406c40708347ef7d1192142a51ec8351b345f6e5b1c9291b85b870"} Mar 08 19:36:07 crc kubenswrapper[4885]: I0308 19:36:07.118139 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5016b299-d93d-4bf9-9e51-a10e68da79bc","Type":"ContainerStarted","Data":"a263fbc4aa22cd2336112ae2b8d4601e8a2428667d3f47683314a7fcfe77edd6"} Mar 08 19:36:07 crc kubenswrapper[4885]: I0308 19:36:07.448117 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549976-nhqg4" Mar 08 19:36:07 crc kubenswrapper[4885]: I0308 19:36:07.464201 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.464185318 podStartE2EDuration="2.464185318s" podCreationTimestamp="2026-03-08 19:36:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:36:07.138474447 +0000 UTC m=+268.534528470" watchObservedRunningTime="2026-03-08 19:36:07.464185318 +0000 UTC m=+268.860239341" Mar 08 19:36:07 crc kubenswrapper[4885]: I0308 19:36:07.527210 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbjqw\" (UniqueName: \"kubernetes.io/projected/5a29b0f8-8eee-4c05-9bbe-bebb70f16e58-kube-api-access-zbjqw\") pod \"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58\" (UID: \"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58\") " Mar 08 19:36:07 crc kubenswrapper[4885]: I0308 19:36:07.532753 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a29b0f8-8eee-4c05-9bbe-bebb70f16e58-kube-api-access-zbjqw" (OuterVolumeSpecName: "kube-api-access-zbjqw") pod "5a29b0f8-8eee-4c05-9bbe-bebb70f16e58" (UID: "5a29b0f8-8eee-4c05-9bbe-bebb70f16e58"). InnerVolumeSpecName "kube-api-access-zbjqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:36:07 crc kubenswrapper[4885]: I0308 19:36:07.628787 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbjqw\" (UniqueName: \"kubernetes.io/projected/5a29b0f8-8eee-4c05-9bbe-bebb70f16e58-kube-api-access-zbjqw\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:08 crc kubenswrapper[4885]: I0308 19:36:08.128064 4885 generic.go:334] "Generic (PLEG): container finished" podID="5016b299-d93d-4bf9-9e51-a10e68da79bc" containerID="7ebd4befa1406c40708347ef7d1192142a51ec8351b345f6e5b1c9291b85b870" exitCode=0 Mar 08 19:36:08 crc kubenswrapper[4885]: I0308 19:36:08.128156 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5016b299-d93d-4bf9-9e51-a10e68da79bc","Type":"ContainerDied","Data":"7ebd4befa1406c40708347ef7d1192142a51ec8351b345f6e5b1c9291b85b870"} Mar 08 19:36:08 crc kubenswrapper[4885]: I0308 19:36:08.132845 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549976-nhqg4" Mar 08 19:36:08 crc kubenswrapper[4885]: I0308 19:36:08.136696 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549976-nhqg4" event={"ID":"5a29b0f8-8eee-4c05-9bbe-bebb70f16e58","Type":"ContainerDied","Data":"d6279e33c9c34db2ad8eae44a300197105d2b923e0ae139b24d2c1b666cdd843"} Mar 08 19:36:08 crc kubenswrapper[4885]: I0308 19:36:08.136735 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6279e33c9c34db2ad8eae44a300197105d2b923e0ae139b24d2c1b666cdd843" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.045048 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 08 19:36:12 crc kubenswrapper[4885]: E0308 19:36:12.045579 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a29b0f8-8eee-4c05-9bbe-bebb70f16e58" containerName="oc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.045592 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a29b0f8-8eee-4c05-9bbe-bebb70f16e58" containerName="oc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.045689 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a29b0f8-8eee-4c05-9bbe-bebb70f16e58" containerName="oc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.046107 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.060099 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.234471 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-var-lock\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.234527 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf6a8691-f048-4173-8c9e-06bca13e37e1-kube-api-access\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.234583 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.336367 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf6a8691-f048-4173-8c9e-06bca13e37e1-kube-api-access\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.336445 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-var-lock\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.336507 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.336553 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-var-lock\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.337378 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.353652 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf6a8691-f048-4173-8c9e-06bca13e37e1-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:12 crc kubenswrapper[4885]: I0308 19:36:12.403115 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:36:15 crc kubenswrapper[4885]: E0308 19:36:15.475448 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-conmon-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:36:17 crc kubenswrapper[4885]: I0308 19:36:17.334406 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:17 crc kubenswrapper[4885]: I0308 19:36:17.395832 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5016b299-d93d-4bf9-9e51-a10e68da79bc-kubelet-dir\") pod \"5016b299-d93d-4bf9-9e51-a10e68da79bc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " Mar 08 19:36:17 crc kubenswrapper[4885]: I0308 19:36:17.395971 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5016b299-d93d-4bf9-9e51-a10e68da79bc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5016b299-d93d-4bf9-9e51-a10e68da79bc" (UID: "5016b299-d93d-4bf9-9e51-a10e68da79bc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:36:17 crc kubenswrapper[4885]: I0308 19:36:17.395994 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5016b299-d93d-4bf9-9e51-a10e68da79bc-kube-api-access\") pod \"5016b299-d93d-4bf9-9e51-a10e68da79bc\" (UID: \"5016b299-d93d-4bf9-9e51-a10e68da79bc\") " Mar 08 19:36:17 crc kubenswrapper[4885]: I0308 19:36:17.396351 4885 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5016b299-d93d-4bf9-9e51-a10e68da79bc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:17 crc kubenswrapper[4885]: I0308 19:36:17.404172 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5016b299-d93d-4bf9-9e51-a10e68da79bc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5016b299-d93d-4bf9-9e51-a10e68da79bc" (UID: "5016b299-d93d-4bf9-9e51-a10e68da79bc"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:36:17 crc kubenswrapper[4885]: I0308 19:36:17.497305 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5016b299-d93d-4bf9-9e51-a10e68da79bc-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:18 crc kubenswrapper[4885]: I0308 19:36:18.093619 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5016b299-d93d-4bf9-9e51-a10e68da79bc","Type":"ContainerDied","Data":"a263fbc4aa22cd2336112ae2b8d4601e8a2428667d3f47683314a7fcfe77edd6"} Mar 08 19:36:18 crc kubenswrapper[4885]: I0308 19:36:18.093656 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a263fbc4aa22cd2336112ae2b8d4601e8a2428667d3f47683314a7fcfe77edd6" Mar 08 19:36:18 crc kubenswrapper[4885]: I0308 19:36:18.093672 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 08 19:36:25 crc kubenswrapper[4885]: E0308 19:36:25.587037 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-conmon-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:36:29 crc kubenswrapper[4885]: I0308 19:36:29.400585 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2bp7t"] Mar 08 19:36:30 crc kubenswrapper[4885]: E0308 19:36:30.345093 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 08 19:36:30 crc kubenswrapper[4885]: E0308 19:36:30.345470 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jn2mq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-cptvd_openshift-marketplace(b30ce2c5-2b53-47aa-8470-394dd0d6256a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 19:36:30 crc kubenswrapper[4885]: E0308 19:36:30.346696 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-cptvd" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.335034 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-cptvd" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.460123 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.460676 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bdjrg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-59wjr_openshift-marketplace(7346fb7f-6125-49c7-a422-cc169bc7e045): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.462233 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-59wjr" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.471439 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.471585 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x76vh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-pqxt7_openshift-marketplace(8881ba5e-d9d1-42a9-98af-849e72053757): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.472985 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-pqxt7" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.493692 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.493857 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-st8f9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-62xgk_openshift-marketplace(05666e0b-c4ce-451a-ba67-ddb78866ef54): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 08 19:36:32 crc kubenswrapper[4885]: E0308 19:36:32.495469 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-62xgk" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" Mar 08 19:36:32 crc kubenswrapper[4885]: I0308 19:36:32.735145 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 08 19:36:32 crc kubenswrapper[4885]: I0308 19:36:32.846531 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:36:32 crc kubenswrapper[4885]: I0308 19:36:32.846894 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:36:32 crc kubenswrapper[4885]: I0308 19:36:32.846983 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:36:32 crc kubenswrapper[4885]: I0308 19:36:32.847499 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 19:36:32 crc 
kubenswrapper[4885]: I0308 19:36:32.847555 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a" gracePeriod=600 Mar 08 19:36:33 crc kubenswrapper[4885]: I0308 19:36:33.169059 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cf6a8691-f048-4173-8c9e-06bca13e37e1","Type":"ContainerStarted","Data":"d0cc1d7c926212abd6b6240d8d958b1f68d5c10c839d13411e786e80784f25ac"} Mar 08 19:36:33 crc kubenswrapper[4885]: I0308 19:36:33.170401 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpctw" event={"ID":"7d8fbc68-3714-4fe4-9f62-857c5dc05661","Type":"ContainerStarted","Data":"06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313"} Mar 08 19:36:33 crc kubenswrapper[4885]: I0308 19:36:33.171801 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdq9" event={"ID":"56c146b0-3448-4140-8cf0-8d637f7f22a9","Type":"ContainerStarted","Data":"aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908"} Mar 08 19:36:33 crc kubenswrapper[4885]: I0308 19:36:33.174674 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9fn7" event={"ID":"038004f7-92de-42b0-8951-447dfdaf2f83","Type":"ContainerStarted","Data":"119e5e86f78bbb6aaaede8af002474c8df38720baf17de30416c30bd76579ad1"} Mar 08 19:36:33 crc kubenswrapper[4885]: I0308 19:36:33.176132 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnjnd" event={"ID":"2a6b85b3-0bb1-4199-983f-615a6c932f09","Type":"ContainerStarted","Data":"fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b"} Mar 08 19:36:33 crc kubenswrapper[4885]: I0308 19:36:33.177644 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a" exitCode=0 Mar 08 19:36:33 crc kubenswrapper[4885]: I0308 19:36:33.177730 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a"} Mar 08 19:36:33 crc kubenswrapper[4885]: E0308 19:36:33.178601 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-62xgk" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" Mar 08 19:36:33 crc kubenswrapper[4885]: E0308 19:36:33.180719 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-pqxt7" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.185557 4885 generic.go:334] "Generic (PLEG): container finished" podID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" 
containerID="06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313" exitCode=0 Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.185688 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpctw" event={"ID":"7d8fbc68-3714-4fe4-9f62-857c5dc05661","Type":"ContainerDied","Data":"06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313"} Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.189280 4885 generic.go:334] "Generic (PLEG): container finished" podID="038004f7-92de-42b0-8951-447dfdaf2f83" containerID="119e5e86f78bbb6aaaede8af002474c8df38720baf17de30416c30bd76579ad1" exitCode=0 Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.189383 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9fn7" event={"ID":"038004f7-92de-42b0-8951-447dfdaf2f83","Type":"ContainerDied","Data":"119e5e86f78bbb6aaaede8af002474c8df38720baf17de30416c30bd76579ad1"} Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.194661 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerID="fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b" exitCode=0 Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.194738 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnjnd" event={"ID":"2a6b85b3-0bb1-4199-983f-615a6c932f09","Type":"ContainerDied","Data":"fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b"} Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.205206 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"7301efe622f6965ef0088239fdd6ca59b9a8395c4d2bba8dc311752a026260dc"} Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.207025 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cf6a8691-f048-4173-8c9e-06bca13e37e1","Type":"ContainerStarted","Data":"3c326d6a1a9e49cd242fcac137bf8c9ce49e4cb4b6826f444d12dd45c38d8ec9"} Mar 08 19:36:34 crc kubenswrapper[4885]: I0308 19:36:34.255694 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=22.255661059 podStartE2EDuration="22.255661059s" podCreationTimestamp="2026-03-08 19:36:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:36:34.235142618 +0000 UTC m=+295.631196681" watchObservedRunningTime="2026-03-08 19:36:34.255661059 +0000 UTC m=+295.651715082" Mar 08 19:36:35 crc kubenswrapper[4885]: I0308 19:36:35.220466 4885 generic.go:334] "Generic (PLEG): container finished" podID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerID="aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908" exitCode=0 Mar 08 19:36:35 crc kubenswrapper[4885]: I0308 19:36:35.220517 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdq9" event={"ID":"56c146b0-3448-4140-8cf0-8d637f7f22a9","Type":"ContainerDied","Data":"aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908"} Mar 08 19:36:35 crc kubenswrapper[4885]: E0308 19:36:35.690696 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56c146b0_3448_4140_8cf0_8d637f7f22a9.slice/crio-conmon-6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:36:37 crc kubenswrapper[4885]: I0308 19:36:37.234426 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9fn7" event={"ID":"038004f7-92de-42b0-8951-447dfdaf2f83","Type":"ContainerStarted","Data":"27737080ffcd1af304d093dd7fb9b8c98d2e1d3a468d0c510cc2a7d938b9d1b7"} Mar 08 19:36:37 crc kubenswrapper[4885]: I0308 19:36:37.246139 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnjnd" event={"ID":"2a6b85b3-0bb1-4199-983f-615a6c932f09","Type":"ContainerStarted","Data":"246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a"} Mar 08 19:36:37 crc kubenswrapper[4885]: I0308 19:36:37.259281 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t9fn7" podStartSLOduration=3.470038712 podStartE2EDuration="1m13.259262465s" podCreationTimestamp="2026-03-08 19:35:24 +0000 UTC" firstStartedPulling="2026-03-08 19:35:26.264941907 +0000 UTC m=+227.660995930" lastFinishedPulling="2026-03-08 19:36:36.05416566 +0000 UTC m=+297.450219683" observedRunningTime="2026-03-08 19:36:37.25348836 +0000 UTC m=+298.649542413" watchObservedRunningTime="2026-03-08 19:36:37.259262465 +0000 UTC m=+298.655316488" Mar 08 19:36:38 crc kubenswrapper[4885]: I0308 19:36:38.254365 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpctw" event={"ID":"7d8fbc68-3714-4fe4-9f62-857c5dc05661","Type":"ContainerStarted","Data":"b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a"} Mar 08 19:36:38 crc kubenswrapper[4885]: I0308 19:36:38.256853 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdq9" event={"ID":"56c146b0-3448-4140-8cf0-8d637f7f22a9","Type":"ContainerStarted","Data":"cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80"} Mar 08 19:36:38 crc kubenswrapper[4885]: I0308 19:36:38.293528 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xpctw" podStartSLOduration=3.2402418 podStartE2EDuration="1m14.293481558s" podCreationTimestamp="2026-03-08 19:35:24 +0000 UTC" firstStartedPulling="2026-03-08 19:35:26.241822136 +0000 UTC m=+227.637876149" lastFinishedPulling="2026-03-08 19:36:37.295061884 +0000 UTC m=+298.691115907" observedRunningTime="2026-03-08 19:36:38.293243772 +0000 UTC m=+299.689297795" watchObservedRunningTime="2026-03-08 19:36:38.293481558 +0000 UTC m=+299.689535611" Mar 08 19:36:38 crc kubenswrapper[4885]: I0308 19:36:38.307008 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gnjnd" podStartSLOduration=3.525347463 podStartE2EDuration="1m14.30696809s" podCreationTimestamp="2026-03-08 19:35:24 +0000 UTC" firstStartedPulling="2026-03-08 19:35:26.278605963 +0000 UTC m=+227.674659996" lastFinishedPulling="2026-03-08 19:36:37.06022659 +0000 UTC m=+298.456280623" observedRunningTime="2026-03-08 
19:36:38.30657801 +0000 UTC m=+299.702632033" watchObservedRunningTime="2026-03-08 19:36:38.30696809 +0000 UTC m=+299.703022113" Mar 08 19:36:38 crc kubenswrapper[4885]: I0308 19:36:38.324204 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-prdq9" podStartSLOduration=19.747049504 podStartE2EDuration="1m11.324189312s" podCreationTimestamp="2026-03-08 19:35:27 +0000 UTC" firstStartedPulling="2026-03-08 19:35:45.931406339 +0000 UTC m=+247.327460372" lastFinishedPulling="2026-03-08 19:36:37.508546157 +0000 UTC m=+298.904600180" observedRunningTime="2026-03-08 19:36:38.321537931 +0000 UTC m=+299.717591954" watchObservedRunningTime="2026-03-08 19:36:38.324189312 +0000 UTC m=+299.720243335" Mar 08 19:36:42 crc kubenswrapper[4885]: I0308 19:36:42.534645 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d6687ccbf-8tvf8"] Mar 08 19:36:42 crc kubenswrapper[4885]: I0308 19:36:42.535216 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" podUID="97017523-4956-4bde-84a3-5859a2edf389" containerName="controller-manager" containerID="cri-o://f930e10771b84878f3ee1e71b5063ef75713f3fd89c0b6b9084dd1d3903c6042" gracePeriod=30 Mar 08 19:36:42 crc kubenswrapper[4885]: I0308 19:36:42.641318 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25"] Mar 08 19:36:42 crc kubenswrapper[4885]: I0308 19:36:42.641591 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" podUID="0514086c-91a5-4b80-8920-b2b78d4faba3" containerName="route-controller-manager" containerID="cri-o://d780b5eabbfcbe56d150e28f4acfba07ce728041d152bf3d5c0c48b9dc32a456" gracePeriod=30 Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.289379 4885 generic.go:334] "Generic (PLEG): container finished" podID="97017523-4956-4bde-84a3-5859a2edf389" containerID="f930e10771b84878f3ee1e71b5063ef75713f3fd89c0b6b9084dd1d3903c6042" exitCode=0 Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.289515 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" event={"ID":"97017523-4956-4bde-84a3-5859a2edf389","Type":"ContainerDied","Data":"f930e10771b84878f3ee1e71b5063ef75713f3fd89c0b6b9084dd1d3903c6042"} Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.292585 4885 generic.go:334] "Generic (PLEG): container finished" podID="0514086c-91a5-4b80-8920-b2b78d4faba3" containerID="d780b5eabbfcbe56d150e28f4acfba07ce728041d152bf3d5c0c48b9dc32a456" exitCode=0 Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.292671 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" event={"ID":"0514086c-91a5-4b80-8920-b2b78d4faba3","Type":"ContainerDied","Data":"d780b5eabbfcbe56d150e28f4acfba07ce728041d152bf3d5c0c48b9dc32a456"} Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.803459 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.803514 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:36:44 crc 
kubenswrapper[4885]: I0308 19:36:44.851576 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.851611 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.908768 4885 patch_prober.go:28] interesting pod/controller-manager-d6687ccbf-8tvf8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Mar 08 19:36:44 crc kubenswrapper[4885]: I0308 19:36:44.908811 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" podUID="97017523-4956-4bde-84a3-5859a2edf389" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.061318 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.061747 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.132410 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.162788 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"] Mar 08 19:36:45 crc kubenswrapper[4885]: E0308 19:36:45.163063 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5016b299-d93d-4bf9-9e51-a10e68da79bc" containerName="pruner" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.163078 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5016b299-d93d-4bf9-9e51-a10e68da79bc" containerName="pruner" Mar 08 19:36:45 crc kubenswrapper[4885]: E0308 19:36:45.163109 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0514086c-91a5-4b80-8920-b2b78d4faba3" containerName="route-controller-manager" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.163118 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0514086c-91a5-4b80-8920-b2b78d4faba3" containerName="route-controller-manager" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.163237 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0514086c-91a5-4b80-8920-b2b78d4faba3" containerName="route-controller-manager" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.163254 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5016b299-d93d-4bf9-9e51-a10e68da79bc" containerName="pruner" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.163729 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.173589 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"] Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.231023 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.300563 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.301039 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d6687ccbf-8tvf8" event={"ID":"97017523-4956-4bde-84a3-5859a2edf389","Type":"ContainerDied","Data":"6e9fa8f3dc6a11417fee7c798a28fbc65a8413f8866bef37973386caf66851c6"} Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.301073 4885 scope.go:117] "RemoveContainer" containerID="f930e10771b84878f3ee1e71b5063ef75713f3fd89c0b6b9084dd1d3903c6042" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.306999 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.307008 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" event={"ID":"0514086c-91a5-4b80-8920-b2b78d4faba3","Type":"ContainerDied","Data":"51d88566446210e3b83617f522140b5e6b40693eb161e53013b9f354546417e9"} Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.317557 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-config\") pod \"0514086c-91a5-4b80-8920-b2b78d4faba3\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.317639 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0514086c-91a5-4b80-8920-b2b78d4faba3-serving-cert\") pod \"0514086c-91a5-4b80-8920-b2b78d4faba3\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.317665 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-client-ca\") pod \"0514086c-91a5-4b80-8920-b2b78d4faba3\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.317726 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vtmxc\" (UniqueName: \"kubernetes.io/projected/0514086c-91a5-4b80-8920-b2b78d4faba3-kube-api-access-vtmxc\") pod \"0514086c-91a5-4b80-8920-b2b78d4faba3\" (UID: \"0514086c-91a5-4b80-8920-b2b78d4faba3\") " Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318388 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-client-ca" (OuterVolumeSpecName: "client-ca") pod "0514086c-91a5-4b80-8920-b2b78d4faba3" (UID: 
"0514086c-91a5-4b80-8920-b2b78d4faba3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318397 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-config" (OuterVolumeSpecName: "config") pod "0514086c-91a5-4b80-8920-b2b78d4faba3" (UID: "0514086c-91a5-4b80-8920-b2b78d4faba3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318654 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-client-ca\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318718 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-serving-cert\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318767 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptfss\" (UniqueName: \"kubernetes.io/projected/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-kube-api-access-ptfss\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318819 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-config\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318857 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.318867 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0514086c-91a5-4b80-8920-b2b78d4faba3-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.323572 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0514086c-91a5-4b80-8920-b2b78d4faba3-kube-api-access-vtmxc" (OuterVolumeSpecName: "kube-api-access-vtmxc") pod "0514086c-91a5-4b80-8920-b2b78d4faba3" (UID: "0514086c-91a5-4b80-8920-b2b78d4faba3"). InnerVolumeSpecName "kube-api-access-vtmxc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.323799 4885 scope.go:117] "RemoveContainer" containerID="d780b5eabbfcbe56d150e28f4acfba07ce728041d152bf3d5c0c48b9dc32a456" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.323971 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0514086c-91a5-4b80-8920-b2b78d4faba3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0514086c-91a5-4b80-8920-b2b78d4faba3" (UID: "0514086c-91a5-4b80-8920-b2b78d4faba3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.396703 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.399013 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.406362 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.421292 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87mqf\" (UniqueName: \"kubernetes.io/projected/97017523-4956-4bde-84a3-5859a2edf389-kube-api-access-87mqf\") pod \"97017523-4956-4bde-84a3-5859a2edf389\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.421399 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-proxy-ca-bundles\") pod \"97017523-4956-4bde-84a3-5859a2edf389\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.421829 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97017523-4956-4bde-84a3-5859a2edf389-serving-cert\") pod \"97017523-4956-4bde-84a3-5859a2edf389\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.421865 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-client-ca\") pod \"97017523-4956-4bde-84a3-5859a2edf389\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.421905 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "97017523-4956-4bde-84a3-5859a2edf389" (UID: "97017523-4956-4bde-84a3-5859a2edf389"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.421911 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-config\") pod \"97017523-4956-4bde-84a3-5859a2edf389\" (UID: \"97017523-4956-4bde-84a3-5859a2edf389\") " Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.422399 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-config" (OuterVolumeSpecName: "config") pod "97017523-4956-4bde-84a3-5859a2edf389" (UID: "97017523-4956-4bde-84a3-5859a2edf389"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423195 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-client-ca" (OuterVolumeSpecName: "client-ca") pod "97017523-4956-4bde-84a3-5859a2edf389" (UID: "97017523-4956-4bde-84a3-5859a2edf389"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423368 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-serving-cert\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423494 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptfss\" (UniqueName: \"kubernetes.io/projected/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-kube-api-access-ptfss\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423639 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-config\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423776 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-client-ca\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423823 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423839 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423852 
4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0514086c-91a5-4b80-8920-b2b78d4faba3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423865 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/97017523-4956-4bde-84a3-5859a2edf389-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.423878 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vtmxc\" (UniqueName: \"kubernetes.io/projected/0514086c-91a5-4b80-8920-b2b78d4faba3-kube-api-access-vtmxc\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.425430 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-client-ca\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.426106 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97017523-4956-4bde-84a3-5859a2edf389-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "97017523-4956-4bde-84a3-5859a2edf389" (UID: "97017523-4956-4bde-84a3-5859a2edf389"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.426151 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-config\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.427452 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97017523-4956-4bde-84a3-5859a2edf389-kube-api-access-87mqf" (OuterVolumeSpecName: "kube-api-access-87mqf") pod "97017523-4956-4bde-84a3-5859a2edf389" (UID: "97017523-4956-4bde-84a3-5859a2edf389"). InnerVolumeSpecName "kube-api-access-87mqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.428382 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-serving-cert\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.446907 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptfss\" (UniqueName: \"kubernetes.io/projected/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-kube-api-access-ptfss\") pod \"route-controller-manager-84c688968f-n9xst\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.452140 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.452739 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.482158 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.524910 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87mqf\" (UniqueName: \"kubernetes.io/projected/97017523-4956-4bde-84a3-5859a2edf389-kube-api-access-87mqf\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.524967 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/97017523-4956-4bde-84a3-5859a2edf389-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.651244 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25"] Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.654228 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25"] Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.656635 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d6687ccbf-8tvf8"] Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.658869 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-d6687ccbf-8tvf8"] Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.691589 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"] Mar 08 19:36:45 crc kubenswrapper[4885]: W0308 19:36:45.692666 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bb8d3c0_c05c_4557_b84d_94c2b20add8e.slice/crio-8af151f617b1a4bcfdce5bb8f6b08876de153fa4a744d447aa3b6026b9364569 WatchSource:0}: Error finding container 8af151f617b1a4bcfdce5bb8f6b08876de153fa4a744d447aa3b6026b9364569: Status 404 returned error can't find the container with id 
8af151f617b1a4bcfdce5bb8f6b08876de153fa4a744d447aa3b6026b9364569 Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.924040 4885 patch_prober.go:28] interesting pod/route-controller-manager-6c8756cb88-5rd25 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 08 19:36:45 crc kubenswrapper[4885]: I0308 19:36:45.924105 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6c8756cb88-5rd25" podUID="0514086c-91a5-4b80-8920-b2b78d4faba3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.62:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 08 19:36:46 crc kubenswrapper[4885]: I0308 19:36:46.321663 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" event={"ID":"7bb8d3c0-c05c-4557-b84d-94c2b20add8e","Type":"ContainerStarted","Data":"8af151f617b1a4bcfdce5bb8f6b08876de153fa4a744d447aa3b6026b9364569"} Mar 08 19:36:46 crc kubenswrapper[4885]: E0308 19:36:46.371501 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-59wjr" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" Mar 08 19:36:46 crc kubenswrapper[4885]: I0308 19:36:46.379046 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.333295 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" event={"ID":"7bb8d3c0-c05c-4557-b84d-94c2b20add8e","Type":"ContainerStarted","Data":"e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef"} Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.357941 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" podStartSLOduration=5.357895835 podStartE2EDuration="5.357895835s" podCreationTimestamp="2026-03-08 19:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:36:47.357197107 +0000 UTC m=+308.753251130" watchObservedRunningTime="2026-03-08 19:36:47.357895835 +0000 UTC m=+308.753949868" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.380275 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0514086c-91a5-4b80-8920-b2b78d4faba3" path="/var/lib/kubelet/pods/0514086c-91a5-4b80-8920-b2b78d4faba3/volumes" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.381411 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97017523-4956-4bde-84a3-5859a2edf389" path="/var/lib/kubelet/pods/97017523-4956-4bde-84a3-5859a2edf389/volumes" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.554915 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"] Mar 08 
19:36:47 crc kubenswrapper[4885]: E0308 19:36:47.555269 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97017523-4956-4bde-84a3-5859a2edf389" containerName="controller-manager" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.555292 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="97017523-4956-4bde-84a3-5859a2edf389" containerName="controller-manager" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.555474 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="97017523-4956-4bde-84a3-5859a2edf389" containerName="controller-manager" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.556077 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.566229 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.566621 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.574246 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.574345 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.574482 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.574668 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.584691 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.585557 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"] Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.625434 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t9fn7"] Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.650795 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-config\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.650836 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-client-ca\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.650881 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/20ebec2c-a495-4d93-b35f-ddd022a21564-serving-cert\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.651023 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwkdh\" (UniqueName: \"kubernetes.io/projected/20ebec2c-a495-4d93-b35f-ddd022a21564-kube-api-access-zwkdh\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.651163 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-proxy-ca-bundles\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.753124 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-config\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.753245 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-client-ca\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.753987 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20ebec2c-a495-4d93-b35f-ddd022a21564-serving-cert\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.754050 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwkdh\" (UniqueName: \"kubernetes.io/projected/20ebec2c-a495-4d93-b35f-ddd022a21564-kube-api-access-zwkdh\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.754134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-proxy-ca-bundles\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.755266 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-client-ca\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: 
\"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.755907 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-proxy-ca-bundles\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.758380 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-config\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.763616 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20ebec2c-a495-4d93-b35f-ddd022a21564-serving-cert\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.790053 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwkdh\" (UniqueName: \"kubernetes.io/projected/20ebec2c-a495-4d93-b35f-ddd022a21564-kube-api-access-zwkdh\") pod \"controller-manager-686c6dfdcf-q5d5j\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.843658 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.844025 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.886348 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:47 crc kubenswrapper[4885]: I0308 19:36:47.916176 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:36:48 crc kubenswrapper[4885]: I0308 19:36:48.348423 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t9fn7" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="registry-server" containerID="cri-o://27737080ffcd1af304d093dd7fb9b8c98d2e1d3a468d0c510cc2a7d938b9d1b7" gracePeriod=2 Mar 08 19:36:48 crc kubenswrapper[4885]: I0308 19:36:48.349831 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:48 crc kubenswrapper[4885]: I0308 19:36:48.356996 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"] Mar 08 19:36:48 crc kubenswrapper[4885]: I0308 19:36:48.361630 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:36:48 crc kubenswrapper[4885]: I0308 19:36:48.433276 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:36:49 crc kubenswrapper[4885]: I0308 19:36:49.355715 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" event={"ID":"20ebec2c-a495-4d93-b35f-ddd022a21564","Type":"ContainerStarted","Data":"84690645be4eef59546f1222f77afbe3b95d771a6460611313ea0fa767e54a31"} Mar 08 19:36:49 crc kubenswrapper[4885]: I0308 19:36:49.358847 4885 generic.go:334] "Generic (PLEG): container finished" podID="038004f7-92de-42b0-8951-447dfdaf2f83" containerID="27737080ffcd1af304d093dd7fb9b8c98d2e1d3a468d0c510cc2a7d938b9d1b7" exitCode=0 Mar 08 19:36:49 crc kubenswrapper[4885]: I0308 19:36:49.358939 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9fn7" event={"ID":"038004f7-92de-42b0-8951-447dfdaf2f83","Type":"ContainerDied","Data":"27737080ffcd1af304d093dd7fb9b8c98d2e1d3a468d0c510cc2a7d938b9d1b7"} Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.196834 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.222561 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-catalog-content\") pod \"038004f7-92de-42b0-8951-447dfdaf2f83\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.222674 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-utilities\") pod \"038004f7-92de-42b0-8951-447dfdaf2f83\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.222706 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhfws\" (UniqueName: \"kubernetes.io/projected/038004f7-92de-42b0-8951-447dfdaf2f83-kube-api-access-zhfws\") pod \"038004f7-92de-42b0-8951-447dfdaf2f83\" (UID: \"038004f7-92de-42b0-8951-447dfdaf2f83\") " Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.226764 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-utilities" (OuterVolumeSpecName: "utilities") pod "038004f7-92de-42b0-8951-447dfdaf2f83" (UID: "038004f7-92de-42b0-8951-447dfdaf2f83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.234666 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/038004f7-92de-42b0-8951-447dfdaf2f83-kube-api-access-zhfws" (OuterVolumeSpecName: "kube-api-access-zhfws") pod "038004f7-92de-42b0-8951-447dfdaf2f83" (UID: "038004f7-92de-42b0-8951-447dfdaf2f83"). InnerVolumeSpecName "kube-api-access-zhfws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.324765 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.324812 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhfws\" (UniqueName: \"kubernetes.io/projected/038004f7-92de-42b0-8951-447dfdaf2f83-kube-api-access-zhfws\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.328352 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "038004f7-92de-42b0-8951-447dfdaf2f83" (UID: "038004f7-92de-42b0-8951-447dfdaf2f83"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.373407 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t9fn7" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.377503 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.377529 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t9fn7" event={"ID":"038004f7-92de-42b0-8951-447dfdaf2f83","Type":"ContainerDied","Data":"75e0661a1b23d764b6ba36fa438a7e6bc5398d8d64dd93e640a640cee2d85a8d"} Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.377546 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" event={"ID":"20ebec2c-a495-4d93-b35f-ddd022a21564","Type":"ContainerStarted","Data":"c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1"} Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.377562 4885 scope.go:117] "RemoveContainer" containerID="27737080ffcd1af304d093dd7fb9b8c98d2e1d3a468d0c510cc2a7d938b9d1b7" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.411378 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" podStartSLOduration=9.411364815 podStartE2EDuration="9.411364815s" podCreationTimestamp="2026-03-08 19:36:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:36:51.410335538 +0000 UTC m=+312.806389551" watchObservedRunningTime="2026-03-08 19:36:51.411364815 +0000 UTC m=+312.807418838" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.422476 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t9fn7"] Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.425300 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/038004f7-92de-42b0-8951-447dfdaf2f83-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.427133 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.429008 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t9fn7"] Mar 08 19:36:51 crc kubenswrapper[4885]: I0308 19:36:51.844745 4885 scope.go:117] "RemoveContainer" containerID="119e5e86f78bbb6aaaede8af002474c8df38720baf17de30416c30bd76579ad1" Mar 08 19:36:52 crc kubenswrapper[4885]: I0308 19:36:52.109031 4885 scope.go:117] "RemoveContainer" containerID="3a239ba546d78b237c7c4654423d382cd43547556be78002c1edfa9e41f28b19" Mar 08 19:36:53 crc kubenswrapper[4885]: I0308 19:36:53.376637 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" path="/var/lib/kubelet/pods/038004f7-92de-42b0-8951-447dfdaf2f83/volumes" Mar 08 19:36:53 crc kubenswrapper[4885]: I0308 19:36:53.394540 4885 generic.go:334] "Generic (PLEG): container finished" podID="8881ba5e-d9d1-42a9-98af-849e72053757" containerID="2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7" exitCode=0 Mar 08 19:36:53 crc kubenswrapper[4885]: I0308 19:36:53.394655 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-pqxt7" event={"ID":"8881ba5e-d9d1-42a9-98af-849e72053757","Type":"ContainerDied","Data":"2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7"} Mar 08 19:36:53 crc kubenswrapper[4885]: I0308 19:36:53.397455 4885 generic.go:334] "Generic (PLEG): container finished" podID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerID="189818c391ca54e42f66a3022db5b2d8456e8ff7e65867b70d5d4849a35c1bc1" exitCode=0 Mar 08 19:36:53 crc kubenswrapper[4885]: I0308 19:36:53.397550 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cptvd" event={"ID":"b30ce2c5-2b53-47aa-8470-394dd0d6256a","Type":"ContainerDied","Data":"189818c391ca54e42f66a3022db5b2d8456e8ff7e65867b70d5d4849a35c1bc1"} Mar 08 19:36:53 crc kubenswrapper[4885]: I0308 19:36:53.401276 4885 generic.go:334] "Generic (PLEG): container finished" podID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerID="67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de" exitCode=0 Mar 08 19:36:53 crc kubenswrapper[4885]: I0308 19:36:53.401470 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62xgk" event={"ID":"05666e0b-c4ce-451a-ba67-ddb78866ef54","Type":"ContainerDied","Data":"67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de"} Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.411692 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cptvd" event={"ID":"b30ce2c5-2b53-47aa-8470-394dd0d6256a","Type":"ContainerStarted","Data":"47c92a0374e476a5eb0fbf68fbb5d0f2fff3e28d76c4d73cdd5e57511a785c76"} Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.415082 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62xgk" event={"ID":"05666e0b-c4ce-451a-ba67-ddb78866ef54","Type":"ContainerStarted","Data":"2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876"} Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.418198 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqxt7" event={"ID":"8881ba5e-d9d1-42a9-98af-849e72053757","Type":"ContainerStarted","Data":"ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c"} Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.441581 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cptvd" podStartSLOduration=20.574997071 podStartE2EDuration="1m28.441553915s" podCreationTimestamp="2026-03-08 19:35:26 +0000 UTC" firstStartedPulling="2026-03-08 19:35:45.945204319 +0000 UTC m=+247.341258352" lastFinishedPulling="2026-03-08 19:36:53.811761163 +0000 UTC m=+315.207815196" observedRunningTime="2026-03-08 19:36:54.438569864 +0000 UTC m=+315.834623897" watchObservedRunningTime="2026-03-08 19:36:54.441553915 +0000 UTC m=+315.837607938" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.459081 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" podUID="d008be41-8eac-496a-9c3d-083014dc402c" containerName="oauth-openshift" containerID="cri-o://4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72" gracePeriod=15 Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.490548 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-62xgk" 
podStartSLOduration=20.652805907 podStartE2EDuration="1m28.490525828s" podCreationTimestamp="2026-03-08 19:35:26 +0000 UTC" firstStartedPulling="2026-03-08 19:35:45.94751796 +0000 UTC m=+247.343571983" lastFinishedPulling="2026-03-08 19:36:53.785237841 +0000 UTC m=+315.181291904" observedRunningTime="2026-03-08 19:36:54.486367697 +0000 UTC m=+315.882421720" watchObservedRunningTime="2026-03-08 19:36:54.490525828 +0000 UTC m=+315.886579851" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.490665 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pqxt7" podStartSLOduration=19.610406121 podStartE2EDuration="1m27.490659682s" podCreationTimestamp="2026-03-08 19:35:27 +0000 UTC" firstStartedPulling="2026-03-08 19:35:45.941191531 +0000 UTC m=+247.337245584" lastFinishedPulling="2026-03-08 19:36:53.821445112 +0000 UTC m=+315.217499145" observedRunningTime="2026-03-08 19:36:54.462831906 +0000 UTC m=+315.858885939" watchObservedRunningTime="2026-03-08 19:36:54.490659682 +0000 UTC m=+315.886713715" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.836045 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888437 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-ocp-branding-template\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888523 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-cliconfig\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888551 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-router-certs\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888576 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-service-ca\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888597 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d008be41-8eac-496a-9c3d-083014dc402c-audit-dir\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888650 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-session\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") 
" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888675 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-trusted-ca-bundle\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888701 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjdvb\" (UniqueName: \"kubernetes.io/projected/d008be41-8eac-496a-9c3d-083014dc402c-kube-api-access-sjdvb\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888721 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-serving-cert\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888746 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-idp-0-file-data\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888781 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-provider-selection\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888805 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-error\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888849 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-login\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.888881 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-audit-policies\") pod \"d008be41-8eac-496a-9c3d-083014dc402c\" (UID: \"d008be41-8eac-496a-9c3d-083014dc402c\") " Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.890000 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.890093 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.892441 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.892518 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d008be41-8eac-496a-9c3d-083014dc402c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.893164 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.902584 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.903473 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.904070 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.904327 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.905275 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.905508 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.906831 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.908762 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.916043 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d008be41-8eac-496a-9c3d-083014dc402c-kube-api-access-sjdvb" (OuterVolumeSpecName: "kube-api-access-sjdvb") pod "d008be41-8eac-496a-9c3d-083014dc402c" (UID: "d008be41-8eac-496a-9c3d-083014dc402c"). InnerVolumeSpecName "kube-api-access-sjdvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.990753 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.991070 4885 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.991507 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.991608 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.991707 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.991807 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.991936 4885 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d008be41-8eac-496a-9c3d-083014dc402c-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.992039 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.992126 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.992221 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.992306 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjdvb\" (UniqueName: \"kubernetes.io/projected/d008be41-8eac-496a-9c3d-083014dc402c-kube-api-access-sjdvb\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.992394 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-idp-0-file-data\") on node 
\"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.992487 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:54 crc kubenswrapper[4885]: I0308 19:36:54.992588 4885 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d008be41-8eac-496a-9c3d-083014dc402c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.423682 4885 generic.go:334] "Generic (PLEG): container finished" podID="d008be41-8eac-496a-9c3d-083014dc402c" containerID="4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72" exitCode=0 Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.423720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" event={"ID":"d008be41-8eac-496a-9c3d-083014dc402c","Type":"ContainerDied","Data":"4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72"} Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.423743 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" event={"ID":"d008be41-8eac-496a-9c3d-083014dc402c","Type":"ContainerDied","Data":"f8be22b6ff0acffcb9ccf46c1c57e368d284120713d460abb8b81012323f6dcf"} Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.423759 4885 scope.go:117] "RemoveContainer" containerID="4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72" Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.423857 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-2bp7t" Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.440413 4885 scope.go:117] "RemoveContainer" containerID="4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72" Mar 08 19:36:55 crc kubenswrapper[4885]: E0308 19:36:55.441374 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72\": container with ID starting with 4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72 not found: ID does not exist" containerID="4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72" Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.441406 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72"} err="failed to get container status \"4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72\": rpc error: code = NotFound desc = could not find container \"4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72\": container with ID starting with 4db0c40694663e900f383cdc5f984a4922ee88d5efb25632325705940d064d72 not found: ID does not exist" Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.458152 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2bp7t"] Mar 08 19:36:55 crc kubenswrapper[4885]: I0308 19:36:55.468884 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-2bp7t"] Mar 08 19:36:56 crc kubenswrapper[4885]: I0308 19:36:56.822548 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:36:56 crc kubenswrapper[4885]: I0308 19:36:56.822607 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:36:56 crc kubenswrapper[4885]: I0308 19:36:56.877455 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:36:57 crc kubenswrapper[4885]: I0308 19:36:57.223357 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:36:57 crc kubenswrapper[4885]: I0308 19:36:57.223706 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:36:57 crc kubenswrapper[4885]: I0308 19:36:57.275446 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:36:57 crc kubenswrapper[4885]: I0308 19:36:57.376205 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d008be41-8eac-496a-9c3d-083014dc402c" path="/var/lib/kubelet/pods/d008be41-8eac-496a-9c3d-083014dc402c/volumes" Mar 08 19:36:58 crc kubenswrapper[4885]: I0308 19:36:58.224419 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:36:58 crc kubenswrapper[4885]: I0308 19:36:58.224474 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:36:58 crc kubenswrapper[4885]: I0308 19:36:58.447639 4885 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-59wjr" event={"ID":"7346fb7f-6125-49c7-a422-cc169bc7e045","Type":"ContainerStarted","Data":"0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58"} Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.294685 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pqxt7" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="registry-server" probeResult="failure" output=< Mar 08 19:36:59 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 19:36:59 crc kubenswrapper[4885]: > Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.457394 4885 generic.go:334] "Generic (PLEG): container finished" podID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerID="0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58" exitCode=0 Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.457456 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59wjr" event={"ID":"7346fb7f-6125-49c7-a422-cc169bc7e045","Type":"ContainerDied","Data":"0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58"} Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.568798 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"] Mar 08 19:36:59 crc kubenswrapper[4885]: E0308 19:36:59.570086 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="registry-server" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.570178 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="registry-server" Mar 08 19:36:59 crc kubenswrapper[4885]: E0308 19:36:59.570265 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="extract-utilities" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.570285 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="extract-utilities" Mar 08 19:36:59 crc kubenswrapper[4885]: E0308 19:36:59.570357 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="extract-content" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.570374 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="extract-content" Mar 08 19:36:59 crc kubenswrapper[4885]: E0308 19:36:59.570442 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d008be41-8eac-496a-9c3d-083014dc402c" containerName="oauth-openshift" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.570456 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d008be41-8eac-496a-9c3d-083014dc402c" containerName="oauth-openshift" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.571217 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d008be41-8eac-496a-9c3d-083014dc402c" containerName="oauth-openshift" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.571262 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="038004f7-92de-42b0-8951-447dfdaf2f83" containerName="registry-server" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.572192 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.582853 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.583169 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.583432 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.583609 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.583905 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.584419 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.585839 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.586188 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.586803 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.587352 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.587399 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.587865 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.596305 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"] Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.604251 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.609253 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.617988 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.652529 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " 
pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.652624 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.652737 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.652786 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0a3ef361-6968-4088-a2f1-ca52eb3715b6-audit-dir\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.652824 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.652857 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653004 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-error\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653143 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653198 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-session\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653337 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qddj\" (UniqueName: \"kubernetes.io/projected/0a3ef361-6968-4088-a2f1-ca52eb3715b6-kube-api-access-5qddj\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653399 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653433 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-login\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653516 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-audit-policies\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.653655 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.754557 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.755884 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0a3ef361-6968-4088-a2f1-ca52eb3715b6-audit-dir\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.755988 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" 
(UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756049 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756100 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-error\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756242 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756311 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-session\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756419 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qddj\" (UniqueName: \"kubernetes.io/projected/0a3ef361-6968-4088-a2f1-ca52eb3715b6-kube-api-access-5qddj\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756482 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756531 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-login\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756582 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-audit-policies\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756634 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756686 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.756773 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.758068 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.760153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-audit-policies\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.760879 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.761742 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0a3ef361-6968-4088-a2f1-ca52eb3715b6-audit-dir\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.762251 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-trusted-ca-bundle\") pod 
\"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.766327 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.767830 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.767880 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-error\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.768303 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.768851 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-session\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.769098 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-template-login\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.770254 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.773412 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0a3ef361-6968-4088-a2f1-ca52eb3715b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-69ddfccf99-7ddh9\" 
(UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.783238 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qddj\" (UniqueName: \"kubernetes.io/projected/0a3ef361-6968-4088-a2f1-ca52eb3715b6-kube-api-access-5qddj\") pod \"oauth-openshift-69ddfccf99-7ddh9\" (UID: \"0a3ef361-6968-4088-a2f1-ca52eb3715b6\") " pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:36:59 crc kubenswrapper[4885]: I0308 19:36:59.928428 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:37:00 crc kubenswrapper[4885]: I0308 19:37:00.377164 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-69ddfccf99-7ddh9"] Mar 08 19:37:00 crc kubenswrapper[4885]: I0308 19:37:00.465915 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" event={"ID":"0a3ef361-6968-4088-a2f1-ca52eb3715b6","Type":"ContainerStarted","Data":"c963215a061577702d5c9e97cdfac6b159d77b238aa349674f278b04c35add44"} Mar 08 19:37:00 crc kubenswrapper[4885]: I0308 19:37:00.468842 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59wjr" event={"ID":"7346fb7f-6125-49c7-a422-cc169bc7e045","Type":"ContainerStarted","Data":"51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2"} Mar 08 19:37:01 crc kubenswrapper[4885]: I0308 19:37:01.479271 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" event={"ID":"0a3ef361-6968-4088-a2f1-ca52eb3715b6","Type":"ContainerStarted","Data":"b84642920228051864b70e57a044678fd7a3795095fb55824c922cf77c0242b1"} Mar 08 19:37:01 crc kubenswrapper[4885]: I0308 19:37:01.479662 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:37:01 crc kubenswrapper[4885]: I0308 19:37:01.489113 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" Mar 08 19:37:01 crc kubenswrapper[4885]: I0308 19:37:01.513275 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-59wjr" podStartSLOduration=3.795651429 podStartE2EDuration="1m37.513251593s" podCreationTimestamp="2026-03-08 19:35:24 +0000 UTC" firstStartedPulling="2026-03-08 19:35:26.264809003 +0000 UTC m=+227.660863026" lastFinishedPulling="2026-03-08 19:36:59.982409127 +0000 UTC m=+321.378463190" observedRunningTime="2026-03-08 19:37:00.48838211 +0000 UTC m=+321.884436173" watchObservedRunningTime="2026-03-08 19:37:01.513251593 +0000 UTC m=+322.909305656" Mar 08 19:37:01 crc kubenswrapper[4885]: I0308 19:37:01.516314 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-69ddfccf99-7ddh9" podStartSLOduration=32.516300135 podStartE2EDuration="32.516300135s" podCreationTimestamp="2026-03-08 19:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:37:01.510450698 +0000 UTC m=+322.906504751" watchObservedRunningTime="2026-03-08 19:37:01.516300135 +0000 UTC m=+322.912354198" Mar 08 19:37:02 crc 
kubenswrapper[4885]: I0308 19:37:02.573571 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"] Mar 08 19:37:02 crc kubenswrapper[4885]: I0308 19:37:02.573903 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" podUID="20ebec2c-a495-4d93-b35f-ddd022a21564" containerName="controller-manager" containerID="cri-o://c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1" gracePeriod=30 Mar 08 19:37:02 crc kubenswrapper[4885]: I0308 19:37:02.592228 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"] Mar 08 19:37:02 crc kubenswrapper[4885]: I0308 19:37:02.592477 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" podUID="7bb8d3c0-c05c-4557-b84d-94c2b20add8e" containerName="route-controller-manager" containerID="cri-o://e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef" gracePeriod=30 Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.166057 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.168506 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.306140 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-serving-cert\") pod \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.306190 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-client-ca\") pod \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.306214 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwkdh\" (UniqueName: \"kubernetes.io/projected/20ebec2c-a495-4d93-b35f-ddd022a21564-kube-api-access-zwkdh\") pod \"20ebec2c-a495-4d93-b35f-ddd022a21564\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.306363 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-client-ca\") pod \"20ebec2c-a495-4d93-b35f-ddd022a21564\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.306393 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-config\") pod \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.307147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-client-ca" (OuterVolumeSpecName: "client-ca") pod "20ebec2c-a495-4d93-b35f-ddd022a21564" (UID: "20ebec2c-a495-4d93-b35f-ddd022a21564"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.307200 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-config" (OuterVolumeSpecName: "config") pod "7bb8d3c0-c05c-4557-b84d-94c2b20add8e" (UID: "7bb8d3c0-c05c-4557-b84d-94c2b20add8e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.307246 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-config\") pod \"20ebec2c-a495-4d93-b35f-ddd022a21564\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.307470 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-client-ca" (OuterVolumeSpecName: "client-ca") pod "7bb8d3c0-c05c-4557-b84d-94c2b20add8e" (UID: "7bb8d3c0-c05c-4557-b84d-94c2b20add8e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.307542 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20ebec2c-a495-4d93-b35f-ddd022a21564-serving-cert\") pod \"20ebec2c-a495-4d93-b35f-ddd022a21564\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.307989 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptfss\" (UniqueName: \"kubernetes.io/projected/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-kube-api-access-ptfss\") pod \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\" (UID: \"7bb8d3c0-c05c-4557-b84d-94c2b20add8e\") " Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.308012 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-config" (OuterVolumeSpecName: "config") pod "20ebec2c-a495-4d93-b35f-ddd022a21564" (UID: "20ebec2c-a495-4d93-b35f-ddd022a21564"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.308038 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-proxy-ca-bundles\") pod \"20ebec2c-a495-4d93-b35f-ddd022a21564\" (UID: \"20ebec2c-a495-4d93-b35f-ddd022a21564\") " Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.308505 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.308544 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.308562 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.308577 4885 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.308632 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "20ebec2c-a495-4d93-b35f-ddd022a21564" (UID: "20ebec2c-a495-4d93-b35f-ddd022a21564"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.310755 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ebec2c-a495-4d93-b35f-ddd022a21564-kube-api-access-zwkdh" (OuterVolumeSpecName: "kube-api-access-zwkdh") pod "20ebec2c-a495-4d93-b35f-ddd022a21564" (UID: "20ebec2c-a495-4d93-b35f-ddd022a21564"). InnerVolumeSpecName "kube-api-access-zwkdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.310946 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7bb8d3c0-c05c-4557-b84d-94c2b20add8e" (UID: "7bb8d3c0-c05c-4557-b84d-94c2b20add8e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.311249 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-kube-api-access-ptfss" (OuterVolumeSpecName: "kube-api-access-ptfss") pod "7bb8d3c0-c05c-4557-b84d-94c2b20add8e" (UID: "7bb8d3c0-c05c-4557-b84d-94c2b20add8e"). InnerVolumeSpecName "kube-api-access-ptfss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.311293 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20ebec2c-a495-4d93-b35f-ddd022a21564-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "20ebec2c-a495-4d93-b35f-ddd022a21564" (UID: "20ebec2c-a495-4d93-b35f-ddd022a21564"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.409982 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20ebec2c-a495-4d93-b35f-ddd022a21564-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.410010 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptfss\" (UniqueName: \"kubernetes.io/projected/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-kube-api-access-ptfss\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.410020 4885 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/20ebec2c-a495-4d93-b35f-ddd022a21564-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.410030 4885 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bb8d3c0-c05c-4557-b84d-94c2b20add8e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.410041 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwkdh\" (UniqueName: \"kubernetes.io/projected/20ebec2c-a495-4d93-b35f-ddd022a21564-kube-api-access-zwkdh\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.494835 4885 generic.go:334] "Generic (PLEG): container finished" podID="7bb8d3c0-c05c-4557-b84d-94c2b20add8e" containerID="e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef" exitCode=0 Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.494899 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" event={"ID":"7bb8d3c0-c05c-4557-b84d-94c2b20add8e","Type":"ContainerDied","Data":"e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef"} Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.494949 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" event={"ID":"7bb8d3c0-c05c-4557-b84d-94c2b20add8e","Type":"ContainerDied","Data":"8af151f617b1a4bcfdce5bb8f6b08876de153fa4a744d447aa3b6026b9364569"} Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.494971 4885 scope.go:117] "RemoveContainer" containerID="e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.494968 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.499110 4885 generic.go:334] "Generic (PLEG): container finished" podID="20ebec2c-a495-4d93-b35f-ddd022a21564" containerID="c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1" exitCode=0 Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.499222 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.499284 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" event={"ID":"20ebec2c-a495-4d93-b35f-ddd022a21564","Type":"ContainerDied","Data":"c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1"} Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.499477 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j" event={"ID":"20ebec2c-a495-4d93-b35f-ddd022a21564","Type":"ContainerDied","Data":"84690645be4eef59546f1222f77afbe3b95d771a6460611313ea0fa767e54a31"} Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.533868 4885 scope.go:117] "RemoveContainer" containerID="e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef" Mar 08 19:37:03 crc kubenswrapper[4885]: E0308 19:37:03.535062 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef\": container with ID starting with e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef not found: ID does not exist" containerID="e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.535112 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef"} err="failed to get container status \"e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef\": rpc error: code = NotFound desc = could not find container \"e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef\": container with ID starting with e1dbb25f44f6649e559cae47dae9dc75eb8da1014db8020b26a73ff9b1ed62ef not found: ID does not exist" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.535145 4885 scope.go:117] "RemoveContainer" containerID="c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.538272 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"] Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.548388 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-84c688968f-n9xst"] Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.553455 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"] Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.557842 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-686c6dfdcf-q5d5j"] Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.558516 4885 scope.go:117] 
"RemoveContainer" containerID="c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1" Mar 08 19:37:03 crc kubenswrapper[4885]: E0308 19:37:03.559161 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1\": container with ID starting with c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1 not found: ID does not exist" containerID="c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1" Mar 08 19:37:03 crc kubenswrapper[4885]: I0308 19:37:03.559208 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1"} err="failed to get container status \"c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1\": rpc error: code = NotFound desc = could not find container \"c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1\": container with ID starting with c8bc79e56b5d29d7e362e08d43a2aa45b931302eb10ecf343d965981b2b972f1 not found: ID does not exist" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.573984 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56b9b7b549-bcjcp"] Mar 08 19:37:04 crc kubenswrapper[4885]: E0308 19:37:04.574366 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ebec2c-a495-4d93-b35f-ddd022a21564" containerName="controller-manager" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.574393 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ebec2c-a495-4d93-b35f-ddd022a21564" containerName="controller-manager" Mar 08 19:37:04 crc kubenswrapper[4885]: E0308 19:37:04.574439 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bb8d3c0-c05c-4557-b84d-94c2b20add8e" containerName="route-controller-manager" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.574457 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb8d3c0-c05c-4557-b84d-94c2b20add8e" containerName="route-controller-manager" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.574668 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bb8d3c0-c05c-4557-b84d-94c2b20add8e" containerName="route-controller-manager" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.574707 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ebec2c-a495-4d93-b35f-ddd022a21564" containerName="controller-manager" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.575538 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.579030 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.579885 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.580455 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.582428 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.582533 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.582666 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.583180 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb"] Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.584276 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.591070 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.591325 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56b9b7b549-bcjcp"] Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.594156 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.595268 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.595603 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.595792 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.596200 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb"] Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.596301 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.597648 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.729735 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e160affd-520c-4dad-89d4-d2a511c8a545-serving-cert\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.729784 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e160affd-520c-4dad-89d4-d2a511c8a545-client-ca\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.729823 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-proxy-ca-bundles\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.729907 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-client-ca\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.729992 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-config\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.730040 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/563046aa-bf59-4d04-bc57-ab1250051468-serving-cert\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.730096 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e160affd-520c-4dad-89d4-d2a511c8a545-config\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.730136 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrx5q\" (UniqueName: \"kubernetes.io/projected/e160affd-520c-4dad-89d4-d2a511c8a545-kube-api-access-mrx5q\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.730163 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsvqv\" (UniqueName: 
\"kubernetes.io/projected/563046aa-bf59-4d04-bc57-ab1250051468-kube-api-access-gsvqv\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.831800 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-client-ca\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.831993 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-config\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.832045 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/563046aa-bf59-4d04-bc57-ab1250051468-serving-cert\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.832117 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e160affd-520c-4dad-89d4-d2a511c8a545-config\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.832188 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrx5q\" (UniqueName: \"kubernetes.io/projected/e160affd-520c-4dad-89d4-d2a511c8a545-kube-api-access-mrx5q\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.832242 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsvqv\" (UniqueName: \"kubernetes.io/projected/563046aa-bf59-4d04-bc57-ab1250051468-kube-api-access-gsvqv\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.832366 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e160affd-520c-4dad-89d4-d2a511c8a545-client-ca\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.832407 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e160affd-520c-4dad-89d4-d2a511c8a545-serving-cert\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: 
\"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.832471 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-proxy-ca-bundles\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.835115 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-client-ca\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.835648 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e160affd-520c-4dad-89d4-d2a511c8a545-client-ca\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.835756 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e160affd-520c-4dad-89d4-d2a511c8a545-config\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.836257 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-proxy-ca-bundles\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.836564 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/563046aa-bf59-4d04-bc57-ab1250051468-config\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.839405 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e160affd-520c-4dad-89d4-d2a511c8a545-serving-cert\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.841129 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/563046aa-bf59-4d04-bc57-ab1250051468-serving-cert\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.871235 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-gsvqv\" (UniqueName: \"kubernetes.io/projected/563046aa-bf59-4d04-bc57-ab1250051468-kube-api-access-gsvqv\") pod \"controller-manager-56b9b7b549-bcjcp\" (UID: \"563046aa-bf59-4d04-bc57-ab1250051468\") " pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.871308 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrx5q\" (UniqueName: \"kubernetes.io/projected/e160affd-520c-4dad-89d4-d2a511c8a545-kube-api-access-mrx5q\") pod \"route-controller-manager-f9c6f878d-9mvxb\" (UID: \"e160affd-520c-4dad-89d4-d2a511c8a545\") " pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.937756 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:04 crc kubenswrapper[4885]: I0308 19:37:04.944020 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.218141 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56b9b7b549-bcjcp"] Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.228804 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.229078 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:37:05 crc kubenswrapper[4885]: W0308 19:37:05.230437 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod563046aa_bf59_4d04_bc57_ab1250051468.slice/crio-ed38506332a797e1d338ce9cae3482924654d6406af0f29e4ccc6c80be0c346c WatchSource:0}: Error finding container ed38506332a797e1d338ce9cae3482924654d6406af0f29e4ccc6c80be0c346c: Status 404 returned error can't find the container with id ed38506332a797e1d338ce9cae3482924654d6406af0f29e4ccc6c80be0c346c Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.296197 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb"] Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.298467 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:37:05 crc kubenswrapper[4885]: W0308 19:37:05.302868 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode160affd_520c_4dad_89d4_d2a511c8a545.slice/crio-7e35ffef721c9715d35e5de766bc40f626e32792654484ee7e99e181b63cdcbf WatchSource:0}: Error finding container 7e35ffef721c9715d35e5de766bc40f626e32792654484ee7e99e181b63cdcbf: Status 404 returned error can't find the container with id 7e35ffef721c9715d35e5de766bc40f626e32792654484ee7e99e181b63cdcbf Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.383130 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20ebec2c-a495-4d93-b35f-ddd022a21564" path="/var/lib/kubelet/pods/20ebec2c-a495-4d93-b35f-ddd022a21564/volumes" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.384097 4885 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb8d3c0-c05c-4557-b84d-94c2b20add8e" path="/var/lib/kubelet/pods/7bb8d3c0-c05c-4557-b84d-94c2b20add8e/volumes" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.520384 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" event={"ID":"563046aa-bf59-4d04-bc57-ab1250051468","Type":"ContainerStarted","Data":"66650e65234d81333ec0a7ba90489540e0f4da97f51390243281f124944d530c"} Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.520438 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" event={"ID":"563046aa-bf59-4d04-bc57-ab1250051468","Type":"ContainerStarted","Data":"ed38506332a797e1d338ce9cae3482924654d6406af0f29e4ccc6c80be0c346c"} Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.520602 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.524499 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" event={"ID":"e160affd-520c-4dad-89d4-d2a511c8a545","Type":"ContainerStarted","Data":"1f309038fd62bb3ff6b0bf5d6e8030902e0042a22ceb1bbd2bf6e7f8c5695924"} Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.524569 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" event={"ID":"e160affd-520c-4dad-89d4-d2a511c8a545","Type":"ContainerStarted","Data":"7e35ffef721c9715d35e5de766bc40f626e32792654484ee7e99e181b63cdcbf"} Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.527155 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.542145 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56b9b7b549-bcjcp" podStartSLOduration=3.542125083 podStartE2EDuration="3.542125083s" podCreationTimestamp="2026-03-08 19:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:37:05.541335022 +0000 UTC m=+326.937389275" watchObservedRunningTime="2026-03-08 19:37:05.542125083 +0000 UTC m=+326.938179106" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.567724 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" podStartSLOduration=3.567702334 podStartE2EDuration="3.567702334s" podCreationTimestamp="2026-03-08 19:37:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:37:05.564501579 +0000 UTC m=+326.960555602" watchObservedRunningTime="2026-03-08 19:37:05.567702334 +0000 UTC m=+326.963756357" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.578519 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:37:05 crc kubenswrapper[4885]: I0308 19:37:05.807141 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-59wjr"] Mar 08 
19:37:06 crc kubenswrapper[4885]: I0308 19:37:06.531732 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:06 crc kubenswrapper[4885]: I0308 19:37:06.541083 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f9c6f878d-9mvxb" Mar 08 19:37:06 crc kubenswrapper[4885]: I0308 19:37:06.905697 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:37:07 crc kubenswrapper[4885]: I0308 19:37:07.300105 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:37:07 crc kubenswrapper[4885]: I0308 19:37:07.535965 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-59wjr" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="registry-server" containerID="cri-o://51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2" gracePeriod=2 Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.062137 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.180979 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdjrg\" (UniqueName: \"kubernetes.io/projected/7346fb7f-6125-49c7-a422-cc169bc7e045-kube-api-access-bdjrg\") pod \"7346fb7f-6125-49c7-a422-cc169bc7e045\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.181106 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-utilities\") pod \"7346fb7f-6125-49c7-a422-cc169bc7e045\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.181174 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-catalog-content\") pod \"7346fb7f-6125-49c7-a422-cc169bc7e045\" (UID: \"7346fb7f-6125-49c7-a422-cc169bc7e045\") " Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.182620 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-utilities" (OuterVolumeSpecName: "utilities") pod "7346fb7f-6125-49c7-a422-cc169bc7e045" (UID: "7346fb7f-6125-49c7-a422-cc169bc7e045"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.186834 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7346fb7f-6125-49c7-a422-cc169bc7e045-kube-api-access-bdjrg" (OuterVolumeSpecName: "kube-api-access-bdjrg") pod "7346fb7f-6125-49c7-a422-cc169bc7e045" (UID: "7346fb7f-6125-49c7-a422-cc169bc7e045"). InnerVolumeSpecName "kube-api-access-bdjrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.206703 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cptvd"] Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.206942 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cptvd" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="registry-server" containerID="cri-o://47c92a0374e476a5eb0fbf68fbb5d0f2fff3e28d76c4d73cdd5e57511a785c76" gracePeriod=2 Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.265344 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7346fb7f-6125-49c7-a422-cc169bc7e045" (UID: "7346fb7f-6125-49c7-a422-cc169bc7e045"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.284072 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.284222 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdjrg\" (UniqueName: \"kubernetes.io/projected/7346fb7f-6125-49c7-a422-cc169bc7e045-kube-api-access-bdjrg\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.284324 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7346fb7f-6125-49c7-a422-cc169bc7e045-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.288073 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.336237 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.548807 4885 generic.go:334] "Generic (PLEG): container finished" podID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerID="47c92a0374e476a5eb0fbf68fbb5d0f2fff3e28d76c4d73cdd5e57511a785c76" exitCode=0 Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.548970 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cptvd" event={"ID":"b30ce2c5-2b53-47aa-8470-394dd0d6256a","Type":"ContainerDied","Data":"47c92a0374e476a5eb0fbf68fbb5d0f2fff3e28d76c4d73cdd5e57511a785c76"} Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.560553 4885 generic.go:334] "Generic (PLEG): container finished" podID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerID="51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2" exitCode=0 Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.560635 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59wjr" event={"ID":"7346fb7f-6125-49c7-a422-cc169bc7e045","Type":"ContainerDied","Data":"51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2"} Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.560677 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-59wjr" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.560705 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-59wjr" event={"ID":"7346fb7f-6125-49c7-a422-cc169bc7e045","Type":"ContainerDied","Data":"e265209fea1ffc2ce0afb5176cc04ccaf2989324d8da3b88689366337825e2af"} Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.560758 4885 scope.go:117] "RemoveContainer" containerID="51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.584756 4885 scope.go:117] "RemoveContainer" containerID="0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.602709 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-59wjr"] Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.606088 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-59wjr"] Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.634596 4885 scope.go:117] "RemoveContainer" containerID="3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.652481 4885 scope.go:117] "RemoveContainer" containerID="51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2" Mar 08 19:37:08 crc kubenswrapper[4885]: E0308 19:37:08.652868 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2\": container with ID starting with 51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2 not found: ID does not exist" containerID="51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.652943 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2"} err="failed to get container status \"51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2\": rpc error: code = NotFound desc = could not find container \"51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2\": container with ID starting with 51a9f1e7642162280763e188c5d54e6e36a9e79547d5278a37dbf540eeb4a4d2 not found: ID does not exist" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.652981 4885 scope.go:117] "RemoveContainer" containerID="0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58" Mar 08 19:37:08 crc kubenswrapper[4885]: E0308 19:37:08.653347 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58\": container with ID starting with 0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58 not found: ID does not exist" containerID="0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.653382 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58"} err="failed to get container status \"0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58\": rpc error: code = NotFound desc = could not find 
container \"0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58\": container with ID starting with 0eaeeee8a8068489b11571a2f5d12d03a623b648daab1954b9e05364ee753d58 not found: ID does not exist" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.653414 4885 scope.go:117] "RemoveContainer" containerID="3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd" Mar 08 19:37:08 crc kubenswrapper[4885]: E0308 19:37:08.653643 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd\": container with ID starting with 3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd not found: ID does not exist" containerID="3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.653681 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd"} err="failed to get container status \"3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd\": rpc error: code = NotFound desc = could not find container \"3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd\": container with ID starting with 3b7cd15f40add02efd41f84020134d33e42823c3f1452f6c1d7b8335f526e0bd not found: ID does not exist" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.732071 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.894501 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn2mq\" (UniqueName: \"kubernetes.io/projected/b30ce2c5-2b53-47aa-8470-394dd0d6256a-kube-api-access-jn2mq\") pod \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.894586 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-utilities\") pod \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.894675 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-catalog-content\") pod \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\" (UID: \"b30ce2c5-2b53-47aa-8470-394dd0d6256a\") " Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.896068 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-utilities" (OuterVolumeSpecName: "utilities") pod "b30ce2c5-2b53-47aa-8470-394dd0d6256a" (UID: "b30ce2c5-2b53-47aa-8470-394dd0d6256a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.900617 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b30ce2c5-2b53-47aa-8470-394dd0d6256a-kube-api-access-jn2mq" (OuterVolumeSpecName: "kube-api-access-jn2mq") pod "b30ce2c5-2b53-47aa-8470-394dd0d6256a" (UID: "b30ce2c5-2b53-47aa-8470-394dd0d6256a"). InnerVolumeSpecName "kube-api-access-jn2mq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.917398 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b30ce2c5-2b53-47aa-8470-394dd0d6256a" (UID: "b30ce2c5-2b53-47aa-8470-394dd0d6256a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.997321 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.997373 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn2mq\" (UniqueName: \"kubernetes.io/projected/b30ce2c5-2b53-47aa-8470-394dd0d6256a-kube-api-access-jn2mq\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:08 crc kubenswrapper[4885]: I0308 19:37:08.997395 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b30ce2c5-2b53-47aa-8470-394dd0d6256a-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.381182 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" path="/var/lib/kubelet/pods/7346fb7f-6125-49c7-a422-cc169bc7e045/volumes" Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.570188 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cptvd" event={"ID":"b30ce2c5-2b53-47aa-8470-394dd0d6256a","Type":"ContainerDied","Data":"abebee8a42d402151f81519cf2493ac06e94550afb46b420c76c20ec60907798"} Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.570247 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cptvd" Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.570263 4885 scope.go:117] "RemoveContainer" containerID="47c92a0374e476a5eb0fbf68fbb5d0f2fff3e28d76c4d73cdd5e57511a785c76" Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.591146 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cptvd"] Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.592629 4885 scope.go:117] "RemoveContainer" containerID="189818c391ca54e42f66a3022db5b2d8456e8ff7e65867b70d5d4849a35c1bc1" Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.603016 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cptvd"] Mar 08 19:37:09 crc kubenswrapper[4885]: I0308 19:37:09.617283 4885 scope.go:117] "RemoveContainer" containerID="539ba9557c56dcf3cdbab11c5e667581fd8e8b4b7a9df312373694f7ca85489f" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.265248 4885 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.265864 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="extract-content" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.265886 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="extract-content" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.265949 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="registry-server" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.265965 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="registry-server" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.265984 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="extract-utilities" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.265996 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="extract-utilities" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.266018 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="extract-utilities" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.266030 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="extract-utilities" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.266042 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="registry-server" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.266054 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="registry-server" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.266076 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="extract-content" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.266088 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="extract-content" Mar 08 19:37:11 crc 
kubenswrapper[4885]: I0308 19:37:11.266248 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7346fb7f-6125-49c7-a422-cc169bc7e045" containerName="registry-server" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.266265 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" containerName="registry-server" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.266787 4885 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.267002 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.267241 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad" gracePeriod=15 Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.267332 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548" gracePeriod=15 Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.267331 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0" gracePeriod=15 Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.267396 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa" gracePeriod=15 Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.267450 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65" gracePeriod=15 Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.271359 4885 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.271795 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.271842 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.271879 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.271899 4885 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.271974 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.271994 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.272021 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272039 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.272068 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272085 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.272107 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272122 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.272143 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272162 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.272192 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272209 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.272235 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272253 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272509 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272533 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272553 4885 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272577 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272603 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272625 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272646 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272664 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 08 19:37:11 crc kubenswrapper[4885]: E0308 19:37:11.272911 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.272972 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.273229 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.385674 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b30ce2c5-2b53-47aa-8470-394dd0d6256a" path="/var/lib/kubelet/pods/b30ce2c5-2b53-47aa-8470-394dd0d6256a/volumes" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435574 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435659 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435691 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435739 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435774 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435797 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435817 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.435840 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538576 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538649 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538680 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538721 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538752 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538773 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538793 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538818 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.538898 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.539026 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.539059 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.539088 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.539118 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.539190 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.539202 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.539320 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.589138 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf6a8691-f048-4173-8c9e-06bca13e37e1" containerID="3c326d6a1a9e49cd242fcac137bf8c9ce49e4cb4b6826f444d12dd45c38d8ec9" exitCode=0 Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.589202 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cf6a8691-f048-4173-8c9e-06bca13e37e1","Type":"ContainerDied","Data":"3c326d6a1a9e49cd242fcac137bf8c9ce49e4cb4b6826f444d12dd45c38d8ec9"} Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.590421 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.591521 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.592761 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.593351 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548" exitCode=0 Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.593424 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65" exitCode=0 Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.593439 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0" exitCode=0 Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.593453 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa" exitCode=2 Mar 08 19:37:11 crc kubenswrapper[4885]: I0308 19:37:11.593504 4885 scope.go:117] "RemoveContainer" containerID="403fbda37c368628981e87c222aaa7d087667a599cc4854fe73e53ac4997f58e" Mar 08 19:37:12 crc kubenswrapper[4885]: I0308 19:37:12.350687 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:37:12 crc kubenswrapper[4885]: I0308 19:37:12.351996 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:37:12 crc kubenswrapper[4885]: W0308 19:37:12.352065 4885 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27289": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:12 crc kubenswrapper[4885]: E0308 19:37:12.352283 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27289\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:12 crc kubenswrapper[4885]: I0308 19:37:12.352203 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:37:12 crc kubenswrapper[4885]: I0308 19:37:12.352391 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:37:12 crc kubenswrapper[4885]: W0308 19:37:12.352830 4885 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:12 crc kubenswrapper[4885]: E0308 19:37:12.353012 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:12 crc kubenswrapper[4885]: W0308 19:37:12.353813 4885 
reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:12 crc kubenswrapper[4885]: E0308 19:37:12.353950 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:12 crc kubenswrapper[4885]: I0308 19:37:12.453663 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:37:12 crc kubenswrapper[4885]: W0308 19:37:12.454504 4885 reflector.go:561] object-"openshift-multus"/"metrics-daemon-secret": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27289": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:12 crc kubenswrapper[4885]: E0308 19:37:12.454643 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"metrics-daemon-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27289\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:12 crc kubenswrapper[4885]: I0308 19:37:12.610267 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.106043 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.107565 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.165184 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf6a8691-f048-4173-8c9e-06bca13e37e1-kube-api-access\") pod \"cf6a8691-f048-4173-8c9e-06bca13e37e1\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.165302 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-var-lock\") pod \"cf6a8691-f048-4173-8c9e-06bca13e37e1\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.165354 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-kubelet-dir\") pod \"cf6a8691-f048-4173-8c9e-06bca13e37e1\" (UID: \"cf6a8691-f048-4173-8c9e-06bca13e37e1\") " Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.165521 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-var-lock" (OuterVolumeSpecName: "var-lock") pod "cf6a8691-f048-4173-8c9e-06bca13e37e1" (UID: "cf6a8691-f048-4173-8c9e-06bca13e37e1"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.165630 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cf6a8691-f048-4173-8c9e-06bca13e37e1" (UID: "cf6a8691-f048-4173-8c9e-06bca13e37e1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.165998 4885 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-var-lock\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.166020 4885 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf6a8691-f048-4173-8c9e-06bca13e37e1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.175398 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf6a8691-f048-4173-8c9e-06bca13e37e1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cf6a8691-f048-4173-8c9e-06bca13e37e1" (UID: "cf6a8691-f048-4173-8c9e-06bca13e37e1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.267590 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf6a8691-f048-4173-8c9e-06bca13e37e1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.352147 4885 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.352328 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:39:15.352285225 +0000 UTC m=+456.748339288 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.352360 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.352839 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 08 19:37:13 crc kubenswrapper[4885]: W0308 19:37:13.353434 4885 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.353538 4885 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.353550 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.353628 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-08 19:39:15.353605841 +0000 UTC m=+456.749659904 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.405972 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.424961 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.441530 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.454075 4885 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: failed to sync secret cache: timed out waiting for the condition Mar 08 19:37:13 crc kubenswrapper[4885]: E0308 19:37:13.454171 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs podName:2f639c4e-64b8-45e9-bf33-c1d8c376b438 nodeName:}" failed. No retries permitted until 2026-03-08 19:39:15.454146686 +0000 UTC m=+456.850200719 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs") pod "network-metrics-daemon-jps4r" (UID: "2f639c4e-64b8-45e9-bf33-c1d8c376b438") : failed to sync secret cache: timed out waiting for the condition Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.620140 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.621779 4885 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad" exitCode=0 Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.621869 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25fd057e84a174b7a296049d5b9ff10b2152fa4531833fdfcd0463f9afce0854" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.623862 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cf6a8691-f048-4173-8c9e-06bca13e37e1","Type":"ContainerDied","Data":"d0cc1d7c926212abd6b6240d8d958b1f68d5c10c839d13411e786e80784f25ac"} Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.623896 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0cc1d7c926212abd6b6240d8d958b1f68d5c10c839d13411e786e80784f25ac" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.623990 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.685513 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.688766 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.690501 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.691445 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.692045 4885 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.880857 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.881096 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.881194 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.881519 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.881557 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.881554 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.882268 4885 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.882380 4885 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:13 crc kubenswrapper[4885]: I0308 19:37:13.882401 4885 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.352484 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.352530 4885 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.352620 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-08 19:39:16.35259092 +0000 UTC m=+457.748644973 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.353125 4885 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.353166 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.353326 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-08 19:39:16.353261408 +0000 UTC m=+457.749315461 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.387277 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-jps4r" podUID="2f639c4e-64b8-45e9-bf33-c1d8c376b438" Mar 08 19:37:14 crc kubenswrapper[4885]: W0308 19:37:14.387281 4885 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27289": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:14 crc kubenswrapper[4885]: E0308 19:37:14.387417 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27289\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:14 crc kubenswrapper[4885]: I0308 19:37:14.630492 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:14 crc kubenswrapper[4885]: I0308 19:37:14.658492 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:14 crc kubenswrapper[4885]: I0308 19:37:14.659002 4885 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:15 crc kubenswrapper[4885]: W0308 19:37:15.205810 4885 reflector.go:561] object-"openshift-multus"/"metrics-daemon-secret": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27289": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:15 crc kubenswrapper[4885]: E0308 19:37:15.206001 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"metrics-daemon-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27289\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:15 crc kubenswrapper[4885]: W0308 19:37:15.332013 4885 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:15 crc kubenswrapper[4885]: E0308 19:37:15.332161 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:15 crc kubenswrapper[4885]: W0308 19:37:15.371579 4885 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:15 crc kubenswrapper[4885]: E0308 19:37:15.371709 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:15 crc kubenswrapper[4885]: I0308 19:37:15.380570 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 08 19:37:15 crc kubenswrapper[4885]: W0308 19:37:15.463416 4885 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:15 crc kubenswrapper[4885]: E0308 19:37:15.463568 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.334480 4885 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.80:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:16 crc kubenswrapper[4885]: I0308 19:37:16.335434 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.381664 4885 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.80:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189af4ddb4a2bba1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:37:16.379224993 +0000 UTC m=+337.775279056,LastTimestamp:2026-03-08 19:37:16.379224993 +0000 UTC m=+337.775279056,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:37:16 crc kubenswrapper[4885]: I0308 19:37:16.648147 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"b1861bbe578eb14c832d685adadee713d3dcff480bb29f882dc4c33b60aa14a2"} Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.777852 4885 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.778843 4885 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.779797 4885 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.781081 4885 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.781854 4885 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: I0308 19:37:16.781948 4885 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.782394 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="200ms" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.831397 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:37:16Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:37:16Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:37:16Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-08T19:37:16Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.831853 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.832430 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.832688 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.832890 4885 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.832932 4885 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 08 19:37:16 crc kubenswrapper[4885]: E0308 19:37:16.983114 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="400ms" Mar 08 19:37:17 crc kubenswrapper[4885]: E0308 19:37:17.384782 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="800ms" Mar 08 19:37:17 crc kubenswrapper[4885]: I0308 19:37:17.658500 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"bc127ac2c1aecb1907d28d88aaeef9c25966c90abdc300af9cb39fb411002f31"} Mar 08 19:37:17 crc kubenswrapper[4885]: I0308 19:37:17.659212 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:17 crc kubenswrapper[4885]: E0308 19:37:17.659337 4885 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.80:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:18 crc kubenswrapper[4885]: E0308 19:37:18.186375 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="1.6s" Mar 08 19:37:18 crc kubenswrapper[4885]: W0308 19:37:18.558823 4885 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27289": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:18 crc kubenswrapper[4885]: E0308 19:37:18.559015 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27289\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:18 crc kubenswrapper[4885]: E0308 19:37:18.666846 4885 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.80:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:37:19 crc kubenswrapper[4885]: W0308 19:37:19.143141 4885 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:19 crc kubenswrapper[4885]: E0308 19:37:19.143586 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:19 crc kubenswrapper[4885]: I0308 19:37:19.372292 4885 status_manager.go:851] "Failed to get 
status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:19 crc kubenswrapper[4885]: E0308 19:37:19.787535 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="3.2s" Mar 08 19:37:20 crc kubenswrapper[4885]: W0308 19:37:20.881358 4885 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:20 crc kubenswrapper[4885]: E0308 19:37:20.881494 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:21 crc kubenswrapper[4885]: W0308 19:37:21.314048 4885 reflector.go:561] object-"openshift-multus"/"metrics-daemon-secret": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27289": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:21 crc kubenswrapper[4885]: E0308 19:37:21.314521 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"metrics-daemon-secret\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-multus/secrets?fieldSelector=metadata.name%3Dmetrics-daemon-secret&resourceVersion=27289\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:21 crc kubenswrapper[4885]: W0308 19:37:21.607718 4885 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27270": dial tcp 38.102.83.80:6443: connect: connection refused Mar 08 19:37:21 crc kubenswrapper[4885]: E0308 19:37:21.607831 4885 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27270\": dial tcp 38.102.83.80:6443: connect: connection refused" logger="UnhandledError" Mar 08 19:37:22 crc kubenswrapper[4885]: E0308 19:37:22.724711 4885 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.80:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189af4ddb4a2bba1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-08 19:37:16.379224993 +0000 UTC m=+337.775279056,LastTimestamp:2026-03-08 19:37:16.379224993 +0000 UTC m=+337.775279056,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 08 19:37:22 crc kubenswrapper[4885]: E0308 19:37:22.988627 4885 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.80:6443: connect: connection refused" interval="6.4s" Mar 08 19:37:24 crc kubenswrapper[4885]: I0308 19:37:24.367412 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:37:24 crc kubenswrapper[4885]: I0308 19:37:24.719825 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 08 19:37:24 crc kubenswrapper[4885]: I0308 19:37:24.719952 4885 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52" exitCode=1 Mar 08 19:37:24 crc kubenswrapper[4885]: I0308 19:37:24.720009 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52"} Mar 08 19:37:24 crc kubenswrapper[4885]: I0308 19:37:24.720802 4885 scope.go:117] "RemoveContainer" containerID="544f8b2c205da50b650ebf12a7105c21812627191c03d20d88f2f8a9d20aad52" Mar 08 19:37:24 crc kubenswrapper[4885]: I0308 19:37:24.722166 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:24 crc kubenswrapper[4885]: I0308 19:37:24.723901 4885 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.367595 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.367595 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.367767 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.369297 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.369909 4885 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.405337 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.405390 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:25 crc kubenswrapper[4885]: E0308 19:37:25.406048 4885 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.406761 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:25 crc kubenswrapper[4885]: W0308 19:37:25.432162 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-bb344e36d8c1ce8f197d35d375102988df813e29c12b45374f36e5542037f3b3 WatchSource:0}: Error finding container bb344e36d8c1ce8f197d35d375102988df813e29c12b45374f36e5542037f3b3: Status 404 returned error can't find the container with id bb344e36d8c1ce8f197d35d375102988df813e29c12b45374f36e5542037f3b3 Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.732508 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.732584 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6bc1d995ca761e047ab6d277b450a90b00ccbe1f56edb4d3d9e8120e33492bff"} Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.733660 4885 status_manager.go:851] "Failed to get status for pod" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.734474 4885 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.737177 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4ed7a863263333cde001b27bd1b307ff4d208443447ecd96c381d7415ed366e3"} Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.737220 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bb344e36d8c1ce8f197d35d375102988df813e29c12b45374f36e5542037f3b3"} Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.737437 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.737452 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:25 crc kubenswrapper[4885]: E0308 19:37:25.737797 4885 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.738090 4885 status_manager.go:851] "Failed to get status for pod" 
podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:25 crc kubenswrapper[4885]: I0308 19:37:25.738445 4885 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.80:6443: connect: connection refused" Mar 08 19:37:26 crc kubenswrapper[4885]: I0308 19:37:26.745603 4885 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="4ed7a863263333cde001b27bd1b307ff4d208443447ecd96c381d7415ed366e3" exitCode=0 Mar 08 19:37:26 crc kubenswrapper[4885]: I0308 19:37:26.745704 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"4ed7a863263333cde001b27bd1b307ff4d208443447ecd96c381d7415ed366e3"} Mar 08 19:37:26 crc kubenswrapper[4885]: I0308 19:37:26.745972 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"55d4f0f71d83df0058dd5dde5c1d8869f5759f2716fa35e41ec4fea667417d68"} Mar 08 19:37:26 crc kubenswrapper[4885]: I0308 19:37:26.745993 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f0731730456eb934faecc5c035c665ceda5f622a624ed0338e4ef304852a0178"} Mar 08 19:37:26 crc kubenswrapper[4885]: I0308 19:37:26.746005 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6d9577bfc5fa159c908c963144c9b39c1a869e64a4b45f5b11552dbac3be9ab8"} Mar 08 19:37:27 crc kubenswrapper[4885]: I0308 19:37:27.754318 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d6a9df53b2f9030019552a672fc0792ac6aca1b82d9b38caa44c6622c202930b"} Mar 08 19:37:27 crc kubenswrapper[4885]: I0308 19:37:27.754567 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:27 crc kubenswrapper[4885]: I0308 19:37:27.754597 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:27 crc kubenswrapper[4885]: I0308 19:37:27.754577 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"77cb8ad5c208d8a7d84e9642a997abac9e969ab2d1c37dc309ff6c8382e0ec83"} Mar 08 19:37:27 crc kubenswrapper[4885]: I0308 19:37:27.754686 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:28 crc kubenswrapper[4885]: I0308 19:37:28.367431 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:37:30 crc kubenswrapper[4885]: I0308 19:37:30.407968 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:30 crc kubenswrapper[4885]: I0308 19:37:30.408315 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:30 crc kubenswrapper[4885]: I0308 19:37:30.417063 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:30 crc kubenswrapper[4885]: I0308 19:37:30.471556 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:37:30 crc kubenswrapper[4885]: I0308 19:37:30.591893 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 08 19:37:30 crc kubenswrapper[4885]: I0308 19:37:30.606406 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 08 19:37:31 crc kubenswrapper[4885]: I0308 19:37:31.057963 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 08 19:37:32 crc kubenswrapper[4885]: I0308 19:37:32.768447 4885 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:32 crc kubenswrapper[4885]: I0308 19:37:32.862786 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f1e0cdcd-49a7-4ec6-a2dd-03087d3bd553" Mar 08 19:37:32 crc kubenswrapper[4885]: I0308 19:37:32.913874 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 08 19:37:33 crc kubenswrapper[4885]: I0308 19:37:33.386700 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 08 19:37:33 crc kubenswrapper[4885]: I0308 19:37:33.802782 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:33 crc kubenswrapper[4885]: I0308 19:37:33.802821 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:33 crc kubenswrapper[4885]: I0308 19:37:33.807274 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f1e0cdcd-49a7-4ec6-a2dd-03087d3bd553" Mar 08 19:37:33 crc kubenswrapper[4885]: I0308 19:37:33.809182 4885 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-crc" containerID="cri-o://6d9577bfc5fa159c908c963144c9b39c1a869e64a4b45f5b11552dbac3be9ab8" Mar 08 19:37:33 crc kubenswrapper[4885]: I0308 19:37:33.809212 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:34 crc kubenswrapper[4885]: I0308 19:37:34.667619 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:37:34 crc kubenswrapper[4885]: I0308 19:37:34.667955 4885 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 08 19:37:34 crc kubenswrapper[4885]: I0308 19:37:34.667997 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 08 19:37:34 crc kubenswrapper[4885]: I0308 19:37:34.807616 4885 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:34 crc kubenswrapper[4885]: I0308 19:37:34.807640 4885 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c3a1ba80-f794-4fb3-8b8e-012e8ec1c0bb" Mar 08 19:37:34 crc kubenswrapper[4885]: I0308 19:37:34.814694 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="f1e0cdcd-49a7-4ec6-a2dd-03087d3bd553" Mar 08 19:37:43 crc kubenswrapper[4885]: I0308 19:37:43.153881 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 08 19:37:43 crc kubenswrapper[4885]: I0308 19:37:43.260407 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 08 19:37:43 crc kubenswrapper[4885]: I0308 19:37:43.757249 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 08 19:37:43 crc kubenswrapper[4885]: I0308 19:37:43.789340 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.153544 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.161593 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.484659 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.641435 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.668117 4885 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.668220 4885 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.832810 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.862544 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.919015 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 08 19:37:44 crc kubenswrapper[4885]: I0308 19:37:44.991570 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.116627 4885 scope.go:117] "RemoveContainer" containerID="664d05642fad23eda998924c93b00797c7b005b6ff83c6e9a5513e7f3d0e2f65" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.140353 4885 scope.go:117] "RemoveContainer" containerID="9cf7f4f5dae3e770f80ca054b5b4624b099012cc0fa05d1c0c901502f94ef1d0" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.142289 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.159874 4885 scope.go:117] "RemoveContainer" containerID="76ac038a4c3eddc9ba19ee236239a8cbff21ee5425da622ffed6976e503e8a7d" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.170048 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.192407 4885 scope.go:117] "RemoveContainer" containerID="f45f3990b48e9bb7da6f360ef411eed91371ed58e3dfb8e1d2e0f8f49294ffaa" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.221242 4885 scope.go:117] "RemoveContainer" containerID="89857219f1f05def789b6850115c78a625ec2b523e724cf671a424f5e779aaad" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.292937 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.311800 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.528456 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.622278 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.685109 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.701573 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.725951 4885 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.815635 4885 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 08 19:37:45 crc kubenswrapper[4885]: I0308 19:37:45.968532 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.027391 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.110809 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.331420 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.405334 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.466876 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.479637 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.480472 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.512537 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.570591 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.670272 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.725230 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.791385 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.898214 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.898299 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.910865 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.947304 4885 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.961668 4885 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 08 19:37:46 crc kubenswrapper[4885]: I0308 19:37:46.962025 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.011870 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.020177 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.103381 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.178024 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.287477 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.306742 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.324822 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.372843 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.448164 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.476006 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.527873 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.578783 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.715630 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.750672 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.764280 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.770798 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.800034 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.810332 4885 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.821586 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.826061 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.844312 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 08 19:37:47 crc kubenswrapper[4885]: I0308 19:37:47.984134 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.029173 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.035855 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.167333 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.239842 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.251729 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.252858 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.529503 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.534854 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.536578 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.556464 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.706911 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.989663 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 08 19:37:48 crc kubenswrapper[4885]: I0308 19:37:48.997395 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.022998 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 08 
19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.135794 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.143051 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.252016 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.278034 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.327389 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.348306 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.353214 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.373091 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.406399 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.457141 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.548354 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.578068 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.593534 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.640202 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.652975 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.705407 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.780468 4885 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.789841 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.816036 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 08 19:37:49 crc 
kubenswrapper[4885]: I0308 19:37:49.909968 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.913464 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.976128 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.979371 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 08 19:37:49 crc kubenswrapper[4885]: I0308 19:37:49.982413 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.063161 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.152275 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.256248 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.265751 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.374364 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.418630 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.471590 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.476893 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.488061 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.504690 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.506479 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.636991 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.715520 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.814886 4885 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.818626 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.855967 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.883402 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 08 19:37:50 crc kubenswrapper[4885]: I0308 19:37:50.954517 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.060783 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.065607 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.163178 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.228465 4885 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.254064 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.260000 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.466127 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.478161 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.527527 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.540510 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.549338 4885 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.620491 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.645052 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.663551 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.707220 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.749081 4885 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.846750 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.850595 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.877161 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.888070 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.919164 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.940779 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 08 19:37:51 crc kubenswrapper[4885]: I0308 19:37:51.965626 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.020290 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.064984 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.090273 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.116535 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.124987 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.146214 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.166456 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.211475 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.280702 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.311281 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.311699 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 
19:37:52.449075 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.497794 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.615167 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.617570 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.636380 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.841706 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.904579 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.958822 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 08 19:37:52 crc kubenswrapper[4885]: I0308 19:37:52.970075 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.081519 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.092894 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.221637 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.232447 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.284078 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.380506 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.413439 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.428434 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.497820 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.553560 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 
08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.581238 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.630479 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.769597 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.825704 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.874199 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.894121 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 08 19:37:53 crc kubenswrapper[4885]: I0308 19:37:53.906951 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.087470 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.159411 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.197052 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.372008 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.406221 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.572819 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.627378 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.674134 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.683204 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.711684 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.730387 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.732637 4885 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.847648 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.889524 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.890826 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.950112 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.960758 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 08 19:37:54 crc kubenswrapper[4885]: I0308 19:37:54.989766 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.037870 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.045843 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.368858 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.458065 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.468005 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.579253 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.590395 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.594964 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.598476 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.745209 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.776462 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.794332 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.817913 4885 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.840983 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 08 19:37:55 crc kubenswrapper[4885]: I0308 19:37:55.895352 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.018852 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.058952 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.083408 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.171998 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.251696 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.398851 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.427319 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.561624 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.723404 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.757345 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.807369 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.834966 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.902132 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.902159 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 08 19:37:56 crc kubenswrapper[4885]: I0308 19:37:56.980769 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.097559 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 
19:37:57.250793 4885 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.258295 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.258371 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.272741 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.321185 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.326961 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.326942073 podStartE2EDuration="25.326942073s" podCreationTimestamp="2026-03-08 19:37:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:37:57.294277364 +0000 UTC m=+378.690331427" watchObservedRunningTime="2026-03-08 19:37:57.326942073 +0000 UTC m=+378.722996126" Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.843113 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.886678 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.904775 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 08 19:37:57 crc kubenswrapper[4885]: I0308 19:37:57.905881 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.029127 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.420583 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.602208 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.643718 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.645765 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.714682 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.755769 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 08 19:37:58 crc kubenswrapper[4885]: I0308 19:37:58.912891 4885 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 08 19:37:59 crc kubenswrapper[4885]: I0308 19:37:59.057029 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 08 19:37:59 crc kubenswrapper[4885]: I0308 19:37:59.103717 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 08 19:37:59 crc kubenswrapper[4885]: I0308 19:37:59.394044 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 08 19:37:59 crc kubenswrapper[4885]: I0308 19:37:59.530841 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 08 19:37:59 crc kubenswrapper[4885]: I0308 19:37:59.975656 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 08 19:38:01 crc kubenswrapper[4885]: I0308 19:38:01.098609 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.447879 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549978-9wjmp"] Mar 08 19:38:03 crc kubenswrapper[4885]: E0308 19:38:03.448293 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" containerName="installer" Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.448303 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" containerName="installer" Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.448388 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf6a8691-f048-4173-8c9e-06bca13e37e1" containerName="installer" Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.448716 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549978-9wjmp" Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.450120 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.450203 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.450256 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.455651 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549978-9wjmp"] Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.544391 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljx6n\" (UniqueName: \"kubernetes.io/projected/71a3a17a-ffa4-4d31-94fb-7e720297e94b-kube-api-access-ljx6n\") pod \"auto-csr-approver-29549978-9wjmp\" (UID: \"71a3a17a-ffa4-4d31-94fb-7e720297e94b\") " pod="openshift-infra/auto-csr-approver-29549978-9wjmp" Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.645501 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljx6n\" (UniqueName: \"kubernetes.io/projected/71a3a17a-ffa4-4d31-94fb-7e720297e94b-kube-api-access-ljx6n\") pod \"auto-csr-approver-29549978-9wjmp\" (UID: \"71a3a17a-ffa4-4d31-94fb-7e720297e94b\") " pod="openshift-infra/auto-csr-approver-29549978-9wjmp" Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.693680 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljx6n\" (UniqueName: \"kubernetes.io/projected/71a3a17a-ffa4-4d31-94fb-7e720297e94b-kube-api-access-ljx6n\") pod \"auto-csr-approver-29549978-9wjmp\" (UID: \"71a3a17a-ffa4-4d31-94fb-7e720297e94b\") " pod="openshift-infra/auto-csr-approver-29549978-9wjmp" Mar 08 19:38:03 crc kubenswrapper[4885]: I0308 19:38:03.761209 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549978-9wjmp" Mar 08 19:38:04 crc kubenswrapper[4885]: I0308 19:38:04.193906 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549978-9wjmp"] Mar 08 19:38:04 crc kubenswrapper[4885]: W0308 19:38:04.199356 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71a3a17a_ffa4_4d31_94fb_7e720297e94b.slice/crio-39e37b084713e3e60681ff2d807bedc8e9f56d2517ec3f5b9a8661bf849e5716 WatchSource:0}: Error finding container 39e37b084713e3e60681ff2d807bedc8e9f56d2517ec3f5b9a8661bf849e5716: Status 404 returned error can't find the container with id 39e37b084713e3e60681ff2d807bedc8e9f56d2517ec3f5b9a8661bf849e5716 Mar 08 19:38:05 crc kubenswrapper[4885]: I0308 19:38:05.009887 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549978-9wjmp" event={"ID":"71a3a17a-ffa4-4d31-94fb-7e720297e94b","Type":"ContainerStarted","Data":"39e37b084713e3e60681ff2d807bedc8e9f56d2517ec3f5b9a8661bf849e5716"} Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.017621 4885 generic.go:334] "Generic (PLEG): container finished" podID="71a3a17a-ffa4-4d31-94fb-7e720297e94b" containerID="2aa80e241984cb33e73f4238b00c6079576ad160fd6e5654000fab91ecb22f03" exitCode=0 Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.017684 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549978-9wjmp" event={"ID":"71a3a17a-ffa4-4d31-94fb-7e720297e94b","Type":"ContainerDied","Data":"2aa80e241984cb33e73f4238b00c6079576ad160fd6e5654000fab91ecb22f03"} Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.234829 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xpctw"] Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.235267 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xpctw" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="registry-server" containerID="cri-o://b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a" gracePeriod=30 Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.252533 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gnjnd"] Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.252886 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gnjnd" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerName="registry-server" containerID="cri-o://246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a" gracePeriod=30 Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.265889 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ldvgz"] Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.266312 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" podUID="83de4c2d-767a-4635-8748-486dd45683a1" containerName="marketplace-operator" containerID="cri-o://598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01" gracePeriod=30 Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.271710 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-62xgk"] Mar 08 19:38:06 crc 
kubenswrapper[4885]: I0308 19:38:06.272080 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-62xgk" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="registry-server" containerID="cri-o://2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" gracePeriod=30 Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.287521 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pqxt7"] Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.287826 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pqxt7" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="registry-server" containerID="cri-o://ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c" gracePeriod=30 Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.291294 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-prdq9"] Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.291553 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-prdq9" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="registry-server" containerID="cri-o://cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80" gracePeriod=30 Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.300094 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2774l"] Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.300836 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.314676 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2774l"] Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.378022 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb9h9\" (UniqueName: \"kubernetes.io/projected/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-kube-api-access-rb9h9\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.378117 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.378189 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.479802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb9h9\" (UniqueName: 
\"kubernetes.io/projected/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-kube-api-access-rb9h9\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.479871 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.479894 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.481160 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.485131 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.499083 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb9h9\" (UniqueName: \"kubernetes.io/projected/1e87323f-cf50-46ef-8e7c-cccd8a1e3601-kube-api-access-rb9h9\") pod \"marketplace-operator-79b997595-2774l\" (UID: \"1e87323f-cf50-46ef-8e7c-cccd8a1e3601\") " pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.773418 4885 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.773718 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://bc127ac2c1aecb1907d28d88aaeef9c25966c90abdc300af9cb39fb411002f31" gracePeriod=5 Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.790172 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.794088 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:38:06 crc kubenswrapper[4885]: E0308 19:38:06.829425 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876 is running failed: container process not found" containerID="2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" cmd=["grpc_health_probe","-addr=:50051"] Mar 08 19:38:06 crc kubenswrapper[4885]: E0308 19:38:06.829865 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876 is running failed: container process not found" containerID="2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" cmd=["grpc_health_probe","-addr=:50051"] Mar 08 19:38:06 crc kubenswrapper[4885]: E0308 19:38:06.830600 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876 is running failed: container process not found" containerID="2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" cmd=["grpc_health_probe","-addr=:50051"] Mar 08 19:38:06 crc kubenswrapper[4885]: E0308 19:38:06.830641 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-62xgk" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="registry-server" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.876499 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.881235 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.885639 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-operator-metrics\") pod \"83de4c2d-767a-4635-8748-486dd45683a1\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.885712 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqf5f\" (UniqueName: \"kubernetes.io/projected/83de4c2d-767a-4635-8748-486dd45683a1-kube-api-access-pqf5f\") pod \"83de4c2d-767a-4635-8748-486dd45683a1\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.885794 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-trusted-ca\") pod \"83de4c2d-767a-4635-8748-486dd45683a1\" (UID: \"83de4c2d-767a-4635-8748-486dd45683a1\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.886682 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "83de4c2d-767a-4635-8748-486dd45683a1" (UID: "83de4c2d-767a-4635-8748-486dd45683a1"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.891356 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.910104 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83de4c2d-767a-4635-8748-486dd45683a1-kube-api-access-pqf5f" (OuterVolumeSpecName: "kube-api-access-pqf5f") pod "83de4c2d-767a-4635-8748-486dd45683a1" (UID: "83de4c2d-767a-4635-8748-486dd45683a1"). InnerVolumeSpecName "kube-api-access-pqf5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.910693 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "83de4c2d-767a-4635-8748-486dd45683a1" (UID: "83de4c2d-767a-4635-8748-486dd45683a1"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.919759 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.922645 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989121 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-utilities\") pod \"2a6b85b3-0bb1-4199-983f-615a6c932f09\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989174 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-catalog-content\") pod \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989201 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs66k\" (UniqueName: \"kubernetes.io/projected/7d8fbc68-3714-4fe4-9f62-857c5dc05661-kube-api-access-gs66k\") pod \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989219 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-st8f9\" (UniqueName: \"kubernetes.io/projected/05666e0b-c4ce-451a-ba67-ddb78866ef54-kube-api-access-st8f9\") pod \"05666e0b-c4ce-451a-ba67-ddb78866ef54\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989245 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-utilities\") pod \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\" (UID: \"7d8fbc68-3714-4fe4-9f62-857c5dc05661\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989282 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-catalog-content\") pod \"56c146b0-3448-4140-8cf0-8d637f7f22a9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989301 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-catalog-content\") pod \"8881ba5e-d9d1-42a9-98af-849e72053757\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989370 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-catalog-content\") pod \"05666e0b-c4ce-451a-ba67-ddb78866ef54\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989386 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-utilities\") pod \"8881ba5e-d9d1-42a9-98af-849e72053757\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989884 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-catalog-content\") pod 
\"2a6b85b3-0bb1-4199-983f-615a6c932f09\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989906 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5wpq\" (UniqueName: \"kubernetes.io/projected/56c146b0-3448-4140-8cf0-8d637f7f22a9-kube-api-access-j5wpq\") pod \"56c146b0-3448-4140-8cf0-8d637f7f22a9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989940 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x76vh\" (UniqueName: \"kubernetes.io/projected/8881ba5e-d9d1-42a9-98af-849e72053757-kube-api-access-x76vh\") pod \"8881ba5e-d9d1-42a9-98af-849e72053757\" (UID: \"8881ba5e-d9d1-42a9-98af-849e72053757\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989958 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-utilities\") pod \"56c146b0-3448-4140-8cf0-8d637f7f22a9\" (UID: \"56c146b0-3448-4140-8cf0-8d637f7f22a9\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.989976 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cp8v\" (UniqueName: \"kubernetes.io/projected/2a6b85b3-0bb1-4199-983f-615a6c932f09-kube-api-access-8cp8v\") pod \"2a6b85b3-0bb1-4199-983f-615a6c932f09\" (UID: \"2a6b85b3-0bb1-4199-983f-615a6c932f09\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.990000 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-utilities\") pod \"05666e0b-c4ce-451a-ba67-ddb78866ef54\" (UID: \"05666e0b-c4ce-451a-ba67-ddb78866ef54\") " Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.990235 4885 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.990247 4885 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/83de4c2d-767a-4635-8748-486dd45683a1-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.990233 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-utilities" (OuterVolumeSpecName: "utilities") pod "7d8fbc68-3714-4fe4-9f62-857c5dc05661" (UID: "7d8fbc68-3714-4fe4-9f62-857c5dc05661"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.990257 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqf5f\" (UniqueName: \"kubernetes.io/projected/83de4c2d-767a-4635-8748-486dd45683a1-kube-api-access-pqf5f\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.990905 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-utilities" (OuterVolumeSpecName: "utilities") pod "05666e0b-c4ce-451a-ba67-ddb78866ef54" (UID: "05666e0b-c4ce-451a-ba67-ddb78866ef54"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.991697 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-utilities" (OuterVolumeSpecName: "utilities") pod "56c146b0-3448-4140-8cf0-8d637f7f22a9" (UID: "56c146b0-3448-4140-8cf0-8d637f7f22a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.992338 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d8fbc68-3714-4fe4-9f62-857c5dc05661-kube-api-access-gs66k" (OuterVolumeSpecName: "kube-api-access-gs66k") pod "7d8fbc68-3714-4fe4-9f62-857c5dc05661" (UID: "7d8fbc68-3714-4fe4-9f62-857c5dc05661"). InnerVolumeSpecName "kube-api-access-gs66k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.993488 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-utilities" (OuterVolumeSpecName: "utilities") pod "8881ba5e-d9d1-42a9-98af-849e72053757" (UID: "8881ba5e-d9d1-42a9-98af-849e72053757"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:06 crc kubenswrapper[4885]: I0308 19:38:06.995120 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a6b85b3-0bb1-4199-983f-615a6c932f09-kube-api-access-8cp8v" (OuterVolumeSpecName: "kube-api-access-8cp8v") pod "2a6b85b3-0bb1-4199-983f-615a6c932f09" (UID: "2a6b85b3-0bb1-4199-983f-615a6c932f09"). InnerVolumeSpecName "kube-api-access-8cp8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.005360 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05666e0b-c4ce-451a-ba67-ddb78866ef54-kube-api-access-st8f9" (OuterVolumeSpecName: "kube-api-access-st8f9") pod "05666e0b-c4ce-451a-ba67-ddb78866ef54" (UID: "05666e0b-c4ce-451a-ba67-ddb78866ef54"). InnerVolumeSpecName "kube-api-access-st8f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.005441 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c146b0-3448-4140-8cf0-8d637f7f22a9-kube-api-access-j5wpq" (OuterVolumeSpecName: "kube-api-access-j5wpq") pod "56c146b0-3448-4140-8cf0-8d637f7f22a9" (UID: "56c146b0-3448-4140-8cf0-8d637f7f22a9"). InnerVolumeSpecName "kube-api-access-j5wpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.012334 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-utilities" (OuterVolumeSpecName: "utilities") pod "2a6b85b3-0bb1-4199-983f-615a6c932f09" (UID: "2a6b85b3-0bb1-4199-983f-615a6c932f09"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.017025 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8881ba5e-d9d1-42a9-98af-849e72053757-kube-api-access-x76vh" (OuterVolumeSpecName: "kube-api-access-x76vh") pod "8881ba5e-d9d1-42a9-98af-849e72053757" (UID: "8881ba5e-d9d1-42a9-98af-849e72053757"). 
InnerVolumeSpecName "kube-api-access-x76vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.026883 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerID="246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a" exitCode=0 Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.026998 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gnjnd" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.027118 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnjnd" event={"ID":"2a6b85b3-0bb1-4199-983f-615a6c932f09","Type":"ContainerDied","Data":"246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.027160 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gnjnd" event={"ID":"2a6b85b3-0bb1-4199-983f-615a6c932f09","Type":"ContainerDied","Data":"de0e60604d3aa86bafd041642af24e9211dfd9322182b13ae9a6b56c608e4e2c"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.027177 4885 scope.go:117] "RemoveContainer" containerID="246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.032092 4885 generic.go:334] "Generic (PLEG): container finished" podID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerID="b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a" exitCode=0 Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.032163 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xpctw" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.032177 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpctw" event={"ID":"7d8fbc68-3714-4fe4-9f62-857c5dc05661","Type":"ContainerDied","Data":"b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.032225 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xpctw" event={"ID":"7d8fbc68-3714-4fe4-9f62-857c5dc05661","Type":"ContainerDied","Data":"9bf4496f530b593c8f965a319860f693e2e64a4e57a6c4d734640ea6410547bb"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.036259 4885 generic.go:334] "Generic (PLEG): container finished" podID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerID="cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80" exitCode=0 Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.036330 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-prdq9" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.036941 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdq9" event={"ID":"56c146b0-3448-4140-8cf0-8d637f7f22a9","Type":"ContainerDied","Data":"cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.036974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prdq9" event={"ID":"56c146b0-3448-4140-8cf0-8d637f7f22a9","Type":"ContainerDied","Data":"2d0ba41af79384822819519e78d3dc7f4370f70eb9f77bb319b3453eb2e2a641"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.041551 4885 generic.go:334] "Generic (PLEG): container finished" podID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerID="2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" exitCode=0 Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.041947 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62xgk" event={"ID":"05666e0b-c4ce-451a-ba67-ddb78866ef54","Type":"ContainerDied","Data":"2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.042036 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62xgk" event={"ID":"05666e0b-c4ce-451a-ba67-ddb78866ef54","Type":"ContainerDied","Data":"731327ad0ac3fd48c5dcf825c4aabc506f0114149e811eabdfb465d917e7e122"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.042895 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62xgk" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.045724 4885 generic.go:334] "Generic (PLEG): container finished" podID="83de4c2d-767a-4635-8748-486dd45683a1" containerID="598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01" exitCode=0 Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.045869 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" event={"ID":"83de4c2d-767a-4635-8748-486dd45683a1","Type":"ContainerDied","Data":"598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.046229 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" event={"ID":"83de4c2d-767a-4635-8748-486dd45683a1","Type":"ContainerDied","Data":"2ea70402b3dbdea12ac7aa07af023bd9134877c1cdbc64f413fbc103681b29c0"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.047208 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ldvgz" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.049082 4885 generic.go:334] "Generic (PLEG): container finished" podID="8881ba5e-d9d1-42a9-98af-849e72053757" containerID="ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c" exitCode=0 Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.049242 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pqxt7" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.049483 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqxt7" event={"ID":"8881ba5e-d9d1-42a9-98af-849e72053757","Type":"ContainerDied","Data":"ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.049510 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pqxt7" event={"ID":"8881ba5e-d9d1-42a9-98af-849e72053757","Type":"ContainerDied","Data":"789329547b46208ee5dc38fb335a56f42b713329e6d11c7a30e5d4042c3f9ea3"} Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.054285 4885 scope.go:117] "RemoveContainer" containerID="fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.055685 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a6b85b3-0bb1-4199-983f-615a6c932f09" (UID: "2a6b85b3-0bb1-4199-983f-615a6c932f09"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.067217 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05666e0b-c4ce-451a-ba67-ddb78866ef54" (UID: "05666e0b-c4ce-451a-ba67-ddb78866ef54"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092604 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092633 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092643 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092652 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5wpq\" (UniqueName: \"kubernetes.io/projected/56c146b0-3448-4140-8cf0-8d637f7f22a9-kube-api-access-j5wpq\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092663 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x76vh\" (UniqueName: \"kubernetes.io/projected/8881ba5e-d9d1-42a9-98af-849e72053757-kube-api-access-x76vh\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092673 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092682 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cp8v\" 
(UniqueName: \"kubernetes.io/projected/2a6b85b3-0bb1-4199-983f-615a6c932f09-kube-api-access-8cp8v\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092690 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05666e0b-c4ce-451a-ba67-ddb78866ef54-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092698 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6b85b3-0bb1-4199-983f-615a6c932f09-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092706 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs66k\" (UniqueName: \"kubernetes.io/projected/7d8fbc68-3714-4fe4-9f62-857c5dc05661-kube-api-access-gs66k\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092715 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-st8f9\" (UniqueName: \"kubernetes.io/projected/05666e0b-c4ce-451a-ba67-ddb78866ef54-kube-api-access-st8f9\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.092724 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.098305 4885 scope.go:117] "RemoveContainer" containerID="c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.115630 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ldvgz"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.123828 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ldvgz"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.128284 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7d8fbc68-3714-4fe4-9f62-857c5dc05661" (UID: "7d8fbc68-3714-4fe4-9f62-857c5dc05661"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.154186 4885 scope.go:117] "RemoveContainer" containerID="246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.157269 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a\": container with ID starting with 246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a not found: ID does not exist" containerID="246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.157309 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a"} err="failed to get container status \"246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a\": rpc error: code = NotFound desc = could not find container \"246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a\": container with ID starting with 246dbcbfb81d5273eaaee0471f2f4a28720555c49b9862db2ee160b1c09c239a not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.157334 4885 scope.go:117] "RemoveContainer" containerID="fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.158722 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b\": container with ID starting with fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b not found: ID does not exist" containerID="fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.158768 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b"} err="failed to get container status \"fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b\": rpc error: code = NotFound desc = could not find container \"fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b\": container with ID starting with fac4bfd0ae75bed11c3280968c4f779189d5f8e25019676de4f0f830263b025b not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.158795 4885 scope.go:117] "RemoveContainer" containerID="c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.159404 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f\": container with ID starting with c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f not found: ID does not exist" containerID="c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.159423 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f"} err="failed to get container status \"c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f\": rpc error: code = NotFound desc = could not 
find container \"c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f\": container with ID starting with c66e8b4468c3447641e1c6fe112372174705ed885c7b65f4960f1f3261bb933f not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.159435 4885 scope.go:117] "RemoveContainer" containerID="b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.173689 4885 scope.go:117] "RemoveContainer" containerID="06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.194090 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7d8fbc68-3714-4fe4-9f62-857c5dc05661-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.209990 4885 scope.go:117] "RemoveContainer" containerID="87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.215224 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8881ba5e-d9d1-42a9-98af-849e72053757" (UID: "8881ba5e-d9d1-42a9-98af-849e72053757"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.239447 4885 scope.go:117] "RemoveContainer" containerID="b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.240047 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a\": container with ID starting with b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a not found: ID does not exist" containerID="b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.240089 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a"} err="failed to get container status \"b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a\": rpc error: code = NotFound desc = could not find container \"b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a\": container with ID starting with b19096f3e4797bc5cb4b5245b584752714247bd207efcc70a1948e5611d9df3a not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.240142 4885 scope.go:117] "RemoveContainer" containerID="06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.240586 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313\": container with ID starting with 06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313 not found: ID does not exist" containerID="06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.240622 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313"} err="failed to get container status \"06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313\": rpc error: code = NotFound desc = could not find container \"06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313\": container with ID starting with 06973e79d6207856ae8075f7baa9c9b6c6c35344c2846498471d98c65968f313 not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.240648 4885 scope.go:117] "RemoveContainer" containerID="87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.241202 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54\": container with ID starting with 87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54 not found: ID does not exist" containerID="87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.241268 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54"} err="failed to get container status \"87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54\": rpc error: code = NotFound desc = could not find container \"87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54\": container with ID starting with 87739e22f4f8451eba802c00a2b12d7a2e92a3c902c4bec2a788f584fcf20f54 not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.241297 4885 scope.go:117] "RemoveContainer" containerID="cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.259123 4885 scope.go:117] "RemoveContainer" containerID="aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.274066 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56c146b0-3448-4140-8cf0-8d637f7f22a9" (UID: "56c146b0-3448-4140-8cf0-8d637f7f22a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.278964 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549978-9wjmp" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.280736 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2774l"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.281270 4885 scope.go:117] "RemoveContainer" containerID="6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.295076 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56c146b0-3448-4140-8cf0-8d637f7f22a9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.295100 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8881ba5e-d9d1-42a9-98af-849e72053757-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.305275 4885 scope.go:117] "RemoveContainer" containerID="cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.305853 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80\": container with ID starting with cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80 not found: ID does not exist" containerID="cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.305910 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80"} err="failed to get container status \"cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80\": rpc error: code = NotFound desc = could not find container \"cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80\": container with ID starting with cb58e97fdecbaa51a02f1c5c32f3848bad5a2e255177bc9e5817f9e6575aae80 not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.305955 4885 scope.go:117] "RemoveContainer" containerID="aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.306428 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908\": container with ID starting with aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908 not found: ID does not exist" containerID="aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.306492 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908"} err="failed to get container status \"aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908\": rpc error: code = NotFound desc = could not find container \"aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908\": container with ID starting with aff573d6cceaae4b4f85547646f2a529bc54a9a762417785ae5c31f81f7fb908 not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.306507 4885 scope.go:117] 
"RemoveContainer" containerID="6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.307028 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076\": container with ID starting with 6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076 not found: ID does not exist" containerID="6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.307050 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076"} err="failed to get container status \"6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076\": rpc error: code = NotFound desc = could not find container \"6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076\": container with ID starting with 6a04c0e7083bac46d2718e4db5372fe2fef47c8ae140d5ba2e6d5459d4aac076 not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.307063 4885 scope.go:117] "RemoveContainer" containerID="2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.319154 4885 scope.go:117] "RemoveContainer" containerID="67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.338644 4885 scope.go:117] "RemoveContainer" containerID="b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.360452 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gnjnd"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.362956 4885 scope.go:117] "RemoveContainer" containerID="2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.363005 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gnjnd"] Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.363406 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876\": container with ID starting with 2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876 not found: ID does not exist" containerID="2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.363441 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876"} err="failed to get container status \"2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876\": rpc error: code = NotFound desc = could not find container \"2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876\": container with ID starting with 2b6262655ed82d7ac2685dd611a74c69c06ed7e6874d50cbb100e1caa3b7d876 not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.363467 4885 scope.go:117] "RemoveContainer" containerID="67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.363966 4885 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de\": container with ID starting with 67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de not found: ID does not exist" containerID="67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.364012 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de"} err="failed to get container status \"67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de\": rpc error: code = NotFound desc = could not find container \"67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de\": container with ID starting with 67e47c4e63933b0b4624ba52ca2a1c36cbb0605803e17164889577bbac4c97de not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.364039 4885 scope.go:117] "RemoveContainer" containerID="b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.366784 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd\": container with ID starting with b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd not found: ID does not exist" containerID="b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.366821 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd"} err="failed to get container status \"b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd\": rpc error: code = NotFound desc = could not find container \"b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd\": container with ID starting with b1160924ab0cf12899bc1adfc06b76af477269094b44acff2d6e1b924ad0d9fd not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.366849 4885 scope.go:117] "RemoveContainer" containerID="598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.381646 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" path="/var/lib/kubelet/pods/2a6b85b3-0bb1-4199-983f-615a6c932f09/volumes" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.383630 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83de4c2d-767a-4635-8748-486dd45683a1" path="/var/lib/kubelet/pods/83de4c2d-767a-4635-8748-486dd45683a1/volumes" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.396627 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljx6n\" (UniqueName: \"kubernetes.io/projected/71a3a17a-ffa4-4d31-94fb-7e720297e94b-kube-api-access-ljx6n\") pod \"71a3a17a-ffa4-4d31-94fb-7e720297e94b\" (UID: \"71a3a17a-ffa4-4d31-94fb-7e720297e94b\") " Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.399590 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71a3a17a-ffa4-4d31-94fb-7e720297e94b-kube-api-access-ljx6n" (OuterVolumeSpecName: "kube-api-access-ljx6n") pod 
"71a3a17a-ffa4-4d31-94fb-7e720297e94b" (UID: "71a3a17a-ffa4-4d31-94fb-7e720297e94b"). InnerVolumeSpecName "kube-api-access-ljx6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.439474 4885 scope.go:117] "RemoveContainer" containerID="598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.440652 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01\": container with ID starting with 598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01 not found: ID does not exist" containerID="598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.440699 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01"} err="failed to get container status \"598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01\": rpc error: code = NotFound desc = could not find container \"598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01\": container with ID starting with 598022ff7da9c58662969723bd3fc8cd6d06f0365a5beb79b26d0ef3438c3b01 not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.440736 4885 scope.go:117] "RemoveContainer" containerID="ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.452758 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-prdq9"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.459501 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-prdq9"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.477225 4885 scope.go:117] "RemoveContainer" containerID="2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.488684 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-62xgk"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.496125 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-62xgk"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.499206 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljx6n\" (UniqueName: \"kubernetes.io/projected/71a3a17a-ffa4-4d31-94fb-7e720297e94b-kube-api-access-ljx6n\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.503384 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pqxt7"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.512278 4885 scope.go:117] "RemoveContainer" containerID="3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.518391 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pqxt7"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.519986 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xpctw"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.522548 4885 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/certified-operators-xpctw"] Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.531985 4885 scope.go:117] "RemoveContainer" containerID="ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.532376 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c\": container with ID starting with ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c not found: ID does not exist" containerID="ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.532406 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c"} err="failed to get container status \"ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c\": rpc error: code = NotFound desc = could not find container \"ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c\": container with ID starting with ee9c5804e3ebbdaee96edb5fde057cb5b994bbf44a9811213ffcceffdc2f563c not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.532427 4885 scope.go:117] "RemoveContainer" containerID="2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.532691 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7\": container with ID starting with 2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7 not found: ID does not exist" containerID="2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.532711 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7"} err="failed to get container status \"2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7\": rpc error: code = NotFound desc = could not find container \"2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7\": container with ID starting with 2d76ef0fc8aad7d976bb086a294c36492f0ae8bf64e373d9aa97fbdacb52b5f7 not found: ID does not exist" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.532724 4885 scope.go:117] "RemoveContainer" containerID="3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2" Mar 08 19:38:07 crc kubenswrapper[4885]: E0308 19:38:07.533043 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2\": container with ID starting with 3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2 not found: ID does not exist" containerID="3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2" Mar 08 19:38:07 crc kubenswrapper[4885]: I0308 19:38:07.533071 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2"} err="failed to get container status \"3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2\": rpc error: code = NotFound 
desc = could not find container \"3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2\": container with ID starting with 3e732b10e561e31eff3ed0a4f6bd80eca75ef2669f3a3fe066abfe8aa1903bf2 not found: ID does not exist" Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.054737 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549978-9wjmp" event={"ID":"71a3a17a-ffa4-4d31-94fb-7e720297e94b","Type":"ContainerDied","Data":"39e37b084713e3e60681ff2d807bedc8e9f56d2517ec3f5b9a8661bf849e5716"} Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.055002 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39e37b084713e3e60681ff2d807bedc8e9f56d2517ec3f5b9a8661bf849e5716" Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.054895 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549978-9wjmp" Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.061088 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2774l" event={"ID":"1e87323f-cf50-46ef-8e7c-cccd8a1e3601","Type":"ContainerStarted","Data":"0ccaa614eed33e87604a5ab4986e447e3b6f5d78d3eb03e7441a690f11f6b9be"} Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.061216 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.061452 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2774l" event={"ID":"1e87323f-cf50-46ef-8e7c-cccd8a1e3601","Type":"ContainerStarted","Data":"95eb3b5189eee966a76657d1747943f212d25aa3f031105fcd582f411077fab9"} Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.064965 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2774l" Mar 08 19:38:08 crc kubenswrapper[4885]: I0308 19:38:08.092273 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2774l" podStartSLOduration=2.092255352 podStartE2EDuration="2.092255352s" podCreationTimestamp="2026-03-08 19:38:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:38:08.089541049 +0000 UTC m=+389.485595072" watchObservedRunningTime="2026-03-08 19:38:08.092255352 +0000 UTC m=+389.488309375" Mar 08 19:38:09 crc kubenswrapper[4885]: I0308 19:38:09.383479 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" path="/var/lib/kubelet/pods/05666e0b-c4ce-451a-ba67-ddb78866ef54/volumes" Mar 08 19:38:09 crc kubenswrapper[4885]: I0308 19:38:09.384714 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" path="/var/lib/kubelet/pods/56c146b0-3448-4140-8cf0-8d637f7f22a9/volumes" Mar 08 19:38:09 crc kubenswrapper[4885]: I0308 19:38:09.386544 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" path="/var/lib/kubelet/pods/7d8fbc68-3714-4fe4-9f62-857c5dc05661/volumes" Mar 08 19:38:09 crc kubenswrapper[4885]: I0308 19:38:09.389178 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" 
path="/var/lib/kubelet/pods/8881ba5e-d9d1-42a9-98af-849e72053757/volumes" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.083601 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.083847 4885 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="bc127ac2c1aecb1907d28d88aaeef9c25966c90abdc300af9cb39fb411002f31" exitCode=137 Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.369720 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.369802 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.462840 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.462963 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.463019 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.463063 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.463083 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.463094 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.463159 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). 
InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.463308 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.463259 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.464209 4885 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.464247 4885 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.464265 4885 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.464285 4885 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.474841 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:38:12 crc kubenswrapper[4885]: I0308 19:38:12.565848 4885 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 08 19:38:13 crc kubenswrapper[4885]: I0308 19:38:13.092029 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 08 19:38:13 crc kubenswrapper[4885]: I0308 19:38:13.092432 4885 scope.go:117] "RemoveContainer" containerID="bc127ac2c1aecb1907d28d88aaeef9c25966c90abdc300af9cb39fb411002f31" Mar 08 19:38:13 crc kubenswrapper[4885]: I0308 19:38:13.092522 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 08 19:38:13 crc kubenswrapper[4885]: I0308 19:38:13.377595 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 08 19:39:02 crc kubenswrapper[4885]: I0308 19:39:02.818266 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:39:02 crc kubenswrapper[4885]: I0308 19:39:02.818998 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.365677 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.366578 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.368008 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.375434 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.468307 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.468538 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.474132 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f639c4e-64b8-45e9-bf33-c1d8c376b438-metrics-certs\") pod \"network-metrics-daemon-jps4r\" (UID: \"2f639c4e-64b8-45e9-bf33-c1d8c376b438\") " pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.772073 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 08 19:39:15 crc kubenswrapper[4885]: I0308 19:39:15.780606 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jps4r" Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.024132 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jps4r"] Mar 08 19:39:16 crc kubenswrapper[4885]: W0308 19:39:16.047388 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f639c4e_64b8_45e9_bf33_c1d8c376b438.slice/crio-9d9e55809354362080954d8039bfd7b4394e545b8a44bc907bf71e30d4ab7288 WatchSource:0}: Error finding container 9d9e55809354362080954d8039bfd7b4394e545b8a44bc907bf71e30d4ab7288: Status 404 returned error can't find the container with id 9d9e55809354362080954d8039bfd7b4394e545b8a44bc907bf71e30d4ab7288 Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.381755 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.382131 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.388941 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.389431 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.496772 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jps4r" 
event={"ID":"2f639c4e-64b8-45e9-bf33-c1d8c376b438","Type":"ContainerStarted","Data":"972832e9972527dceaf81093e0a909c97ff2667d0ccbc2bc33966538d084245b"} Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.497020 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jps4r" event={"ID":"2f639c4e-64b8-45e9-bf33-c1d8c376b438","Type":"ContainerStarted","Data":"9d9e55809354362080954d8039bfd7b4394e545b8a44bc907bf71e30d4ab7288"} Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.497884 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"47af6b77aa1702865b3d217075ca59ab36c243e6bb360d65e62c73b372897c70"} Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.497958 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fb78f201747e8b6b0fbfb3771649a8e4fa3501fa682b9d3978812012cc07a527"} Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.569237 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 08 19:39:16 crc kubenswrapper[4885]: I0308 19:39:16.683267 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:39:16 crc kubenswrapper[4885]: W0308 19:39:16.915123 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-efca47b0d41da71b96ce3404845981447176613dea11a0dfe26aaff2ff39269f WatchSource:0}: Error finding container efca47b0d41da71b96ce3404845981447176613dea11a0dfe26aaff2ff39269f: Status 404 returned error can't find the container with id efca47b0d41da71b96ce3404845981447176613dea11a0dfe26aaff2ff39269f Mar 08 19:39:17 crc kubenswrapper[4885]: I0308 19:39:17.513111 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a005623d36a21c04f7a754369f96097597f0f99d9b302fa7f7b7061a8b391370"} Mar 08 19:39:17 crc kubenswrapper[4885]: I0308 19:39:17.513179 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"d61969643d68289f2dccff9b4bccdb6b82ba4fee905f240c6b6642ddcda13790"} Mar 08 19:39:17 crc kubenswrapper[4885]: I0308 19:39:17.515840 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jps4r" event={"ID":"2f639c4e-64b8-45e9-bf33-c1d8c376b438","Type":"ContainerStarted","Data":"dff68ddffa61ea6d8e83c3a07abf0ab18086e0f31a56d7e2b7b734bf137f28c6"} Mar 08 19:39:17 crc kubenswrapper[4885]: I0308 19:39:17.519210 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"744b4698473108a806e0119f4d388c2ce6e22a844ade5daf02a3040ee6078138"} Mar 08 19:39:17 crc kubenswrapper[4885]: I0308 19:39:17.519281 4885 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"efca47b0d41da71b96ce3404845981447176613dea11a0dfe26aaff2ff39269f"} Mar 08 19:39:17 crc kubenswrapper[4885]: I0308 19:39:17.519509 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.742894 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jps4r" podStartSLOduration=434.742863298 podStartE2EDuration="7m14.742863298s" podCreationTimestamp="2026-03-08 19:32:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:39:17.606521587 +0000 UTC m=+459.002575650" watchObservedRunningTime="2026-03-08 19:39:29.742863298 +0000 UTC m=+471.138917361" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.751592 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7btr2"] Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752004 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752049 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752082 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752102 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752132 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752149 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752177 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752197 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752224 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752242 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752265 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752282 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" 
containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752297 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752310 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752326 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752336 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752350 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752361 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752374 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752385 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="extract-utilities" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752402 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752413 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752448 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752459 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752477 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752487 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752504 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752514 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752527 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83de4c2d-767a-4635-8748-486dd45683a1" containerName="marketplace-operator" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752537 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="83de4c2d-767a-4635-8748-486dd45683a1" containerName="marketplace-operator" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752548 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752558 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="extract-content" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752572 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71a3a17a-ffa4-4d31-94fb-7e720297e94b" containerName="oc" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752581 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="71a3a17a-ffa4-4d31-94fb-7e720297e94b" containerName="oc" Mar 08 19:39:29 crc kubenswrapper[4885]: E0308 19:39:29.752593 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752602 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752734 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a6b85b3-0bb1-4199-983f-615a6c932f09" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752748 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="05666e0b-c4ce-451a-ba67-ddb78866ef54" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752763 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8881ba5e-d9d1-42a9-98af-849e72053757" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752775 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="71a3a17a-ffa4-4d31-94fb-7e720297e94b" containerName="oc" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752791 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d8fbc68-3714-4fe4-9f62-857c5dc05661" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752809 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="83de4c2d-767a-4635-8748-486dd45683a1" containerName="marketplace-operator" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752821 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c146b0-3448-4140-8cf0-8d637f7f22a9" containerName="registry-server" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.752837 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.753473 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.770565 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7btr2"] Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.867798 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cf151e9-2721-48a3-825e-c74a1caa0a76-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.867853 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqc7z\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-kube-api-access-tqc7z\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.867870 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cf151e9-2721-48a3-825e-c74a1caa0a76-trusted-ca\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.867904 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.867946 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cf151e9-2721-48a3-825e-c74a1caa0a76-registry-certificates\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.867984 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cf151e9-2721-48a3-825e-c74a1caa0a76-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.868009 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-bound-sa-token\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.868023 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-registry-tls\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.891798 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.969078 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cf151e9-2721-48a3-825e-c74a1caa0a76-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.969447 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-bound-sa-token\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.969489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-registry-tls\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.969542 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cf151e9-2721-48a3-825e-c74a1caa0a76-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.969584 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqc7z\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-kube-api-access-tqc7z\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.969621 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cf151e9-2721-48a3-825e-c74a1caa0a76-trusted-ca\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.969683 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cf151e9-2721-48a3-825e-c74a1caa0a76-registry-certificates\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.970392 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2cf151e9-2721-48a3-825e-c74a1caa0a76-ca-trust-extracted\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.971507 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2cf151e9-2721-48a3-825e-c74a1caa0a76-trusted-ca\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.972153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2cf151e9-2721-48a3-825e-c74a1caa0a76-registry-certificates\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.975685 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-registry-tls\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.976016 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2cf151e9-2721-48a3-825e-c74a1caa0a76-installation-pull-secrets\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.983346 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-bound-sa-token\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:29 crc kubenswrapper[4885]: I0308 19:39:29.988323 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqc7z\" (UniqueName: \"kubernetes.io/projected/2cf151e9-2721-48a3-825e-c74a1caa0a76-kube-api-access-tqc7z\") pod \"image-registry-66df7c8f76-7btr2\" (UID: \"2cf151e9-2721-48a3-825e-c74a1caa0a76\") " pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:30 crc kubenswrapper[4885]: I0308 19:39:30.074853 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:30 crc kubenswrapper[4885]: I0308 19:39:30.597543 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-7btr2"] Mar 08 19:39:30 crc kubenswrapper[4885]: W0308 19:39:30.606356 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cf151e9_2721_48a3_825e_c74a1caa0a76.slice/crio-27a1367e28d4aba3733fe0fe4b54f6b2881715a0140b1fa7bee690265837269b WatchSource:0}: Error finding container 27a1367e28d4aba3733fe0fe4b54f6b2881715a0140b1fa7bee690265837269b: Status 404 returned error can't find the container with id 27a1367e28d4aba3733fe0fe4b54f6b2881715a0140b1fa7bee690265837269b Mar 08 19:39:30 crc kubenswrapper[4885]: I0308 19:39:30.699616 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" event={"ID":"2cf151e9-2721-48a3-825e-c74a1caa0a76","Type":"ContainerStarted","Data":"27a1367e28d4aba3733fe0fe4b54f6b2881715a0140b1fa7bee690265837269b"} Mar 08 19:39:31 crc kubenswrapper[4885]: I0308 19:39:31.714646 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" event={"ID":"2cf151e9-2721-48a3-825e-c74a1caa0a76","Type":"ContainerStarted","Data":"14d1f3da52a97070bff36e9722cd8d71ef20ee20c61a4b32cd602df90312ecfe"} Mar 08 19:39:31 crc kubenswrapper[4885]: I0308 19:39:31.715176 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:31 crc kubenswrapper[4885]: I0308 19:39:31.753501 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" podStartSLOduration=2.753428614 podStartE2EDuration="2.753428614s" podCreationTimestamp="2026-03-08 19:39:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:39:31.750779623 +0000 UTC m=+473.146833676" watchObservedRunningTime="2026-03-08 19:39:31.753428614 +0000 UTC m=+473.149482667" Mar 08 19:39:32 crc kubenswrapper[4885]: I0308 19:39:32.818068 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:39:32 crc kubenswrapper[4885]: I0308 19:39:32.818399 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.161145 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6ctxc"] Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.162109 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6ctxc" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.163885 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.183631 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6ctxc"] Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.216535 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751589b1-c864-424f-9315-13a7d880bcf6-utilities\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.216758 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnw7v\" (UniqueName: \"kubernetes.io/projected/751589b1-c864-424f-9315-13a7d880bcf6-kube-api-access-vnw7v\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.216818 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751589b1-c864-424f-9315-13a7d880bcf6-catalog-content\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.318645 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751589b1-c864-424f-9315-13a7d880bcf6-utilities\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.318866 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnw7v\" (UniqueName: \"kubernetes.io/projected/751589b1-c864-424f-9315-13a7d880bcf6-kube-api-access-vnw7v\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.318956 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751589b1-c864-424f-9315-13a7d880bcf6-catalog-content\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.319818 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751589b1-c864-424f-9315-13a7d880bcf6-utilities\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.319864 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751589b1-c864-424f-9315-13a7d880bcf6-catalog-content\") pod \"redhat-marketplace-6ctxc\" (UID: 
\"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.346088 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnw7v\" (UniqueName: \"kubernetes.io/projected/751589b1-c864-424f-9315-13a7d880bcf6-kube-api-access-vnw7v\") pod \"redhat-marketplace-6ctxc\" (UID: \"751589b1-c864-424f-9315-13a7d880bcf6\") " pod="openshift-marketplace/redhat-marketplace-6ctxc" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.361428 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bs2qs"] Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.363443 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bs2qs" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.365801 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.388818 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bs2qs"] Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.420021 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w28js\" (UniqueName: \"kubernetes.io/projected/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-kube-api-access-w28js\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.420137 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-utilities\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.420206 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-catalog-content\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.491502 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6ctxc" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.521168 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w28js\" (UniqueName: \"kubernetes.io/projected/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-kube-api-access-w28js\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.521313 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-utilities\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.521411 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-catalog-content\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.521971 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-catalog-content\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.522069 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-utilities\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.539739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w28js\" (UniqueName: \"kubernetes.io/projected/bdc128cf-2f55-4964-8229-6aa7e1dd9f1e-kube-api-access-w28js\") pod \"redhat-operators-bs2qs\" (UID: \"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e\") " pod="openshift-marketplace/redhat-operators-bs2qs" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.688912 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bs2qs" Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.717985 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6ctxc"] Mar 08 19:39:33 crc kubenswrapper[4885]: W0308 19:39:33.726101 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod751589b1_c864_424f_9315_13a7d880bcf6.slice/crio-e4d9ea5fb345651a9abba046949c2263214c357d4835a6d860e4e3646f4268a4 WatchSource:0}: Error finding container e4d9ea5fb345651a9abba046949c2263214c357d4835a6d860e4e3646f4268a4: Status 404 returned error can't find the container with id e4d9ea5fb345651a9abba046949c2263214c357d4835a6d860e4e3646f4268a4 Mar 08 19:39:33 crc kubenswrapper[4885]: I0308 19:39:33.938383 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bs2qs"] Mar 08 19:39:33 crc kubenswrapper[4885]: W0308 19:39:33.962286 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdc128cf_2f55_4964_8229_6aa7e1dd9f1e.slice/crio-8dcf908b7f40608c33a9c435ac396917389b126d38beefad066b46752dca657c WatchSource:0}: Error finding container 8dcf908b7f40608c33a9c435ac396917389b126d38beefad066b46752dca657c: Status 404 returned error can't find the container with id 8dcf908b7f40608c33a9c435ac396917389b126d38beefad066b46752dca657c Mar 08 19:39:34 crc kubenswrapper[4885]: I0308 19:39:34.738330 4885 generic.go:334] "Generic (PLEG): container finished" podID="bdc128cf-2f55-4964-8229-6aa7e1dd9f1e" containerID="92283bbdc51675a1f97ec35414854fa5ba662317b0bdeaeb0793f1df4f4bbea0" exitCode=0 Mar 08 19:39:34 crc kubenswrapper[4885]: I0308 19:39:34.739272 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2qs" event={"ID":"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e","Type":"ContainerDied","Data":"92283bbdc51675a1f97ec35414854fa5ba662317b0bdeaeb0793f1df4f4bbea0"} Mar 08 19:39:34 crc kubenswrapper[4885]: I0308 19:39:34.739420 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2qs" event={"ID":"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e","Type":"ContainerStarted","Data":"8dcf908b7f40608c33a9c435ac396917389b126d38beefad066b46752dca657c"} Mar 08 19:39:34 crc kubenswrapper[4885]: I0308 19:39:34.745423 4885 generic.go:334] "Generic (PLEG): container finished" podID="751589b1-c864-424f-9315-13a7d880bcf6" containerID="c92e401a0ca9753fe02c9b9adf3141c04b04648a84aef52488db224f7d2d516c" exitCode=0 Mar 08 19:39:34 crc kubenswrapper[4885]: I0308 19:39:34.745493 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ctxc" event={"ID":"751589b1-c864-424f-9315-13a7d880bcf6","Type":"ContainerDied","Data":"c92e401a0ca9753fe02c9b9adf3141c04b04648a84aef52488db224f7d2d516c"} Mar 08 19:39:34 crc kubenswrapper[4885]: I0308 19:39:34.745536 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ctxc" event={"ID":"751589b1-c864-424f-9315-13a7d880bcf6","Type":"ContainerStarted","Data":"e4d9ea5fb345651a9abba046949c2263214c357d4835a6d860e4e3646f4268a4"} Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.577239 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2tkhk"] Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.579793 4885 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.582590 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.589615 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2tkhk"] Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.749558 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-utilities\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.749629 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-catalog-content\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.749691 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prwnl\" (UniqueName: \"kubernetes.io/projected/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-kube-api-access-prwnl\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.754563 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ctxc" event={"ID":"751589b1-c864-424f-9315-13a7d880bcf6","Type":"ContainerStarted","Data":"32e6eb0387e68198e926f2c853105ade82a1d566e9e146dcc2cd1e280b25141e"} Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.755857 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2qs" event={"ID":"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e","Type":"ContainerStarted","Data":"0361b6e3aad753b5f6696df09aba7a79ad44c5e6aa3bcf8f446d9bb251ff92a2"} Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.762015 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xjhsv"] Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.763737 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.767875 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.783750 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xjhsv"] Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.851025 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-utilities\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.851341 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-catalog-content\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.851474 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prwnl\" (UniqueName: \"kubernetes.io/projected/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-kube-api-access-prwnl\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.851700 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-utilities\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.851702 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-catalog-content\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.871709 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prwnl\" (UniqueName: \"kubernetes.io/projected/2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9-kube-api-access-prwnl\") pod \"community-operators-2tkhk\" (UID: \"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9\") " pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.920243 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.952797 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2914e8af-92f9-40a3-99ea-a52bfaf31a36-utilities\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.952901 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thrcl\" (UniqueName: \"kubernetes.io/projected/2914e8af-92f9-40a3-99ea-a52bfaf31a36-kube-api-access-thrcl\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:35 crc kubenswrapper[4885]: I0308 19:39:35.953003 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2914e8af-92f9-40a3-99ea-a52bfaf31a36-catalog-content\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.055722 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2914e8af-92f9-40a3-99ea-a52bfaf31a36-catalog-content\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.055848 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2914e8af-92f9-40a3-99ea-a52bfaf31a36-utilities\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.055896 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thrcl\" (UniqueName: \"kubernetes.io/projected/2914e8af-92f9-40a3-99ea-a52bfaf31a36-kube-api-access-thrcl\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.056676 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2914e8af-92f9-40a3-99ea-a52bfaf31a36-catalog-content\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.056717 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2914e8af-92f9-40a3-99ea-a52bfaf31a36-utilities\") pod \"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.079640 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thrcl\" (UniqueName: \"kubernetes.io/projected/2914e8af-92f9-40a3-99ea-a52bfaf31a36-kube-api-access-thrcl\") pod 
\"certified-operators-xjhsv\" (UID: \"2914e8af-92f9-40a3-99ea-a52bfaf31a36\") " pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.172565 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2tkhk"] Mar 08 19:39:36 crc kubenswrapper[4885]: W0308 19:39:36.184220 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b34f7ab_2ff3_40fd_8a23_82b9ff4536e9.slice/crio-27b83a6bb0603b7e9cf99e3e16945b5cd2e48181be158667c4512389cda41a3e WatchSource:0}: Error finding container 27b83a6bb0603b7e9cf99e3e16945b5cd2e48181be158667c4512389cda41a3e: Status 404 returned error can't find the container with id 27b83a6bb0603b7e9cf99e3e16945b5cd2e48181be158667c4512389cda41a3e Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.380850 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.567129 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xjhsv"] Mar 08 19:39:36 crc kubenswrapper[4885]: W0308 19:39:36.571419 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2914e8af_92f9_40a3_99ea_a52bfaf31a36.slice/crio-d4faa57a28891dc1d07e8aacecb90d5cb6b3925d898048a4f99fe3dafe91c8c2 WatchSource:0}: Error finding container d4faa57a28891dc1d07e8aacecb90d5cb6b3925d898048a4f99fe3dafe91c8c2: Status 404 returned error can't find the container with id d4faa57a28891dc1d07e8aacecb90d5cb6b3925d898048a4f99fe3dafe91c8c2 Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.761803 4885 generic.go:334] "Generic (PLEG): container finished" podID="2914e8af-92f9-40a3-99ea-a52bfaf31a36" containerID="97dffe85983491bd037e407581a64e9ae51f74e2c9fced366c765c47287882f7" exitCode=0 Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.761876 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjhsv" event={"ID":"2914e8af-92f9-40a3-99ea-a52bfaf31a36","Type":"ContainerDied","Data":"97dffe85983491bd037e407581a64e9ae51f74e2c9fced366c765c47287882f7"} Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.762204 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjhsv" event={"ID":"2914e8af-92f9-40a3-99ea-a52bfaf31a36","Type":"ContainerStarted","Data":"d4faa57a28891dc1d07e8aacecb90d5cb6b3925d898048a4f99fe3dafe91c8c2"} Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.765533 4885 generic.go:334] "Generic (PLEG): container finished" podID="751589b1-c864-424f-9315-13a7d880bcf6" containerID="32e6eb0387e68198e926f2c853105ade82a1d566e9e146dcc2cd1e280b25141e" exitCode=0 Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.765599 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ctxc" event={"ID":"751589b1-c864-424f-9315-13a7d880bcf6","Type":"ContainerDied","Data":"32e6eb0387e68198e926f2c853105ade82a1d566e9e146dcc2cd1e280b25141e"} Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.769610 4885 generic.go:334] "Generic (PLEG): container finished" podID="bdc128cf-2f55-4964-8229-6aa7e1dd9f1e" containerID="0361b6e3aad753b5f6696df09aba7a79ad44c5e6aa3bcf8f446d9bb251ff92a2" exitCode=0 Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.769672 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2qs" event={"ID":"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e","Type":"ContainerDied","Data":"0361b6e3aad753b5f6696df09aba7a79ad44c5e6aa3bcf8f446d9bb251ff92a2"} Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.772237 4885 generic.go:334] "Generic (PLEG): container finished" podID="2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9" containerID="6049e9245cb496c144433f4e2fc90a4d6dac1fbaf2d8e3721d7ff19427bd4661" exitCode=0 Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.772275 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tkhk" event={"ID":"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9","Type":"ContainerDied","Data":"6049e9245cb496c144433f4e2fc90a4d6dac1fbaf2d8e3721d7ff19427bd4661"} Mar 08 19:39:36 crc kubenswrapper[4885]: I0308 19:39:36.772313 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tkhk" event={"ID":"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9","Type":"ContainerStarted","Data":"27b83a6bb0603b7e9cf99e3e16945b5cd2e48181be158667c4512389cda41a3e"} Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.781510 4885 generic.go:334] "Generic (PLEG): container finished" podID="2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9" containerID="d9226a8cd45f91a8c02e9b345d84059dcf8c500de7058e4c867cb59dcd4befef" exitCode=0 Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.781597 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tkhk" event={"ID":"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9","Type":"ContainerDied","Data":"d9226a8cd45f91a8c02e9b345d84059dcf8c500de7058e4c867cb59dcd4befef"} Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.785976 4885 generic.go:334] "Generic (PLEG): container finished" podID="2914e8af-92f9-40a3-99ea-a52bfaf31a36" containerID="37068f533f46fc4af386aba52810bc9625ff56194945929f07b856f065caeb4b" exitCode=0 Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.786030 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjhsv" event={"ID":"2914e8af-92f9-40a3-99ea-a52bfaf31a36","Type":"ContainerDied","Data":"37068f533f46fc4af386aba52810bc9625ff56194945929f07b856f065caeb4b"} Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.790426 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6ctxc" event={"ID":"751589b1-c864-424f-9315-13a7d880bcf6","Type":"ContainerStarted","Data":"332b2321d2cfb4d39de4da7d751c1878a3faece3fa394bd9b703c026b4619029"} Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.793087 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bs2qs" event={"ID":"bdc128cf-2f55-4964-8229-6aa7e1dd9f1e","Type":"ContainerStarted","Data":"897e4783123371a6a16ebc1dc821428baff397298d556a655505ce30473ca924"} Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.851643 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6ctxc" podStartSLOduration=2.43857022 podStartE2EDuration="4.851620207s" podCreationTimestamp="2026-03-08 19:39:33 +0000 UTC" firstStartedPulling="2026-03-08 19:39:34.749154764 +0000 UTC m=+476.145208797" lastFinishedPulling="2026-03-08 19:39:37.162204761 +0000 UTC m=+478.558258784" observedRunningTime="2026-03-08 19:39:37.83400231 +0000 UTC m=+479.230056343" watchObservedRunningTime="2026-03-08 19:39:37.851620207 +0000 
UTC m=+479.247674230" Mar 08 19:39:37 crc kubenswrapper[4885]: I0308 19:39:37.853624 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bs2qs" podStartSLOduration=2.381635779 podStartE2EDuration="4.85361532s" podCreationTimestamp="2026-03-08 19:39:33 +0000 UTC" firstStartedPulling="2026-03-08 19:39:34.741150752 +0000 UTC m=+476.137204815" lastFinishedPulling="2026-03-08 19:39:37.213130303 +0000 UTC m=+478.609184356" observedRunningTime="2026-03-08 19:39:37.851773091 +0000 UTC m=+479.247827134" watchObservedRunningTime="2026-03-08 19:39:37.85361532 +0000 UTC m=+479.249669353" Mar 08 19:39:38 crc kubenswrapper[4885]: I0308 19:39:38.802132 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2tkhk" event={"ID":"2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9","Type":"ContainerStarted","Data":"da8ad3e698a9cc0fcf4d86dc50c169cc94ede477a183a379520f4f7956dd3886"} Mar 08 19:39:38 crc kubenswrapper[4885]: I0308 19:39:38.810258 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xjhsv" event={"ID":"2914e8af-92f9-40a3-99ea-a52bfaf31a36","Type":"ContainerStarted","Data":"813d97879d71906ae6bfa2a82a92be63704d49c1e2aedbce45e1bbd8d42db093"} Mar 08 19:39:38 crc kubenswrapper[4885]: I0308 19:39:38.830693 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2tkhk" podStartSLOduration=2.447087582 podStartE2EDuration="3.830679059s" podCreationTimestamp="2026-03-08 19:39:35 +0000 UTC" firstStartedPulling="2026-03-08 19:39:36.775137679 +0000 UTC m=+478.171191732" lastFinishedPulling="2026-03-08 19:39:38.158729176 +0000 UTC m=+479.554783209" observedRunningTime="2026-03-08 19:39:38.827027592 +0000 UTC m=+480.223081615" watchObservedRunningTime="2026-03-08 19:39:38.830679059 +0000 UTC m=+480.226733082" Mar 08 19:39:38 crc kubenswrapper[4885]: I0308 19:39:38.851472 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xjhsv" podStartSLOduration=2.370454137 podStartE2EDuration="3.85145036s" podCreationTimestamp="2026-03-08 19:39:35 +0000 UTC" firstStartedPulling="2026-03-08 19:39:36.764903858 +0000 UTC m=+478.160957881" lastFinishedPulling="2026-03-08 19:39:38.245900081 +0000 UTC m=+479.641954104" observedRunningTime="2026-03-08 19:39:38.848583794 +0000 UTC m=+480.244637817" watchObservedRunningTime="2026-03-08 19:39:38.85145036 +0000 UTC m=+480.247504393" Mar 08 19:39:43 crc kubenswrapper[4885]: I0308 19:39:43.492952 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6ctxc" Mar 08 19:39:43 crc kubenswrapper[4885]: I0308 19:39:43.493334 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6ctxc" Mar 08 19:39:43 crc kubenswrapper[4885]: I0308 19:39:43.549242 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6ctxc" Mar 08 19:39:43 crc kubenswrapper[4885]: I0308 19:39:43.689590 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bs2qs" Mar 08 19:39:43 crc kubenswrapper[4885]: I0308 19:39:43.689667 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bs2qs" Mar 08 19:39:43 crc kubenswrapper[4885]: I0308 
19:39:43.910745 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6ctxc" Mar 08 19:39:44 crc kubenswrapper[4885]: I0308 19:39:44.734691 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bs2qs" podUID="bdc128cf-2f55-4964-8229-6aa7e1dd9f1e" containerName="registry-server" probeResult="failure" output=< Mar 08 19:39:44 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 19:39:44 crc kubenswrapper[4885]: > Mar 08 19:39:45 crc kubenswrapper[4885]: I0308 19:39:45.427977 4885 scope.go:117] "RemoveContainer" containerID="90ed4181a9dc63986e554e94dce2f763a09e37ede6c0fabdc28e147d26363548" Mar 08 19:39:45 crc kubenswrapper[4885]: I0308 19:39:45.920794 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:45 crc kubenswrapper[4885]: I0308 19:39:45.920890 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:45 crc kubenswrapper[4885]: I0308 19:39:45.972181 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:46 crc kubenswrapper[4885]: I0308 19:39:46.381439 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:46 crc kubenswrapper[4885]: I0308 19:39:46.381915 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:46 crc kubenswrapper[4885]: I0308 19:39:46.440787 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:46 crc kubenswrapper[4885]: I0308 19:39:46.915381 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xjhsv" Mar 08 19:39:46 crc kubenswrapper[4885]: I0308 19:39:46.936679 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2tkhk" Mar 08 19:39:50 crc kubenswrapper[4885]: I0308 19:39:50.082341 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-7btr2" Mar 08 19:39:50 crc kubenswrapper[4885]: I0308 19:39:50.154156 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4xs78"] Mar 08 19:39:53 crc kubenswrapper[4885]: I0308 19:39:53.754098 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bs2qs" Mar 08 19:39:53 crc kubenswrapper[4885]: I0308 19:39:53.826184 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bs2qs" Mar 08 19:39:56 crc kubenswrapper[4885]: I0308 19:39:56.690635 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.153083 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549980-lx7sw"] Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.154837 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549980-lx7sw" Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.156987 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.157501 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.158533 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.164356 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549980-lx7sw"] Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.225837 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x59f\" (UniqueName: \"kubernetes.io/projected/ffc137d5-821a-406d-8db5-d396d0091991-kube-api-access-2x59f\") pod \"auto-csr-approver-29549980-lx7sw\" (UID: \"ffc137d5-821a-406d-8db5-d396d0091991\") " pod="openshift-infra/auto-csr-approver-29549980-lx7sw" Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.327656 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x59f\" (UniqueName: \"kubernetes.io/projected/ffc137d5-821a-406d-8db5-d396d0091991-kube-api-access-2x59f\") pod \"auto-csr-approver-29549980-lx7sw\" (UID: \"ffc137d5-821a-406d-8db5-d396d0091991\") " pod="openshift-infra/auto-csr-approver-29549980-lx7sw" Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.361521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x59f\" (UniqueName: \"kubernetes.io/projected/ffc137d5-821a-406d-8db5-d396d0091991-kube-api-access-2x59f\") pod \"auto-csr-approver-29549980-lx7sw\" (UID: \"ffc137d5-821a-406d-8db5-d396d0091991\") " pod="openshift-infra/auto-csr-approver-29549980-lx7sw" Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.491875 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549980-lx7sw" Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.719897 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549980-lx7sw"] Mar 08 19:40:00 crc kubenswrapper[4885]: I0308 19:40:00.995445 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549980-lx7sw" event={"ID":"ffc137d5-821a-406d-8db5-d396d0091991","Type":"ContainerStarted","Data":"b1ccc77e9d3d51c4820645e33938e38051251e7b2f1a89c945f3a73c5831bac8"} Mar 08 19:40:02 crc kubenswrapper[4885]: I0308 19:40:02.854660 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:40:02 crc kubenswrapper[4885]: I0308 19:40:02.855323 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:40:02 crc kubenswrapper[4885]: I0308 19:40:02.855386 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:40:02 crc kubenswrapper[4885]: I0308 19:40:02.856260 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7301efe622f6965ef0088239fdd6ca59b9a8395c4d2bba8dc311752a026260dc"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 19:40:02 crc kubenswrapper[4885]: I0308 19:40:02.856383 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://7301efe622f6965ef0088239fdd6ca59b9a8395c4d2bba8dc311752a026260dc" gracePeriod=600 Mar 08 19:40:03 crc kubenswrapper[4885]: I0308 19:40:03.014421 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="7301efe622f6965ef0088239fdd6ca59b9a8395c4d2bba8dc311752a026260dc" exitCode=0 Mar 08 19:40:03 crc kubenswrapper[4885]: I0308 19:40:03.014514 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"7301efe622f6965ef0088239fdd6ca59b9a8395c4d2bba8dc311752a026260dc"} Mar 08 19:40:03 crc kubenswrapper[4885]: I0308 19:40:03.014611 4885 scope.go:117] "RemoveContainer" containerID="c01c23ab9c825ecf6feac9d15bea36a7d2963d853406316ac1db6e3fdc0a132a" Mar 08 19:40:03 crc kubenswrapper[4885]: I0308 19:40:03.019630 4885 generic.go:334] "Generic (PLEG): container finished" podID="ffc137d5-821a-406d-8db5-d396d0091991" containerID="6b3edc0ab6930c447e72d1b9e0e05c67fcbcc8c8cb108ba4b449e1f4acc1e00e" exitCode=0 Mar 08 19:40:03 crc kubenswrapper[4885]: I0308 19:40:03.019710 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29549980-lx7sw" event={"ID":"ffc137d5-821a-406d-8db5-d396d0091991","Type":"ContainerDied","Data":"6b3edc0ab6930c447e72d1b9e0e05c67fcbcc8c8cb108ba4b449e1f4acc1e00e"} Mar 08 19:40:04 crc kubenswrapper[4885]: I0308 19:40:04.029871 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"efa4fb8889b5532d28bec018d7816a61ed0ea017834833ffed5162643372e98f"} Mar 08 19:40:04 crc kubenswrapper[4885]: I0308 19:40:04.355185 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549980-lx7sw" Mar 08 19:40:04 crc kubenswrapper[4885]: I0308 19:40:04.390991 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x59f\" (UniqueName: \"kubernetes.io/projected/ffc137d5-821a-406d-8db5-d396d0091991-kube-api-access-2x59f\") pod \"ffc137d5-821a-406d-8db5-d396d0091991\" (UID: \"ffc137d5-821a-406d-8db5-d396d0091991\") " Mar 08 19:40:04 crc kubenswrapper[4885]: I0308 19:40:04.410036 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffc137d5-821a-406d-8db5-d396d0091991-kube-api-access-2x59f" (OuterVolumeSpecName: "kube-api-access-2x59f") pod "ffc137d5-821a-406d-8db5-d396d0091991" (UID: "ffc137d5-821a-406d-8db5-d396d0091991"). InnerVolumeSpecName "kube-api-access-2x59f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:40:04 crc kubenswrapper[4885]: I0308 19:40:04.493709 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x59f\" (UniqueName: \"kubernetes.io/projected/ffc137d5-821a-406d-8db5-d396d0091991-kube-api-access-2x59f\") on node \"crc\" DevicePath \"\"" Mar 08 19:40:05 crc kubenswrapper[4885]: I0308 19:40:05.042347 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549980-lx7sw" event={"ID":"ffc137d5-821a-406d-8db5-d396d0091991","Type":"ContainerDied","Data":"b1ccc77e9d3d51c4820645e33938e38051251e7b2f1a89c945f3a73c5831bac8"} Mar 08 19:40:05 crc kubenswrapper[4885]: I0308 19:40:05.042392 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1ccc77e9d3d51c4820645e33938e38051251e7b2f1a89c945f3a73c5831bac8" Mar 08 19:40:05 crc kubenswrapper[4885]: I0308 19:40:05.042354 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549980-lx7sw" Mar 08 19:40:05 crc kubenswrapper[4885]: I0308 19:40:05.425479 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549974-jjqkh"] Mar 08 19:40:05 crc kubenswrapper[4885]: I0308 19:40:05.428681 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549974-jjqkh"] Mar 08 19:40:07 crc kubenswrapper[4885]: I0308 19:40:07.381611 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60da1edb-8474-4368-a6ae-0bb2b1b7b845" path="/var/lib/kubelet/pods/60da1edb-8474-4368-a6ae-0bb2b1b7b845/volumes" Mar 08 19:40:15 crc kubenswrapper[4885]: I0308 19:40:15.210604 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" podUID="d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" containerName="registry" containerID="cri-o://f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c" gracePeriod=30 Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.645399 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667430 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfrhm\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-kube-api-access-mfrhm\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667503 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-certificates\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667532 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-tls\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667571 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-trusted-ca\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667615 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-bound-sa-token\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667731 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667825 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-installation-pull-secrets\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.667870 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-ca-trust-extracted\") pod \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\" (UID: \"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1\") " Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.668571 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.668728 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.674789 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-kube-api-access-mfrhm" (OuterVolumeSpecName: "kube-api-access-mfrhm") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "kube-api-access-mfrhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.675881 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.676140 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.676500 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.679595 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.684803 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" (UID: "d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.769601 4885 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.769645 4885 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.769664 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.769681 4885 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.769751 4885 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.769773 4885 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:15.769791 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfrhm\" (UniqueName: \"kubernetes.io/projected/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1-kube-api-access-mfrhm\") on node \"crc\" DevicePath \"\"" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.133338 4885 generic.go:334] "Generic (PLEG): container finished" podID="d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" containerID="f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c" exitCode=0 Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.133404 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" event={"ID":"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1","Type":"ContainerDied","Data":"f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c"} Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 
19:40:16.133448 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.133504 4885 scope.go:117] "RemoveContainer" containerID="f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.133484 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4xs78" event={"ID":"d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1","Type":"ContainerDied","Data":"d5dd902e3ef717231619c64b1b91e79b07a9f0b3233c92d0567cafca72b99c09"} Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.168073 4885 scope.go:117] "RemoveContainer" containerID="f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c" Mar 08 19:40:16 crc kubenswrapper[4885]: E0308 19:40:16.168794 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c\": container with ID starting with f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c not found: ID does not exist" containerID="f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.169164 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c"} err="failed to get container status \"f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c\": rpc error: code = NotFound desc = could not find container \"f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c\": container with ID starting with f3b546f4d5bbbecf055ec0e43b3cdfb7f6c417454125a1ec4854b42f6f55ae4c not found: ID does not exist" Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.189302 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4xs78"] Mar 08 19:40:16 crc kubenswrapper[4885]: I0308 19:40:16.204907 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4xs78"] Mar 08 19:40:17 crc kubenswrapper[4885]: I0308 19:40:17.380162 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" path="/var/lib/kubelet/pods/d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1/volumes" Mar 08 19:41:45 crc kubenswrapper[4885]: I0308 19:41:45.503624 4885 scope.go:117] "RemoveContainer" containerID="67a173c7d23f4e826672d366138f3bcda3d03e275f18ddffd328f814fe2d0924" Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.148051 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549982-jnpc6"] Mar 08 19:42:00 crc kubenswrapper[4885]: E0308 19:42:00.149661 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" containerName="registry" Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.149687 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" containerName="registry" Mar 08 19:42:00 crc kubenswrapper[4885]: E0308 19:42:00.149719 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffc137d5-821a-406d-8db5-d396d0091991" containerName="oc" Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.149732 4885 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ffc137d5-821a-406d-8db5-d396d0091991" containerName="oc" Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.149888 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffc137d5-821a-406d-8db5-d396d0091991" containerName="oc" Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.149958 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d73649f7-56a7-4cd0-bc0b-8b9f26c8abc1" containerName="registry" Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.151499 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549982-jnpc6" Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.154044 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.154748 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.156519 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.160632 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549982-jnpc6"] Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.338022 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ndnv\" (UniqueName: \"kubernetes.io/projected/30a90e18-1089-40ae-a5f0-f43b1d252129-kube-api-access-8ndnv\") pod \"auto-csr-approver-29549982-jnpc6\" (UID: \"30a90e18-1089-40ae-a5f0-f43b1d252129\") " pod="openshift-infra/auto-csr-approver-29549982-jnpc6" Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.439142 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ndnv\" (UniqueName: \"kubernetes.io/projected/30a90e18-1089-40ae-a5f0-f43b1d252129-kube-api-access-8ndnv\") pod \"auto-csr-approver-29549982-jnpc6\" (UID: \"30a90e18-1089-40ae-a5f0-f43b1d252129\") " pod="openshift-infra/auto-csr-approver-29549982-jnpc6" Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.471951 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ndnv\" (UniqueName: \"kubernetes.io/projected/30a90e18-1089-40ae-a5f0-f43b1d252129-kube-api-access-8ndnv\") pod \"auto-csr-approver-29549982-jnpc6\" (UID: \"30a90e18-1089-40ae-a5f0-f43b1d252129\") " pod="openshift-infra/auto-csr-approver-29549982-jnpc6" Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.486496 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549982-jnpc6" Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.718352 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549982-jnpc6"] Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.725327 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 19:42:00 crc kubenswrapper[4885]: I0308 19:42:00.863764 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549982-jnpc6" event={"ID":"30a90e18-1089-40ae-a5f0-f43b1d252129","Type":"ContainerStarted","Data":"6cbfe7a51bebe6572b818b81007b0d0586ecc420467ea896bbbdaf492746085b"} Mar 08 19:42:02 crc kubenswrapper[4885]: I0308 19:42:02.888123 4885 generic.go:334] "Generic (PLEG): container finished" podID="30a90e18-1089-40ae-a5f0-f43b1d252129" containerID="c050624aad83fb4c435f0fa087d6fe3ddc6c1b029b5c8f9e354ed5228ef2d3fa" exitCode=0 Mar 08 19:42:02 crc kubenswrapper[4885]: I0308 19:42:02.888323 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549982-jnpc6" event={"ID":"30a90e18-1089-40ae-a5f0-f43b1d252129","Type":"ContainerDied","Data":"c050624aad83fb4c435f0fa087d6fe3ddc6c1b029b5c8f9e354ed5228ef2d3fa"} Mar 08 19:42:04 crc kubenswrapper[4885]: I0308 19:42:04.186744 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549982-jnpc6" Mar 08 19:42:04 crc kubenswrapper[4885]: I0308 19:42:04.300262 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ndnv\" (UniqueName: \"kubernetes.io/projected/30a90e18-1089-40ae-a5f0-f43b1d252129-kube-api-access-8ndnv\") pod \"30a90e18-1089-40ae-a5f0-f43b1d252129\" (UID: \"30a90e18-1089-40ae-a5f0-f43b1d252129\") " Mar 08 19:42:04 crc kubenswrapper[4885]: I0308 19:42:04.305624 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30a90e18-1089-40ae-a5f0-f43b1d252129-kube-api-access-8ndnv" (OuterVolumeSpecName: "kube-api-access-8ndnv") pod "30a90e18-1089-40ae-a5f0-f43b1d252129" (UID: "30a90e18-1089-40ae-a5f0-f43b1d252129"). InnerVolumeSpecName "kube-api-access-8ndnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:42:04 crc kubenswrapper[4885]: I0308 19:42:04.402215 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ndnv\" (UniqueName: \"kubernetes.io/projected/30a90e18-1089-40ae-a5f0-f43b1d252129-kube-api-access-8ndnv\") on node \"crc\" DevicePath \"\"" Mar 08 19:42:04 crc kubenswrapper[4885]: I0308 19:42:04.904767 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549982-jnpc6" event={"ID":"30a90e18-1089-40ae-a5f0-f43b1d252129","Type":"ContainerDied","Data":"6cbfe7a51bebe6572b818b81007b0d0586ecc420467ea896bbbdaf492746085b"} Mar 08 19:42:04 crc kubenswrapper[4885]: I0308 19:42:04.904807 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cbfe7a51bebe6572b818b81007b0d0586ecc420467ea896bbbdaf492746085b" Mar 08 19:42:04 crc kubenswrapper[4885]: I0308 19:42:04.904856 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549982-jnpc6" Mar 08 19:42:05 crc kubenswrapper[4885]: I0308 19:42:05.263182 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549976-nhqg4"] Mar 08 19:42:05 crc kubenswrapper[4885]: I0308 19:42:05.269021 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549976-nhqg4"] Mar 08 19:42:05 crc kubenswrapper[4885]: I0308 19:42:05.385890 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a29b0f8-8eee-4c05-9bbe-bebb70f16e58" path="/var/lib/kubelet/pods/5a29b0f8-8eee-4c05-9bbe-bebb70f16e58/volumes" Mar 08 19:42:32 crc kubenswrapper[4885]: I0308 19:42:32.818032 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:42:32 crc kubenswrapper[4885]: I0308 19:42:32.818702 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:42:45 crc kubenswrapper[4885]: I0308 19:42:45.574285 4885 scope.go:117] "RemoveContainer" containerID="0685a4906cd8df57a6fc2f16599ba5b339b14b8ee4e2f165c183f54473c7f2ff" Mar 08 19:42:45 crc kubenswrapper[4885]: I0308 19:42:45.640552 4885 scope.go:117] "RemoveContainer" containerID="7e9e48dffa2c1d1b35dc16a09da7078a95e25a71cb56e7dff87781cbf3d61f90" Mar 08 19:43:02 crc kubenswrapper[4885]: I0308 19:43:02.818684 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:43:02 crc kubenswrapper[4885]: I0308 19:43:02.819493 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:43:32 crc kubenswrapper[4885]: I0308 19:43:32.818795 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:43:32 crc kubenswrapper[4885]: I0308 19:43:32.819661 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:43:32 crc kubenswrapper[4885]: I0308 19:43:32.819782 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:43:32 crc kubenswrapper[4885]: I0308 19:43:32.821748 4885 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"efa4fb8889b5532d28bec018d7816a61ed0ea017834833ffed5162643372e98f"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 19:43:32 crc kubenswrapper[4885]: I0308 19:43:32.822002 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://efa4fb8889b5532d28bec018d7816a61ed0ea017834833ffed5162643372e98f" gracePeriod=600 Mar 08 19:43:33 crc kubenswrapper[4885]: I0308 19:43:33.508626 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="efa4fb8889b5532d28bec018d7816a61ed0ea017834833ffed5162643372e98f" exitCode=0 Mar 08 19:43:33 crc kubenswrapper[4885]: I0308 19:43:33.508839 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"efa4fb8889b5532d28bec018d7816a61ed0ea017834833ffed5162643372e98f"} Mar 08 19:43:33 crc kubenswrapper[4885]: I0308 19:43:33.509241 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"f94b502e469fe218787b8101e45951a2dfe1f5fc0bc5b2cb2e8b55561aeaabb2"} Mar 08 19:43:33 crc kubenswrapper[4885]: I0308 19:43:33.509280 4885 scope.go:117] "RemoveContainer" containerID="7301efe622f6965ef0088239fdd6ca59b9a8395c4d2bba8dc311752a026260dc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.147410 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549984-4fjvc"] Mar 08 19:44:00 crc kubenswrapper[4885]: E0308 19:44:00.148424 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30a90e18-1089-40ae-a5f0-f43b1d252129" containerName="oc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.148442 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="30a90e18-1089-40ae-a5f0-f43b1d252129" containerName="oc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.148586 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="30a90e18-1089-40ae-a5f0-f43b1d252129" containerName="oc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.149080 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549984-4fjvc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.151491 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.151611 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.152714 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.158273 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549984-4fjvc"] Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.257690 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjzrg\" (UniqueName: \"kubernetes.io/projected/383ac947-e1b1-4f15-98a6-69fcc60e0ac1-kube-api-access-mjzrg\") pod \"auto-csr-approver-29549984-4fjvc\" (UID: \"383ac947-e1b1-4f15-98a6-69fcc60e0ac1\") " pod="openshift-infra/auto-csr-approver-29549984-4fjvc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.360687 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjzrg\" (UniqueName: \"kubernetes.io/projected/383ac947-e1b1-4f15-98a6-69fcc60e0ac1-kube-api-access-mjzrg\") pod \"auto-csr-approver-29549984-4fjvc\" (UID: \"383ac947-e1b1-4f15-98a6-69fcc60e0ac1\") " pod="openshift-infra/auto-csr-approver-29549984-4fjvc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.400001 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjzrg\" (UniqueName: \"kubernetes.io/projected/383ac947-e1b1-4f15-98a6-69fcc60e0ac1-kube-api-access-mjzrg\") pod \"auto-csr-approver-29549984-4fjvc\" (UID: \"383ac947-e1b1-4f15-98a6-69fcc60e0ac1\") " pod="openshift-infra/auto-csr-approver-29549984-4fjvc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.474433 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549984-4fjvc" Mar 08 19:44:00 crc kubenswrapper[4885]: I0308 19:44:00.941660 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549984-4fjvc"] Mar 08 19:44:01 crc kubenswrapper[4885]: I0308 19:44:01.713473 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549984-4fjvc" event={"ID":"383ac947-e1b1-4f15-98a6-69fcc60e0ac1","Type":"ContainerStarted","Data":"94961ba79d03d33e7b40a64d757b8b45e77ba4e1d9fd1b33e83c080d4e80b3b6"} Mar 08 19:44:02 crc kubenswrapper[4885]: E0308 19:44:02.500178 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod383ac947_e1b1_4f15_98a6_69fcc60e0ac1.slice/crio-b7734316fc145363037328cab9f126d7c1da55c60bd2f7c56f716841be40429c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod383ac947_e1b1_4f15_98a6_69fcc60e0ac1.slice/crio-conmon-b7734316fc145363037328cab9f126d7c1da55c60bd2f7c56f716841be40429c.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:44:02 crc kubenswrapper[4885]: I0308 19:44:02.724505 4885 generic.go:334] "Generic (PLEG): container finished" podID="383ac947-e1b1-4f15-98a6-69fcc60e0ac1" containerID="b7734316fc145363037328cab9f126d7c1da55c60bd2f7c56f716841be40429c" exitCode=0 Mar 08 19:44:02 crc kubenswrapper[4885]: I0308 19:44:02.724575 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549984-4fjvc" event={"ID":"383ac947-e1b1-4f15-98a6-69fcc60e0ac1","Type":"ContainerDied","Data":"b7734316fc145363037328cab9f126d7c1da55c60bd2f7c56f716841be40429c"} Mar 08 19:44:04 crc kubenswrapper[4885]: I0308 19:44:04.051582 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549984-4fjvc" Mar 08 19:44:04 crc kubenswrapper[4885]: I0308 19:44:04.215093 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjzrg\" (UniqueName: \"kubernetes.io/projected/383ac947-e1b1-4f15-98a6-69fcc60e0ac1-kube-api-access-mjzrg\") pod \"383ac947-e1b1-4f15-98a6-69fcc60e0ac1\" (UID: \"383ac947-e1b1-4f15-98a6-69fcc60e0ac1\") " Mar 08 19:44:04 crc kubenswrapper[4885]: I0308 19:44:04.223731 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383ac947-e1b1-4f15-98a6-69fcc60e0ac1-kube-api-access-mjzrg" (OuterVolumeSpecName: "kube-api-access-mjzrg") pod "383ac947-e1b1-4f15-98a6-69fcc60e0ac1" (UID: "383ac947-e1b1-4f15-98a6-69fcc60e0ac1"). InnerVolumeSpecName "kube-api-access-mjzrg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:44:04 crc kubenswrapper[4885]: I0308 19:44:04.316631 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjzrg\" (UniqueName: \"kubernetes.io/projected/383ac947-e1b1-4f15-98a6-69fcc60e0ac1-kube-api-access-mjzrg\") on node \"crc\" DevicePath \"\"" Mar 08 19:44:04 crc kubenswrapper[4885]: I0308 19:44:04.739452 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549984-4fjvc" event={"ID":"383ac947-e1b1-4f15-98a6-69fcc60e0ac1","Type":"ContainerDied","Data":"94961ba79d03d33e7b40a64d757b8b45e77ba4e1d9fd1b33e83c080d4e80b3b6"} Mar 08 19:44:04 crc kubenswrapper[4885]: I0308 19:44:04.739547 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94961ba79d03d33e7b40a64d757b8b45e77ba4e1d9fd1b33e83c080d4e80b3b6" Mar 08 19:44:04 crc kubenswrapper[4885]: I0308 19:44:04.739567 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549984-4fjvc" Mar 08 19:44:05 crc kubenswrapper[4885]: I0308 19:44:05.139136 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549978-9wjmp"] Mar 08 19:44:05 crc kubenswrapper[4885]: I0308 19:44:05.141947 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549978-9wjmp"] Mar 08 19:44:05 crc kubenswrapper[4885]: I0308 19:44:05.376216 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71a3a17a-ffa4-4d31-94fb-7e720297e94b" path="/var/lib/kubelet/pods/71a3a17a-ffa4-4d31-94fb-7e720297e94b/volumes" Mar 08 19:44:45 crc kubenswrapper[4885]: I0308 19:44:45.722629 4885 scope.go:117] "RemoveContainer" containerID="2aa80e241984cb33e73f4238b00c6079576ad160fd6e5654000fab91ecb22f03" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.558532 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-psfrk"] Mar 08 19:44:58 crc kubenswrapper[4885]: E0308 19:44:58.560020 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383ac947-e1b1-4f15-98a6-69fcc60e0ac1" containerName="oc" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.560046 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="383ac947-e1b1-4f15-98a6-69fcc60e0ac1" containerName="oc" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.560227 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="383ac947-e1b1-4f15-98a6-69fcc60e0ac1" containerName="oc" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.561577 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.566144 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.566484 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.567438 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.570590 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-psfrk"] Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.572276 4885 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-lm6tv" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.729796 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-966qv\" (UniqueName: \"kubernetes.io/projected/1a3c0554-f8ec-4a68-a332-1eba738b28c6-kube-api-access-966qv\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.729961 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1a3c0554-f8ec-4a68-a332-1eba738b28c6-crc-storage\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.730009 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1a3c0554-f8ec-4a68-a332-1eba738b28c6-node-mnt\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.831173 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-966qv\" (UniqueName: \"kubernetes.io/projected/1a3c0554-f8ec-4a68-a332-1eba738b28c6-kube-api-access-966qv\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.831296 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1a3c0554-f8ec-4a68-a332-1eba738b28c6-crc-storage\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.831390 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1a3c0554-f8ec-4a68-a332-1eba738b28c6-node-mnt\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.831831 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1a3c0554-f8ec-4a68-a332-1eba738b28c6-node-mnt\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " 
pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.832677 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1a3c0554-f8ec-4a68-a332-1eba738b28c6-crc-storage\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.864510 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-966qv\" (UniqueName: \"kubernetes.io/projected/1a3c0554-f8ec-4a68-a332-1eba738b28c6-kube-api-access-966qv\") pod \"crc-storage-crc-psfrk\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:58 crc kubenswrapper[4885]: I0308 19:44:58.891298 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:44:59 crc kubenswrapper[4885]: I0308 19:44:59.159475 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-psfrk"] Mar 08 19:44:59 crc kubenswrapper[4885]: W0308 19:44:59.168186 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a3c0554_f8ec_4a68_a332_1eba738b28c6.slice/crio-a30897b0184cd449ad2a100102d2814d83f7adc09bfa8d6fcda4ca69150d3015 WatchSource:0}: Error finding container a30897b0184cd449ad2a100102d2814d83f7adc09bfa8d6fcda4ca69150d3015: Status 404 returned error can't find the container with id a30897b0184cd449ad2a100102d2814d83f7adc09bfa8d6fcda4ca69150d3015 Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.114540 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-psfrk" event={"ID":"1a3c0554-f8ec-4a68-a332-1eba738b28c6","Type":"ContainerStarted","Data":"a30897b0184cd449ad2a100102d2814d83f7adc09bfa8d6fcda4ca69150d3015"} Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.143290 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn"] Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.144124 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.147593 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.148955 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tdlp\" (UniqueName: \"kubernetes.io/projected/f8673a65-b7c8-4c06-9713-a095b399358a-kube-api-access-6tdlp\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.149046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8673a65-b7c8-4c06-9713-a095b399358a-config-volume\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.149081 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8673a65-b7c8-4c06-9713-a095b399358a-secret-volume\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.149879 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.167576 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn"] Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.250856 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tdlp\" (UniqueName: \"kubernetes.io/projected/f8673a65-b7c8-4c06-9713-a095b399358a-kube-api-access-6tdlp\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.250980 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8673a65-b7c8-4c06-9713-a095b399358a-config-volume\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.251020 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8673a65-b7c8-4c06-9713-a095b399358a-secret-volume\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.252547 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8673a65-b7c8-4c06-9713-a095b399358a-config-volume\") pod 
\"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.269827 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8673a65-b7c8-4c06-9713-a095b399358a-secret-volume\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.293578 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tdlp\" (UniqueName: \"kubernetes.io/projected/f8673a65-b7c8-4c06-9713-a095b399358a-kube-api-access-6tdlp\") pod \"collect-profiles-29549985-v28gn\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.477858 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:00 crc kubenswrapper[4885]: I0308 19:45:00.971164 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn"] Mar 08 19:45:01 crc kubenswrapper[4885]: W0308 19:45:01.010814 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8673a65_b7c8_4c06_9713_a095b399358a.slice/crio-c8a50777c8f2e044d397719dfc6ccd948194cce84622f71d2a5fc37a717a790b WatchSource:0}: Error finding container c8a50777c8f2e044d397719dfc6ccd948194cce84622f71d2a5fc37a717a790b: Status 404 returned error can't find the container with id c8a50777c8f2e044d397719dfc6ccd948194cce84622f71d2a5fc37a717a790b Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.127889 4885 generic.go:334] "Generic (PLEG): container finished" podID="1a3c0554-f8ec-4a68-a332-1eba738b28c6" containerID="b57ab7fa02bc0b3cc8cdcea97c2fd6abc762d9cf4cd62e7caa0369ec5c53eef8" exitCode=0 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.127996 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-psfrk" event={"ID":"1a3c0554-f8ec-4a68-a332-1eba738b28c6","Type":"ContainerDied","Data":"b57ab7fa02bc0b3cc8cdcea97c2fd6abc762d9cf4cd62e7caa0369ec5c53eef8"} Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.129361 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" event={"ID":"f8673a65-b7c8-4c06-9713-a095b399358a","Type":"ContainerStarted","Data":"c8a50777c8f2e044d397719dfc6ccd948194cce84622f71d2a5fc37a717a790b"} Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.590886 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bssfh"] Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.591662 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-controller" containerID="cri-o://f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0" gracePeriod=30 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.591725 4885 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="nbdb" containerID="cri-o://1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44" gracePeriod=30 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.591948 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="sbdb" containerID="cri-o://fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3" gracePeriod=30 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.592069 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-acl-logging" containerID="cri-o://9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6" gracePeriod=30 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.592115 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="northd" containerID="cri-o://379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220" gracePeriod=30 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.592065 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-node" containerID="cri-o://409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138" gracePeriod=30 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.592162 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680" gracePeriod=30 Mar 08 19:45:01 crc kubenswrapper[4885]: I0308 19:45:01.646215 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" containerID="cri-o://9417467a958808ec76e7baaf3e912528258fa08f33b991a9a656c8f2699dfe08" gracePeriod=30 Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.138479 4885 generic.go:334] "Generic (PLEG): container finished" podID="f8673a65-b7c8-4c06-9713-a095b399358a" containerID="dc7b1fe292df06f58ac62305ed639526799d6857e418c3744dffefa96ddd2209" exitCode=0 Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.138647 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" event={"ID":"f8673a65-b7c8-4c06-9713-a095b399358a","Type":"ContainerDied","Data":"dc7b1fe292df06f58ac62305ed639526799d6857e418c3744dffefa96ddd2209"} Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.143284 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovnkube-controller/3.log" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.148074 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovn-acl-logging/0.log" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.149315 
4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovn-controller/0.log" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150023 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="9417467a958808ec76e7baaf3e912528258fa08f33b991a9a656c8f2699dfe08" exitCode=0 Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150073 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3" exitCode=0 Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150091 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44" exitCode=0 Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150115 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220" exitCode=0 Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150134 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680" exitCode=0 Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150152 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138" exitCode=0 Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150149 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"9417467a958808ec76e7baaf3e912528258fa08f33b991a9a656c8f2699dfe08"} Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150222 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3"} Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150250 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44"} Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150269 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220"} Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150287 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680"} Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150308 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" 
event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138"} Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150312 4885 scope.go:117] "RemoveContainer" containerID="9f043e81314100644c599fce66b01e01b90ee2fe3874fc218276f06da4960000" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150326 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6"} Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150173 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6" exitCode=143 Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150451 4885 generic.go:334] "Generic (PLEG): container finished" podID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerID="f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0" exitCode=143 Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.150555 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0"} Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.157982 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/2.log" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.159507 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/1.log" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.159900 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerDied","Data":"47b9aa6e943174d2f8819d017007c51f3809d8a8e2d7a64900f1aa71bf065584"} Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.160658 4885 generic.go:334] "Generic (PLEG): container finished" podID="9ac72c25-d3e6-4dda-8444-6cd4442af7e4" containerID="47b9aa6e943174d2f8819d017007c51f3809d8a8e2d7a64900f1aa71bf065584" exitCode=2 Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.165410 4885 scope.go:117] "RemoveContainer" containerID="47b9aa6e943174d2f8819d017007c51f3809d8a8e2d7a64900f1aa71bf065584" Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.165989 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-ff7b4_openshift-multus(9ac72c25-d3e6-4dda-8444-6cd4442af7e4)\"" pod="openshift-multus/multus-ff7b4" podUID="9ac72c25-d3e6-4dda-8444-6cd4442af7e4" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.189187 4885 scope.go:117] "RemoveContainer" containerID="f53ba9b230726719c58ea3ea7e032b8e5d98514215e8cae118756b072c5d1a4d" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.238014 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.367831 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovn-acl-logging/0.log" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.368598 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovn-controller/0.log" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.369290 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.383766 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-966qv\" (UniqueName: \"kubernetes.io/projected/1a3c0554-f8ec-4a68-a332-1eba738b28c6-kube-api-access-966qv\") pod \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.383911 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1a3c0554-f8ec-4a68-a332-1eba738b28c6-crc-storage\") pod \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.384062 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1a3c0554-f8ec-4a68-a332-1eba738b28c6-node-mnt\") pod \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\" (UID: \"1a3c0554-f8ec-4a68-a332-1eba738b28c6\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.384141 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a3c0554-f8ec-4a68-a332-1eba738b28c6-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "1a3c0554-f8ec-4a68-a332-1eba738b28c6" (UID: "1a3c0554-f8ec-4a68-a332-1eba738b28c6"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.384431 4885 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/1a3c0554-f8ec-4a68-a332-1eba738b28c6-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.391081 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a3c0554-f8ec-4a68-a332-1eba738b28c6-kube-api-access-966qv" (OuterVolumeSpecName: "kube-api-access-966qv") pod "1a3c0554-f8ec-4a68-a332-1eba738b28c6" (UID: "1a3c0554-f8ec-4a68-a332-1eba738b28c6"). InnerVolumeSpecName "kube-api-access-966qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.414266 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a3c0554-f8ec-4a68-a332-1eba738b28c6-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "1a3c0554-f8ec-4a68-a332-1eba738b28c6" (UID: "1a3c0554-f8ec-4a68-a332-1eba738b28c6"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.436649 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rbdmt"] Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.436907 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-node" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.436948 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-node" Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.436959 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.436967 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.436976 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a3c0554-f8ec-4a68-a332-1eba738b28c6" containerName="storage" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.436984 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a3c0554-f8ec-4a68-a332-1eba738b28c6" containerName="storage" Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.436998 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kubecfg-setup" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437006 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kubecfg-setup" Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437018 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437026 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437037 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="northd" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437045 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="northd" Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437054 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="nbdb" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437062 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="nbdb" Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437076 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-ovn-metrics" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437085 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-ovn-metrics" Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437096 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-acl-logging" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437103 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-acl-logging" Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437114 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437122 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437136 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437144 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437154 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="sbdb" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437162 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="sbdb" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437295 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437311 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="sbdb" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437323 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a3c0554-f8ec-4a68-a332-1eba738b28c6" containerName="storage" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437333 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-node" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437343 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437352 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovn-acl-logging" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437365 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437374 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="kube-rbac-proxy-ovn-metrics" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437382 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="northd" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437393 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="nbdb" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437402 4885 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437413 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437518 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437528 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437649 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: E0308 19:45:02.437752 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.437762 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" containerName="ovnkube-controller" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.439666 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485519 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-ovn\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485623 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-etc-openvswitch\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485685 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-var-lib-cni-networks-ovn-kubernetes\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485762 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485768 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485836 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485856 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-log-socket\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.485998 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-log-socket" (OuterVolumeSpecName: "log-socket") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486362 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-ovn-kubernetes\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486466 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-env-overrides\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486503 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486542 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovn-node-metrics-cert\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486595 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-netns\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486652 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-systemd\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486681 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486719 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-netd\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486769 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-openvswitch\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486826 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-bin\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486875 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-kubelet\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.486983 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-node-log\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487047 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-script-lib\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487127 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487140 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487172 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487190 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-node-log" (OuterVolumeSpecName: "node-log") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487127 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487265 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-config\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487410 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-systemd-units\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487465 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-slash\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487536 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-var-lib-openvswitch\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487584 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mlvt\" (UniqueName: \"kubernetes.io/projected/dedec2a4-d864-4f30-8a2d-b3168817ea34-kube-api-access-5mlvt\") pod \"dedec2a4-d864-4f30-8a2d-b3168817ea34\" (UID: \"dedec2a4-d864-4f30-8a2d-b3168817ea34\") " Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487535 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487739 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487749 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-slash\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487807 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovnkube-script-lib\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487745 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-slash" (OuterVolumeSpecName: "host-slash") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487791 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487844 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487888 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovn-node-metrics-cert\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.487990 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-cni-netd\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488036 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-systemd-units\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488071 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-systemd\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488106 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52khv\" (UniqueName: \"kubernetes.io/projected/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-kube-api-access-52khv\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488195 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488219 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-kubelet\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488423 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-log-socket\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488524 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488809 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-ovn\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.488985 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-env-overrides\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.489122 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-var-lib-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.489395 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-node-log\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.489514 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovnkube-config\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.489631 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-run-netns\") pod \"ovnkube-node-rbdmt\" 
(UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.489731 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-cni-bin\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.489831 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-etc-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490073 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-run-ovn-kubernetes\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490154 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490273 4885 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490343 4885 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/1a3c0554-f8ec-4a68-a332-1eba738b28c6-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490411 4885 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490479 4885 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490553 4885 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490636 4885 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490705 4885 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490774 4885 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-node-log\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490843 4885 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490943 4885 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491043 4885 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491123 4885 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-slash\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491193 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-966qv\" (UniqueName: \"kubernetes.io/projected/1a3c0554-f8ec-4a68-a332-1eba738b28c6-kube-api-access-966qv\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491268 4885 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491346 4885 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491415 4885 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491500 4885 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491647 4885 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-log-socket\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.491726 4885 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.490742 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.492396 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dedec2a4-d864-4f30-8a2d-b3168817ea34-kube-api-access-5mlvt" (OuterVolumeSpecName: "kube-api-access-5mlvt") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "kube-api-access-5mlvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.511650 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "dedec2a4-d864-4f30-8a2d-b3168817ea34" (UID: "dedec2a4-d864-4f30-8a2d-b3168817ea34"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593180 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-slash\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593244 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovnkube-script-lib\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593315 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovn-node-metrics-cert\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593352 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-cni-netd\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593391 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-systemd-units\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593425 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-systemd\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc 
kubenswrapper[4885]: I0308 19:45:02.593458 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52khv\" (UniqueName: \"kubernetes.io/projected/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-kube-api-access-52khv\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593492 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-kubelet\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593532 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-log-socket\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593562 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593591 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-ovn\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593627 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-env-overrides\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593663 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-var-lib-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593714 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-node-log\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593744 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovnkube-config\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593772 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-run-netns\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593805 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-cni-bin\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593849 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-etc-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593879 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-run-ovn-kubernetes\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.593954 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.594022 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mlvt\" (UniqueName: \"kubernetes.io/projected/dedec2a4-d864-4f30-8a2d-b3168817ea34-kube-api-access-5mlvt\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.594045 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dedec2a4-d864-4f30-8a2d-b3168817ea34-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.594065 4885 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dedec2a4-d864-4f30-8a2d-b3168817ea34-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.594141 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.594230 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-slash\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.594301 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.594319 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-ovn\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595358 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-env-overrides\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595427 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-var-lib-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595473 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-node-log\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595615 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovnkube-script-lib\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595810 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-log-socket\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595821 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-kubelet\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595885 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-cni-bin\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595815 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-cni-netd\") pod 
\"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595948 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-systemd-units\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595986 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-etc-openvswitch\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.595984 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-run-netns\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.596011 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-host-run-ovn-kubernetes\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.596314 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovnkube-config\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.596544 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-run-systemd\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.601479 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-ovn-node-metrics-cert\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.627044 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52khv\" (UniqueName: \"kubernetes.io/projected/003d9fa9-c0c3-4af8-bdb0-084620b29ae0-kube-api-access-52khv\") pod \"ovnkube-node-rbdmt\" (UID: \"003d9fa9-c0c3-4af8-bdb0-084620b29ae0\") " pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: I0308 19:45:02.756282 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:02 crc kubenswrapper[4885]: W0308 19:45:02.789634 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod003d9fa9_c0c3_4af8_bdb0_084620b29ae0.slice/crio-ef7df8ae2e88cb7722a7e79b23bb9ffe544a42c65d853e531de3a9badab52555 WatchSource:0}: Error finding container ef7df8ae2e88cb7722a7e79b23bb9ffe544a42c65d853e531de3a9badab52555: Status 404 returned error can't find the container with id ef7df8ae2e88cb7722a7e79b23bb9ffe544a42c65d853e531de3a9badab52555 Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.171288 4885 generic.go:334] "Generic (PLEG): container finished" podID="003d9fa9-c0c3-4af8-bdb0-084620b29ae0" containerID="2eb629dd73ff88d9831cd476b65ccd4903a995f4f9b25e669a7928d96bd36e93" exitCode=0 Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.173212 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerDied","Data":"2eb629dd73ff88d9831cd476b65ccd4903a995f4f9b25e669a7928d96bd36e93"} Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.173440 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"ef7df8ae2e88cb7722a7e79b23bb9ffe544a42c65d853e531de3a9badab52555"} Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.178042 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovn-acl-logging/0.log" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.179903 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bssfh_dedec2a4-d864-4f30-8a2d-b3168817ea34/ovn-controller/0.log" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.180598 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" event={"ID":"dedec2a4-d864-4f30-8a2d-b3168817ea34","Type":"ContainerDied","Data":"54661ff92d86d446f0561f70be37d97fffc952cd7edc4f3f4e212f70264f4183"} Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.180650 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bssfh" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.180658 4885 scope.go:117] "RemoveContainer" containerID="9417467a958808ec76e7baaf3e912528258fa08f33b991a9a656c8f2699dfe08" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.184469 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-psfrk" event={"ID":"1a3c0554-f8ec-4a68-a332-1eba738b28c6","Type":"ContainerDied","Data":"a30897b0184cd449ad2a100102d2814d83f7adc09bfa8d6fcda4ca69150d3015"} Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.184526 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a30897b0184cd449ad2a100102d2814d83f7adc09bfa8d6fcda4ca69150d3015" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.184528 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-psfrk" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.190515 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/2.log" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.214314 4885 scope.go:117] "RemoveContainer" containerID="fae0ea20454d7107e00766bc7266fff6fb6323c232cc48b7192ca3ac398f2ae3" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.248199 4885 scope.go:117] "RemoveContainer" containerID="1ebd308cd22c54151b20226c0f5227e9b9a2ef82c695000ebb141dc79885ef44" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.249718 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bssfh"] Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.256958 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bssfh"] Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.279488 4885 scope.go:117] "RemoveContainer" containerID="379bbb387f3cb0a1465cffcbe34d8a3afd7f3474eb22876f9655eb65c4ef0220" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.280445 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.306405 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8673a65-b7c8-4c06-9713-a095b399358a-secret-volume\") pod \"f8673a65-b7c8-4c06-9713-a095b399358a\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.306484 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8673a65-b7c8-4c06-9713-a095b399358a-config-volume\") pod \"f8673a65-b7c8-4c06-9713-a095b399358a\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.306517 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tdlp\" (UniqueName: \"kubernetes.io/projected/f8673a65-b7c8-4c06-9713-a095b399358a-kube-api-access-6tdlp\") pod \"f8673a65-b7c8-4c06-9713-a095b399358a\" (UID: \"f8673a65-b7c8-4c06-9713-a095b399358a\") " Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.310860 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8673a65-b7c8-4c06-9713-a095b399358a-kube-api-access-6tdlp" (OuterVolumeSpecName: "kube-api-access-6tdlp") pod "f8673a65-b7c8-4c06-9713-a095b399358a" (UID: "f8673a65-b7c8-4c06-9713-a095b399358a"). InnerVolumeSpecName "kube-api-access-6tdlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.311430 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8673a65-b7c8-4c06-9713-a095b399358a-config-volume" (OuterVolumeSpecName: "config-volume") pod "f8673a65-b7c8-4c06-9713-a095b399358a" (UID: "f8673a65-b7c8-4c06-9713-a095b399358a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.317850 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8673a65-b7c8-4c06-9713-a095b399358a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f8673a65-b7c8-4c06-9713-a095b399358a" (UID: "f8673a65-b7c8-4c06-9713-a095b399358a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.363120 4885 scope.go:117] "RemoveContainer" containerID="ca03b46f0704c5711323cac4942a6ff92fdd0f48250760576c09007579d14680" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.376000 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dedec2a4-d864-4f30-8a2d-b3168817ea34" path="/var/lib/kubelet/pods/dedec2a4-d864-4f30-8a2d-b3168817ea34/volumes" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.379284 4885 scope.go:117] "RemoveContainer" containerID="409e2462dc254e5553c528021c331ab1b5eadd2e4bbca0f92e986993ee89a138" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.393965 4885 scope.go:117] "RemoveContainer" containerID="9511c6d1db24b09efbebb8a3191e99098cb5ec4dfb095d2b2a01b749cefa47f6" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.408140 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8673a65-b7c8-4c06-9713-a095b399358a-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.408168 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tdlp\" (UniqueName: \"kubernetes.io/projected/f8673a65-b7c8-4c06-9713-a095b399358a-kube-api-access-6tdlp\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.408182 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f8673a65-b7c8-4c06-9713-a095b399358a-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.427739 4885 scope.go:117] "RemoveContainer" containerID="f19e177e5556150de52d3a96b9e2aa475ceb606dd6594a1c3b50327a576d9ea0" Mar 08 19:45:03 crc kubenswrapper[4885]: I0308 19:45:03.451111 4885 scope.go:117] "RemoveContainer" containerID="ebc92e26eabd6884255b032466edc7808f8a326feb450745d5279581b270c73e" Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.351153 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"e8f3eb63d0193da86832075983e64d83e02a6d9c9f4aeb44f8631f8ffd3988ad"} Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.351533 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"0891820ae75278e337fd3cc610e870e88fee31edc84e52dbed049bf24b17cacf"} Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.351554 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"73f370b34d05e9b11ea5ea3cf54e2bf119977d2c88048fd57601e3135cce7a88"} Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.351571 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"cf61f26d04d904050a733c6b1a9569445872afce269be85469582bf7f7e99351"} Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.351588 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"5185e11c0691c4782a4c0ff035ead7be04829de0c8dce43b90328cce5133e774"} Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.351604 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"8614aa3e5915b1864cd05579f93795d5c9d9fe400c04ae5562d37b8fb3e57dca"} Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.353509 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" event={"ID":"f8673a65-b7c8-4c06-9713-a095b399358a","Type":"ContainerDied","Data":"c8a50777c8f2e044d397719dfc6ccd948194cce84622f71d2a5fc37a717a790b"} Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.353544 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8a50777c8f2e044d397719dfc6ccd948194cce84622f71d2a5fc37a717a790b" Mar 08 19:45:04 crc kubenswrapper[4885]: I0308 19:45:04.353607 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn" Mar 08 19:45:06 crc kubenswrapper[4885]: I0308 19:45:06.372993 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"427ded321330b2ecdaf11f8ec9439d128083d97f20ecd6bedcbc5c34c31bd54f"} Mar 08 19:45:09 crc kubenswrapper[4885]: I0308 19:45:09.395954 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" event={"ID":"003d9fa9-c0c3-4af8-bdb0-084620b29ae0","Type":"ContainerStarted","Data":"7e3f56d51aaeefec1253d6bb79a15a4262dc2f53f1f13c78f069a85fe97dfd47"} Mar 08 19:45:09 crc kubenswrapper[4885]: I0308 19:45:09.396492 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:09 crc kubenswrapper[4885]: I0308 19:45:09.396509 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:09 crc kubenswrapper[4885]: I0308 19:45:09.396523 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:09 crc kubenswrapper[4885]: I0308 19:45:09.429137 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:09 crc kubenswrapper[4885]: I0308 19:45:09.436444 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:09 crc kubenswrapper[4885]: I0308 19:45:09.441422 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" podStartSLOduration=7.441403525 podStartE2EDuration="7.441403525s" podCreationTimestamp="2026-03-08 19:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:45:09.438181639 +0000 UTC m=+810.834235672" watchObservedRunningTime="2026-03-08 19:45:09.441403525 +0000 UTC m=+810.837457548" Mar 08 19:45:10 crc kubenswrapper[4885]: I0308 19:45:10.890731 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf"] Mar 08 19:45:10 crc kubenswrapper[4885]: E0308 19:45:10.891483 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8673a65-b7c8-4c06-9713-a095b399358a" containerName="collect-profiles" Mar 08 19:45:10 crc kubenswrapper[4885]: I0308 19:45:10.891505 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8673a65-b7c8-4c06-9713-a095b399358a" containerName="collect-profiles" Mar 08 19:45:10 crc kubenswrapper[4885]: I0308 19:45:10.891671 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8673a65-b7c8-4c06-9713-a095b399358a" containerName="collect-profiles" Mar 08 19:45:10 crc kubenswrapper[4885]: I0308 19:45:10.893081 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:10 crc kubenswrapper[4885]: I0308 19:45:10.903863 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 08 19:45:10 crc kubenswrapper[4885]: I0308 19:45:10.904269 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf"] Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.003955 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.004007 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhgwp\" (UniqueName: \"kubernetes.io/projected/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-kube-api-access-xhgwp\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.004058 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.104851 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 
08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.104939 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.104976 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhgwp\" (UniqueName: \"kubernetes.io/projected/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-kube-api-access-xhgwp\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.105747 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.106654 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.125996 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhgwp\" (UniqueName: \"kubernetes.io/projected/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-kube-api-access-xhgwp\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.212427 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.249094 4885 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(bde0de45022d463e84a7bb31175935b344a221d5d6de8236d3179236040ec8c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.249253 4885 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(bde0de45022d463e84a7bb31175935b344a221d5d6de8236d3179236040ec8c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.249320 4885 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(bde0de45022d463e84a7bb31175935b344a221d5d6de8236d3179236040ec8c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.249429 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace(1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace(1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(bde0de45022d463e84a7bb31175935b344a221d5d6de8236d3179236040ec8c2): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.408724 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: I0308 19:45:11.409835 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.441874 4885 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(9ef87c915fa858881076d675c7a2399f972d37954a1c9df6a93f8cdb9bf1627b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.442229 4885 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(9ef87c915fa858881076d675c7a2399f972d37954a1c9df6a93f8cdb9bf1627b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.442256 4885 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(9ef87c915fa858881076d675c7a2399f972d37954a1c9df6a93f8cdb9bf1627b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:11 crc kubenswrapper[4885]: E0308 19:45:11.442312 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace(1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace(1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_openshift-marketplace_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67_0(9ef87c915fa858881076d675c7a2399f972d37954a1c9df6a93f8cdb9bf1627b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" Mar 08 19:45:13 crc kubenswrapper[4885]: I0308 19:45:13.368390 4885 scope.go:117] "RemoveContainer" containerID="47b9aa6e943174d2f8819d017007c51f3809d8a8e2d7a64900f1aa71bf065584" Mar 08 19:45:14 crc kubenswrapper[4885]: I0308 19:45:14.429164 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ff7b4_9ac72c25-d3e6-4dda-8444-6cd4442af7e4/kube-multus/2.log" Mar 08 19:45:14 crc kubenswrapper[4885]: I0308 19:45:14.429419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ff7b4" event={"ID":"9ac72c25-d3e6-4dda-8444-6cd4442af7e4","Type":"ContainerStarted","Data":"4b86a03c5c9a206f4ea2240f652d8bef1ba716966723ee8fc5b64f7d118486a7"} Mar 08 19:45:22 crc kubenswrapper[4885]: I0308 19:45:22.367641 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:22 crc kubenswrapper[4885]: I0308 19:45:22.368965 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:22 crc kubenswrapper[4885]: I0308 19:45:22.700647 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf"] Mar 08 19:45:23 crc kubenswrapper[4885]: I0308 19:45:23.493525 4885 generic.go:334] "Generic (PLEG): container finished" podID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerID="86f9b94c866e293c0097cd26686dd912dc8dd9a05680cd99bc3c03cd64187c52" exitCode=0 Mar 08 19:45:23 crc kubenswrapper[4885]: I0308 19:45:23.493623 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" event={"ID":"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67","Type":"ContainerDied","Data":"86f9b94c866e293c0097cd26686dd912dc8dd9a05680cd99bc3c03cd64187c52"} Mar 08 19:45:23 crc kubenswrapper[4885]: I0308 19:45:23.494004 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" event={"ID":"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67","Type":"ContainerStarted","Data":"0fb3f200d30b6d6c3cbcc1fd09274bd1a2462686041d07008c77385970149b91"} Mar 08 19:45:26 crc kubenswrapper[4885]: I0308 19:45:26.514739 4885 generic.go:334] "Generic (PLEG): container finished" podID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerID="9786fb7b6a4f4d2ecb4e516099038ab04c52aa4d7b996237f7371a71c20dedb8" exitCode=0 Mar 08 19:45:26 crc kubenswrapper[4885]: I0308 19:45:26.514909 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" event={"ID":"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67","Type":"ContainerDied","Data":"9786fb7b6a4f4d2ecb4e516099038ab04c52aa4d7b996237f7371a71c20dedb8"} Mar 08 19:45:27 crc kubenswrapper[4885]: I0308 19:45:27.527076 4885 generic.go:334] "Generic (PLEG): container finished" podID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerID="699eeb7926e7079f8572a448098dff2cc1160bab7c53f37413cb5b08942c6c7e" exitCode=0 Mar 08 19:45:27 crc kubenswrapper[4885]: I0308 19:45:27.527138 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" event={"ID":"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67","Type":"ContainerDied","Data":"699eeb7926e7079f8572a448098dff2cc1160bab7c53f37413cb5b08942c6c7e"} Mar 08 19:45:28 crc kubenswrapper[4885]: I0308 19:45:28.826668 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:28 crc kubenswrapper[4885]: I0308 19:45:28.900059 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-util\") pod \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " Mar 08 19:45:28 crc kubenswrapper[4885]: I0308 19:45:28.900156 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-bundle\") pod \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " Mar 08 19:45:28 crc kubenswrapper[4885]: I0308 19:45:28.900208 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhgwp\" (UniqueName: \"kubernetes.io/projected/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-kube-api-access-xhgwp\") pod \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\" (UID: \"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67\") " Mar 08 19:45:28 crc kubenswrapper[4885]: I0308 19:45:28.901587 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-bundle" (OuterVolumeSpecName: "bundle") pod "1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" (UID: "1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:45:28 crc kubenswrapper[4885]: I0308 19:45:28.905692 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-kube-api-access-xhgwp" (OuterVolumeSpecName: "kube-api-access-xhgwp") pod "1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" (UID: "1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67"). InnerVolumeSpecName "kube-api-access-xhgwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:45:28 crc kubenswrapper[4885]: I0308 19:45:28.912779 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-util" (OuterVolumeSpecName: "util") pod "1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" (UID: "1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:45:29 crc kubenswrapper[4885]: I0308 19:45:29.001831 4885 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-util\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:29 crc kubenswrapper[4885]: I0308 19:45:29.001869 4885 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:29 crc kubenswrapper[4885]: I0308 19:45:29.001883 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhgwp\" (UniqueName: \"kubernetes.io/projected/1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67-kube-api-access-xhgwp\") on node \"crc\" DevicePath \"\"" Mar 08 19:45:29 crc kubenswrapper[4885]: I0308 19:45:29.544904 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" event={"ID":"1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67","Type":"ContainerDied","Data":"0fb3f200d30b6d6c3cbcc1fd09274bd1a2462686041d07008c77385970149b91"} Mar 08 19:45:29 crc kubenswrapper[4885]: I0308 19:45:29.544984 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fb3f200d30b6d6c3cbcc1fd09274bd1a2462686041d07008c77385970149b91" Mar 08 19:45:29 crc kubenswrapper[4885]: I0308 19:45:29.545062 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf" Mar 08 19:45:30 crc kubenswrapper[4885]: I0308 19:45:30.443141 4885 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.612695 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk"] Mar 08 19:45:32 crc kubenswrapper[4885]: E0308 19:45:32.613017 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerName="util" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.613035 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerName="util" Mar 08 19:45:32 crc kubenswrapper[4885]: E0308 19:45:32.613052 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerName="pull" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.613062 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerName="pull" Mar 08 19:45:32 crc kubenswrapper[4885]: E0308 19:45:32.613085 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerName="extract" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.613096 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerName="extract" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.613260 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67" containerName="extract" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.613805 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.615857 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.616147 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-rn8k8" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.616584 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.630695 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk"] Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.776573 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkttv\" (UniqueName: \"kubernetes.io/projected/02d2b43e-55f4-49f1-9bb1-3e70ed22a3da-kube-api-access-dkttv\") pod \"nmstate-operator-75c5dccd6c-5twjk\" (UID: \"02d2b43e-55f4-49f1-9bb1-3e70ed22a3da\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.783503 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rbdmt" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.878128 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkttv\" (UniqueName: \"kubernetes.io/projected/02d2b43e-55f4-49f1-9bb1-3e70ed22a3da-kube-api-access-dkttv\") pod \"nmstate-operator-75c5dccd6c-5twjk\" (UID: \"02d2b43e-55f4-49f1-9bb1-3e70ed22a3da\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.901290 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkttv\" (UniqueName: \"kubernetes.io/projected/02d2b43e-55f4-49f1-9bb1-3e70ed22a3da-kube-api-access-dkttv\") pod \"nmstate-operator-75c5dccd6c-5twjk\" (UID: \"02d2b43e-55f4-49f1-9bb1-3e70ed22a3da\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" Mar 08 19:45:32 crc kubenswrapper[4885]: I0308 19:45:32.930733 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" Mar 08 19:45:33 crc kubenswrapper[4885]: I0308 19:45:33.166512 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk"] Mar 08 19:45:33 crc kubenswrapper[4885]: W0308 19:45:33.173145 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02d2b43e_55f4_49f1_9bb1_3e70ed22a3da.slice/crio-4a40e8045dd740aa28bc93465b1c858c9e89054bcbc83189308dc2ae0a103d67 WatchSource:0}: Error finding container 4a40e8045dd740aa28bc93465b1c858c9e89054bcbc83189308dc2ae0a103d67: Status 404 returned error can't find the container with id 4a40e8045dd740aa28bc93465b1c858c9e89054bcbc83189308dc2ae0a103d67 Mar 08 19:45:33 crc kubenswrapper[4885]: I0308 19:45:33.567645 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" event={"ID":"02d2b43e-55f4-49f1-9bb1-3e70ed22a3da","Type":"ContainerStarted","Data":"4a40e8045dd740aa28bc93465b1c858c9e89054bcbc83189308dc2ae0a103d67"} Mar 08 19:45:36 crc kubenswrapper[4885]: I0308 19:45:36.589901 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" event={"ID":"02d2b43e-55f4-49f1-9bb1-3e70ed22a3da","Type":"ContainerStarted","Data":"ad29a575fad32bb21e1014d36c3e961dfaf8b0fc12bb6247f70e49cb2f02cd9f"} Mar 08 19:45:36 crc kubenswrapper[4885]: I0308 19:45:36.621778 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-5twjk" podStartSLOduration=2.282291635 podStartE2EDuration="4.621753366s" podCreationTimestamp="2026-03-08 19:45:32 +0000 UTC" firstStartedPulling="2026-03-08 19:45:33.175623767 +0000 UTC m=+834.571677800" lastFinishedPulling="2026-03-08 19:45:35.515085498 +0000 UTC m=+836.911139531" observedRunningTime="2026-03-08 19:45:36.621716955 +0000 UTC m=+838.017771008" watchObservedRunningTime="2026-03-08 19:45:36.621753366 +0000 UTC m=+838.017807419" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.345021 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-wsk7q"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.346905 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.357818 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-8pvzf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.378248 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.379050 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.383125 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.383379 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-6m2b5"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.384388 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.392165 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-wsk7q"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.402603 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.486265 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487069 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487449 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tbh9\" (UniqueName: \"kubernetes.io/projected/75f588d1-7159-4a94-bf89-bb18a880a403-kube-api-access-7tbh9\") pod \"nmstate-metrics-69594cc75-wsk7q\" (UID: \"75f588d1-7159-4a94-bf89-bb18a880a403\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487513 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-ovs-socket\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487539 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3793d26a-a132-40db-b8fe-2cf83428b03c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-bb7k9\" (UID: \"3793d26a-a132-40db-b8fe-2cf83428b03c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487608 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c59z8\" (UniqueName: \"kubernetes.io/projected/74d96fe5-1ab9-4703-8717-509cf115d985-kube-api-access-c59z8\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487655 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9c7q\" (UniqueName: \"kubernetes.io/projected/3793d26a-a132-40db-b8fe-2cf83428b03c-kube-api-access-v9c7q\") pod \"nmstate-webhook-786f45cff4-bb7k9\" (UID: \"3793d26a-a132-40db-b8fe-2cf83428b03c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487674 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-dbus-socket\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.487699 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-nmstate-lock\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.489096 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-bcjwg" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.489308 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.489514 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.496043 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588500 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c59z8\" (UniqueName: \"kubernetes.io/projected/74d96fe5-1ab9-4703-8717-509cf115d985-kube-api-access-c59z8\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588575 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-dbus-socket\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588599 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9c7q\" (UniqueName: \"kubernetes.io/projected/3793d26a-a132-40db-b8fe-2cf83428b03c-kube-api-access-v9c7q\") pod \"nmstate-webhook-786f45cff4-bb7k9\" (UID: \"3793d26a-a132-40db-b8fe-2cf83428b03c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588623 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-nmstate-lock\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588653 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tbh9\" (UniqueName: \"kubernetes.io/projected/75f588d1-7159-4a94-bf89-bb18a880a403-kube-api-access-7tbh9\") pod \"nmstate-metrics-69594cc75-wsk7q\" (UID: \"75f588d1-7159-4a94-bf89-bb18a880a403\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588694 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-ovs-socket\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588717 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3793d26a-a132-40db-b8fe-2cf83428b03c-tls-key-pair\") pod 
\"nmstate-webhook-786f45cff4-bb7k9\" (UID: \"3793d26a-a132-40db-b8fe-2cf83428b03c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588744 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c548cbba-61a5-4167-b494-f57c45b1599b-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588767 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-677jt\" (UniqueName: \"kubernetes.io/projected/c548cbba-61a5-4167-b494-f57c45b1599b-kube-api-access-677jt\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.588792 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c548cbba-61a5-4167-b494-f57c45b1599b-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.589249 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-nmstate-lock\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.589293 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-ovs-socket\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.589349 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/74d96fe5-1ab9-4703-8717-509cf115d985-dbus-socket\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.602568 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3793d26a-a132-40db-b8fe-2cf83428b03c-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-bb7k9\" (UID: \"3793d26a-a132-40db-b8fe-2cf83428b03c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.610757 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c59z8\" (UniqueName: \"kubernetes.io/projected/74d96fe5-1ab9-4703-8717-509cf115d985-kube-api-access-c59z8\") pod \"nmstate-handler-6m2b5\" (UID: \"74d96fe5-1ab9-4703-8717-509cf115d985\") " pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.624787 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9c7q\" (UniqueName: 
\"kubernetes.io/projected/3793d26a-a132-40db-b8fe-2cf83428b03c-kube-api-access-v9c7q\") pod \"nmstate-webhook-786f45cff4-bb7k9\" (UID: \"3793d26a-a132-40db-b8fe-2cf83428b03c\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.637912 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tbh9\" (UniqueName: \"kubernetes.io/projected/75f588d1-7159-4a94-bf89-bb18a880a403-kube-api-access-7tbh9\") pod \"nmstate-metrics-69594cc75-wsk7q\" (UID: \"75f588d1-7159-4a94-bf89-bb18a880a403\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.670605 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b996dd7c9-w96bf"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.671226 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.671408 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.690528 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c548cbba-61a5-4167-b494-f57c45b1599b-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.690875 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-677jt\" (UniqueName: \"kubernetes.io/projected/c548cbba-61a5-4167-b494-f57c45b1599b-kube-api-access-677jt\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.690903 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c548cbba-61a5-4167-b494-f57c45b1599b-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.691337 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c548cbba-61a5-4167-b494-f57c45b1599b-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.695894 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b996dd7c9-w96bf"] Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.698699 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c548cbba-61a5-4167-b494-f57c45b1599b-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.700371 4885 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.710105 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.717914 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-677jt\" (UniqueName: \"kubernetes.io/projected/c548cbba-61a5-4167-b494-f57c45b1599b-kube-api-access-677jt\") pod \"nmstate-console-plugin-5dcbbd79cf-ftcgc\" (UID: \"c548cbba-61a5-4167-b494-f57c45b1599b\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.792344 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-service-ca\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.792389 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-serving-cert\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.792414 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-trusted-ca-bundle\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.792431 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-config\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.792467 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-oauth-serving-cert\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.792488 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-oauth-config\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.792511 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bntgs\" (UniqueName: \"kubernetes.io/projected/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-kube-api-access-bntgs\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " 
pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.800442 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.854278 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-wsk7q"] Mar 08 19:45:41 crc kubenswrapper[4885]: W0308 19:45:41.862466 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75f588d1_7159_4a94_bf89_bb18a880a403.slice/crio-e934b9f328e728caaf3cbe244241604c612c27f7f38001bbb32ce7d9f9447038 WatchSource:0}: Error finding container e934b9f328e728caaf3cbe244241604c612c27f7f38001bbb32ce7d9f9447038: Status 404 returned error can't find the container with id e934b9f328e728caaf3cbe244241604c612c27f7f38001bbb32ce7d9f9447038 Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.893676 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-oauth-serving-cert\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.893725 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-oauth-config\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.893754 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bntgs\" (UniqueName: \"kubernetes.io/projected/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-kube-api-access-bntgs\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.893788 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-service-ca\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.893812 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-serving-cert\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.893833 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-trusted-ca-bundle\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.893863 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-config\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.894540 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-oauth-serving-cert\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.894608 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-config\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.895450 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-trusted-ca-bundle\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.895624 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-service-ca\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.897662 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-serving-cert\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.906681 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-console-oauth-config\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.908223 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bntgs\" (UniqueName: \"kubernetes.io/projected/dee9a38c-b44c-44d6-ad0f-8c19401cc08d-kube-api-access-bntgs\") pod \"console-7b996dd7c9-w96bf\" (UID: \"dee9a38c-b44c-44d6-ad0f-8c19401cc08d\") " pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:41 crc kubenswrapper[4885]: I0308 19:45:41.976441 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc"] Mar 08 19:45:41 crc kubenswrapper[4885]: W0308 19:45:41.981066 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc548cbba_61a5_4167_b494_f57c45b1599b.slice/crio-c6c36de055931b5f8166200aca3cea3396709baaeaddcb6d0af48e870ea787f7 WatchSource:0}: Error finding container c6c36de055931b5f8166200aca3cea3396709baaeaddcb6d0af48e870ea787f7: Status 404 returned error can't find 
the container with id c6c36de055931b5f8166200aca3cea3396709baaeaddcb6d0af48e870ea787f7 Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.028637 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.160443 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9"] Mar 08 19:45:42 crc kubenswrapper[4885]: W0308 19:45:42.181033 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3793d26a_a132_40db_b8fe_2cf83428b03c.slice/crio-54efbc3d54a25476adda0f722699900849c837d23fbad25e75358284e72317b8 WatchSource:0}: Error finding container 54efbc3d54a25476adda0f722699900849c837d23fbad25e75358284e72317b8: Status 404 returned error can't find the container with id 54efbc3d54a25476adda0f722699900849c837d23fbad25e75358284e72317b8 Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.277690 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b996dd7c9-w96bf"] Mar 08 19:45:42 crc kubenswrapper[4885]: W0308 19:45:42.282466 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddee9a38c_b44c_44d6_ad0f_8c19401cc08d.slice/crio-a12f57b10208243806d9622c35319ed70cb4a69ce748c334d6cded98091d8fea WatchSource:0}: Error finding container a12f57b10208243806d9622c35319ed70cb4a69ce748c334d6cded98091d8fea: Status 404 returned error can't find the container with id a12f57b10208243806d9622c35319ed70cb4a69ce748c334d6cded98091d8fea Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.629710 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6m2b5" event={"ID":"74d96fe5-1ab9-4703-8717-509cf115d985","Type":"ContainerStarted","Data":"b5bc75ef30c06b536e813b54e231d1bbf2490f82f7b221eceef19ac55f58b961"} Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.631815 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" event={"ID":"c548cbba-61a5-4167-b494-f57c45b1599b","Type":"ContainerStarted","Data":"c6c36de055931b5f8166200aca3cea3396709baaeaddcb6d0af48e870ea787f7"} Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.632827 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" event={"ID":"3793d26a-a132-40db-b8fe-2cf83428b03c","Type":"ContainerStarted","Data":"54efbc3d54a25476adda0f722699900849c837d23fbad25e75358284e72317b8"} Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.634650 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b996dd7c9-w96bf" event={"ID":"dee9a38c-b44c-44d6-ad0f-8c19401cc08d","Type":"ContainerStarted","Data":"1bb2b1328439a08fc069609b03ebe18cdfa94ae68311eaba79b0162f2876367f"} Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.634681 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b996dd7c9-w96bf" event={"ID":"dee9a38c-b44c-44d6-ad0f-8c19401cc08d","Type":"ContainerStarted","Data":"a12f57b10208243806d9622c35319ed70cb4a69ce748c334d6cded98091d8fea"} Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.637202 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" 
event={"ID":"75f588d1-7159-4a94-bf89-bb18a880a403","Type":"ContainerStarted","Data":"e934b9f328e728caaf3cbe244241604c612c27f7f38001bbb32ce7d9f9447038"} Mar 08 19:45:42 crc kubenswrapper[4885]: I0308 19:45:42.663752 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b996dd7c9-w96bf" podStartSLOduration=1.663730787 podStartE2EDuration="1.663730787s" podCreationTimestamp="2026-03-08 19:45:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:45:42.66272401 +0000 UTC m=+844.058778103" watchObservedRunningTime="2026-03-08 19:45:42.663730787 +0000 UTC m=+844.059784820" Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.665685 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" event={"ID":"3793d26a-a132-40db-b8fe-2cf83428b03c","Type":"ContainerStarted","Data":"430cc1d3eb031f5b47880645cef52539e7fddbb6ce93d01366ee79bb4f939f48"} Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.666438 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.668620 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" event={"ID":"75f588d1-7159-4a94-bf89-bb18a880a403","Type":"ContainerStarted","Data":"2bbaec23d9383c386a763fccf911aa294d786ba762625b7c6fc60a23f38cd74f"} Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.671012 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6m2b5" event={"ID":"74d96fe5-1ab9-4703-8717-509cf115d985","Type":"ContainerStarted","Data":"367aad91784a6b28859a3e295f9c5073a2183ead48e88337ffd1746429885746"} Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.671136 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.674822 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" event={"ID":"c548cbba-61a5-4167-b494-f57c45b1599b","Type":"ContainerStarted","Data":"9f94088f6bd2cd278c10c399b0638b3864e723d5c90146f00f4c6887bed59e25"} Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.796470 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" podStartSLOduration=1.75861596 podStartE2EDuration="4.796451609s" podCreationTimestamp="2026-03-08 19:45:41 +0000 UTC" firstStartedPulling="2026-03-08 19:45:42.182959448 +0000 UTC m=+843.579013481" lastFinishedPulling="2026-03-08 19:45:45.220795107 +0000 UTC m=+846.616849130" observedRunningTime="2026-03-08 19:45:45.713095441 +0000 UTC m=+847.109149514" watchObservedRunningTime="2026-03-08 19:45:45.796451609 +0000 UTC m=+847.192505642" Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.797030 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-ftcgc" podStartSLOduration=1.565931415 podStartE2EDuration="4.797019154s" podCreationTimestamp="2026-03-08 19:45:41 +0000 UTC" firstStartedPulling="2026-03-08 19:45:41.983006949 +0000 UTC m=+843.379060972" lastFinishedPulling="2026-03-08 19:45:45.214094678 +0000 UTC m=+846.610148711" observedRunningTime="2026-03-08 19:45:45.79535223 
+0000 UTC m=+847.191406263" watchObservedRunningTime="2026-03-08 19:45:45.797019154 +0000 UTC m=+847.193073187" Mar 08 19:45:45 crc kubenswrapper[4885]: I0308 19:45:45.827443 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-6m2b5" podStartSLOduration=1.368245467 podStartE2EDuration="4.827419783s" podCreationTimestamp="2026-03-08 19:45:41 +0000 UTC" firstStartedPulling="2026-03-08 19:45:41.760682065 +0000 UTC m=+843.156736088" lastFinishedPulling="2026-03-08 19:45:45.219856391 +0000 UTC m=+846.615910404" observedRunningTime="2026-03-08 19:45:45.823348324 +0000 UTC m=+847.219402357" watchObservedRunningTime="2026-03-08 19:45:45.827419783 +0000 UTC m=+847.223473826" Mar 08 19:45:48 crc kubenswrapper[4885]: I0308 19:45:48.697568 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" event={"ID":"75f588d1-7159-4a94-bf89-bb18a880a403","Type":"ContainerStarted","Data":"cdf0d00bccdf86ac023c9a8807db048987db440d45a1dd3d7958509fee8a5ebe"} Mar 08 19:45:48 crc kubenswrapper[4885]: I0308 19:45:48.724377 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-wsk7q" podStartSLOduration=1.171126853 podStartE2EDuration="7.724336902s" podCreationTimestamp="2026-03-08 19:45:41 +0000 UTC" firstStartedPulling="2026-03-08 19:45:41.866075039 +0000 UTC m=+843.262129062" lastFinishedPulling="2026-03-08 19:45:48.419285088 +0000 UTC m=+849.815339111" observedRunningTime="2026-03-08 19:45:48.719537255 +0000 UTC m=+850.115591358" watchObservedRunningTime="2026-03-08 19:45:48.724336902 +0000 UTC m=+850.120390965" Mar 08 19:45:51 crc kubenswrapper[4885]: I0308 19:45:51.752975 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-6m2b5" Mar 08 19:45:52 crc kubenswrapper[4885]: I0308 19:45:52.028999 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:52 crc kubenswrapper[4885]: I0308 19:45:52.029076 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:52 crc kubenswrapper[4885]: I0308 19:45:52.036598 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:52 crc kubenswrapper[4885]: I0308 19:45:52.737760 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b996dd7c9-w96bf" Mar 08 19:45:52 crc kubenswrapper[4885]: I0308 19:45:52.838559 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hsdmw"] Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.142038 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549986-mpsgk"] Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.143965 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549986-mpsgk" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.148619 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.151754 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.152157 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.153155 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549986-mpsgk"] Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.158078 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f7zr\" (UniqueName: \"kubernetes.io/projected/fa96ded3-40b5-4e54-9f54-72f64edfb672-kube-api-access-4f7zr\") pod \"auto-csr-approver-29549986-mpsgk\" (UID: \"fa96ded3-40b5-4e54-9f54-72f64edfb672\") " pod="openshift-infra/auto-csr-approver-29549986-mpsgk" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.260008 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f7zr\" (UniqueName: \"kubernetes.io/projected/fa96ded3-40b5-4e54-9f54-72f64edfb672-kube-api-access-4f7zr\") pod \"auto-csr-approver-29549986-mpsgk\" (UID: \"fa96ded3-40b5-4e54-9f54-72f64edfb672\") " pod="openshift-infra/auto-csr-approver-29549986-mpsgk" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.295642 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f7zr\" (UniqueName: \"kubernetes.io/projected/fa96ded3-40b5-4e54-9f54-72f64edfb672-kube-api-access-4f7zr\") pod \"auto-csr-approver-29549986-mpsgk\" (UID: \"fa96ded3-40b5-4e54-9f54-72f64edfb672\") " pod="openshift-infra/auto-csr-approver-29549986-mpsgk" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.475283 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549986-mpsgk" Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.698255 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549986-mpsgk"] Mar 08 19:46:00 crc kubenswrapper[4885]: W0308 19:46:00.705262 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa96ded3_40b5_4e54_9f54_72f64edfb672.slice/crio-a7da4b63e3604690d19ed5189e07dab32262636684a06bcc6ee68526930dcb8f WatchSource:0}: Error finding container a7da4b63e3604690d19ed5189e07dab32262636684a06bcc6ee68526930dcb8f: Status 404 returned error can't find the container with id a7da4b63e3604690d19ed5189e07dab32262636684a06bcc6ee68526930dcb8f Mar 08 19:46:00 crc kubenswrapper[4885]: I0308 19:46:00.785168 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549986-mpsgk" event={"ID":"fa96ded3-40b5-4e54-9f54-72f64edfb672","Type":"ContainerStarted","Data":"a7da4b63e3604690d19ed5189e07dab32262636684a06bcc6ee68526930dcb8f"} Mar 08 19:46:01 crc kubenswrapper[4885]: I0308 19:46:01.710759 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-bb7k9" Mar 08 19:46:02 crc kubenswrapper[4885]: I0308 19:46:02.801615 4885 generic.go:334] "Generic (PLEG): container finished" podID="fa96ded3-40b5-4e54-9f54-72f64edfb672" containerID="dd28461ac62623fc6ad7ac5f483ad81428e5d2b1b26c821a328a6729559f6fbb" exitCode=0 Mar 08 19:46:02 crc kubenswrapper[4885]: I0308 19:46:02.801779 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549986-mpsgk" event={"ID":"fa96ded3-40b5-4e54-9f54-72f64edfb672","Type":"ContainerDied","Data":"dd28461ac62623fc6ad7ac5f483ad81428e5d2b1b26c821a328a6729559f6fbb"} Mar 08 19:46:02 crc kubenswrapper[4885]: I0308 19:46:02.819855 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:46:02 crc kubenswrapper[4885]: I0308 19:46:02.820044 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:46:04 crc kubenswrapper[4885]: I0308 19:46:04.148677 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549986-mpsgk" Mar 08 19:46:04 crc kubenswrapper[4885]: I0308 19:46:04.335399 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f7zr\" (UniqueName: \"kubernetes.io/projected/fa96ded3-40b5-4e54-9f54-72f64edfb672-kube-api-access-4f7zr\") pod \"fa96ded3-40b5-4e54-9f54-72f64edfb672\" (UID: \"fa96ded3-40b5-4e54-9f54-72f64edfb672\") " Mar 08 19:46:04 crc kubenswrapper[4885]: I0308 19:46:04.345118 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa96ded3-40b5-4e54-9f54-72f64edfb672-kube-api-access-4f7zr" (OuterVolumeSpecName: "kube-api-access-4f7zr") pod "fa96ded3-40b5-4e54-9f54-72f64edfb672" (UID: "fa96ded3-40b5-4e54-9f54-72f64edfb672"). InnerVolumeSpecName "kube-api-access-4f7zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:46:04 crc kubenswrapper[4885]: I0308 19:46:04.437330 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f7zr\" (UniqueName: \"kubernetes.io/projected/fa96ded3-40b5-4e54-9f54-72f64edfb672-kube-api-access-4f7zr\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:04 crc kubenswrapper[4885]: I0308 19:46:04.821054 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549986-mpsgk" event={"ID":"fa96ded3-40b5-4e54-9f54-72f64edfb672","Type":"ContainerDied","Data":"a7da4b63e3604690d19ed5189e07dab32262636684a06bcc6ee68526930dcb8f"} Mar 08 19:46:04 crc kubenswrapper[4885]: I0308 19:46:04.821115 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7da4b63e3604690d19ed5189e07dab32262636684a06bcc6ee68526930dcb8f" Mar 08 19:46:04 crc kubenswrapper[4885]: I0308 19:46:04.821273 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549986-mpsgk" Mar 08 19:46:05 crc kubenswrapper[4885]: I0308 19:46:05.209484 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549980-lx7sw"] Mar 08 19:46:05 crc kubenswrapper[4885]: I0308 19:46:05.224905 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549980-lx7sw"] Mar 08 19:46:05 crc kubenswrapper[4885]: I0308 19:46:05.385768 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffc137d5-821a-406d-8db5-d396d0091991" path="/var/lib/kubelet/pods/ffc137d5-821a-406d-8db5-d396d0091991/volumes" Mar 08 19:46:17 crc kubenswrapper[4885]: I0308 19:46:17.890584 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-hsdmw" podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerName="console" containerID="cri-o://72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2" gracePeriod=15 Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.239162 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr"] Mar 08 19:46:19 crc kubenswrapper[4885]: E0308 19:46:19.239580 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa96ded3-40b5-4e54-9f54-72f64edfb672" containerName="oc" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.239610 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa96ded3-40b5-4e54-9f54-72f64edfb672" containerName="oc" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.239870 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa96ded3-40b5-4e54-9f54-72f64edfb672" containerName="oc" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.241438 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.252272 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.277095 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67ntk\" (UniqueName: \"kubernetes.io/projected/4b43d8cc-1dca-4c13-a0b7-df1371935186-kube-api-access-67ntk\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.277374 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.277421 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.281501 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr"] Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.380699 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.380848 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67ntk\" (UniqueName: \"kubernetes.io/projected/4b43d8cc-1dca-4c13-a0b7-df1371935186-kube-api-access-67ntk\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.380907 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.381523 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.385121 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.411323 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67ntk\" (UniqueName: \"kubernetes.io/projected/4b43d8cc-1dca-4c13-a0b7-df1371935186-kube-api-access-67ntk\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.516047 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hsdmw_bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9/console/0.log" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.516110 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.583421 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-serving-cert\") pod \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.584193 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-config" (OuterVolumeSpecName: "console-config") pod "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" (UID: "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.584159 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-config\") pod \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.585781 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc92s\" (UniqueName: \"kubernetes.io/projected/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-kube-api-access-hc92s\") pod \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.585818 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-oauth-serving-cert\") pod \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.585860 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-trusted-ca-bundle\") pod \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.586365 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" (UID: "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.586644 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" (UID: "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.586712 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-oauth-config\") pod \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.587130 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-service-ca\") pod \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\" (UID: \"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9\") " Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.587850 4885 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.587873 4885 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.587889 4885 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.587968 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-service-ca" (OuterVolumeSpecName: "service-ca") pod "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" (UID: "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.590601 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" (UID: "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.591109 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" (UID: "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.592348 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-kube-api-access-hc92s" (OuterVolumeSpecName: "kube-api-access-hc92s") pod "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" (UID: "bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9"). InnerVolumeSpecName "kube-api-access-hc92s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.640088 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.691263 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc92s\" (UniqueName: \"kubernetes.io/projected/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-kube-api-access-hc92s\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.691320 4885 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.691339 4885 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-service-ca\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.691357 4885 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.881746 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr"] Mar 08 19:46:19 crc kubenswrapper[4885]: W0308 19:46:19.890259 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b43d8cc_1dca_4c13_a0b7_df1371935186.slice/crio-443aa3bf665000a32716872e5018e93dfaa6912377ead92036f5a2ce569d0ed3 WatchSource:0}: Error finding container 443aa3bf665000a32716872e5018e93dfaa6912377ead92036f5a2ce569d0ed3: Status 404 returned error can't find the container with id 443aa3bf665000a32716872e5018e93dfaa6912377ead92036f5a2ce569d0ed3 Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.942369 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hsdmw_bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9/console/0.log" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.942500 4885 generic.go:334] "Generic (PLEG): container finished" podID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerID="72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2" exitCode=2 Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.942734 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hsdmw" event={"ID":"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9","Type":"ContainerDied","Data":"72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2"} Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.942825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hsdmw" event={"ID":"bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9","Type":"ContainerDied","Data":"3f15d4fa347fa026085bfce2e909833cf87859b7db626e5ef40b0541cc513c59"} Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.942855 4885 scope.go:117] "RemoveContainer" containerID="72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.943042 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hsdmw" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.948852 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" event={"ID":"4b43d8cc-1dca-4c13-a0b7-df1371935186","Type":"ContainerStarted","Data":"443aa3bf665000a32716872e5018e93dfaa6912377ead92036f5a2ce569d0ed3"} Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.988787 4885 scope.go:117] "RemoveContainer" containerID="72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2" Mar 08 19:46:19 crc kubenswrapper[4885]: E0308 19:46:19.989770 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2\": container with ID starting with 72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2 not found: ID does not exist" containerID="72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.989852 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2"} err="failed to get container status \"72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2\": rpc error: code = NotFound desc = could not find container \"72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2\": container with ID starting with 72eca1534b36b93279c2d9ade8a85665dee4b6456989b6ad827043848cb1c3a2 not found: ID does not exist" Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.994852 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hsdmw"] Mar 08 19:46:19 crc kubenswrapper[4885]: I0308 19:46:19.999076 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-hsdmw"] Mar 08 19:46:20 crc kubenswrapper[4885]: I0308 19:46:20.960588 4885 generic.go:334] "Generic (PLEG): container finished" podID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerID="c9063d91d4348c09456bc8e3a036e05fb6eaf9aa2d95dc039e13087e50453059" exitCode=0 Mar 08 19:46:20 crc kubenswrapper[4885]: I0308 19:46:20.960675 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" event={"ID":"4b43d8cc-1dca-4c13-a0b7-df1371935186","Type":"ContainerDied","Data":"c9063d91d4348c09456bc8e3a036e05fb6eaf9aa2d95dc039e13087e50453059"} Mar 08 19:46:21 crc kubenswrapper[4885]: I0308 19:46:21.380452 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" path="/var/lib/kubelet/pods/bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9/volumes" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.772832 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p2l28"] Mar 08 19:46:22 crc kubenswrapper[4885]: E0308 19:46:22.775029 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerName="console" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.775054 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerName="console" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.775228 4885 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="bdbb4f97-c9c8-43ef-a4b1-06dea8d6d8b9" containerName="console" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.777884 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.782578 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p2l28"] Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.834846 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-catalog-content\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.834883 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njk5s\" (UniqueName: \"kubernetes.io/projected/c4ddb731-1c00-4dea-80e3-9c2d372916e9-kube-api-access-njk5s\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.834948 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-utilities\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.936260 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-catalog-content\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.936493 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njk5s\" (UniqueName: \"kubernetes.io/projected/c4ddb731-1c00-4dea-80e3-9c2d372916e9-kube-api-access-njk5s\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.936593 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-utilities\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.937082 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-catalog-content\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.937410 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-utilities\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " 
pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.956074 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njk5s\" (UniqueName: \"kubernetes.io/projected/c4ddb731-1c00-4dea-80e3-9c2d372916e9-kube-api-access-njk5s\") pod \"redhat-operators-p2l28\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:22 crc kubenswrapper[4885]: I0308 19:46:22.974732 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" event={"ID":"4b43d8cc-1dca-4c13-a0b7-df1371935186","Type":"ContainerStarted","Data":"fb4dea5d511facd255858e1d6530e40b89604a0811ac12d96a14881c881b100c"} Mar 08 19:46:23 crc kubenswrapper[4885]: I0308 19:46:23.173485 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:23 crc kubenswrapper[4885]: I0308 19:46:23.373806 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p2l28"] Mar 08 19:46:23 crc kubenswrapper[4885]: W0308 19:46:23.378485 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4ddb731_1c00_4dea_80e3_9c2d372916e9.slice/crio-09ef166d444cd5a025be9da10424813ad4749985b0415d1dc750e70588de3801 WatchSource:0}: Error finding container 09ef166d444cd5a025be9da10424813ad4749985b0415d1dc750e70588de3801: Status 404 returned error can't find the container with id 09ef166d444cd5a025be9da10424813ad4749985b0415d1dc750e70588de3801 Mar 08 19:46:23 crc kubenswrapper[4885]: I0308 19:46:23.982051 4885 generic.go:334] "Generic (PLEG): container finished" podID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerID="04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2" exitCode=0 Mar 08 19:46:23 crc kubenswrapper[4885]: I0308 19:46:23.982372 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2l28" event={"ID":"c4ddb731-1c00-4dea-80e3-9c2d372916e9","Type":"ContainerDied","Data":"04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2"} Mar 08 19:46:23 crc kubenswrapper[4885]: I0308 19:46:23.982723 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2l28" event={"ID":"c4ddb731-1c00-4dea-80e3-9c2d372916e9","Type":"ContainerStarted","Data":"09ef166d444cd5a025be9da10424813ad4749985b0415d1dc750e70588de3801"} Mar 08 19:46:23 crc kubenswrapper[4885]: I0308 19:46:23.986265 4885 generic.go:334] "Generic (PLEG): container finished" podID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerID="fb4dea5d511facd255858e1d6530e40b89604a0811ac12d96a14881c881b100c" exitCode=0 Mar 08 19:46:23 crc kubenswrapper[4885]: I0308 19:46:23.986321 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" event={"ID":"4b43d8cc-1dca-4c13-a0b7-df1371935186","Type":"ContainerDied","Data":"fb4dea5d511facd255858e1d6530e40b89604a0811ac12d96a14881c881b100c"} Mar 08 19:46:25 crc kubenswrapper[4885]: I0308 19:46:24.999453 4885 generic.go:334] "Generic (PLEG): container finished" podID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerID="b0d3a913135940ae2cfeb85521a6d631a45734d9847610f1c13fa9dfe3a8cf22" exitCode=0 Mar 08 19:46:25 crc kubenswrapper[4885]: I0308 19:46:24.999585 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" event={"ID":"4b43d8cc-1dca-4c13-a0b7-df1371935186","Type":"ContainerDied","Data":"b0d3a913135940ae2cfeb85521a6d631a45734d9847610f1c13fa9dfe3a8cf22"} Mar 08 19:46:25 crc kubenswrapper[4885]: I0308 19:46:25.002523 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2l28" event={"ID":"c4ddb731-1c00-4dea-80e3-9c2d372916e9","Type":"ContainerStarted","Data":"36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465"} Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.013334 4885 generic.go:334] "Generic (PLEG): container finished" podID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerID="36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465" exitCode=0 Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.013462 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2l28" event={"ID":"c4ddb731-1c00-4dea-80e3-9c2d372916e9","Type":"ContainerDied","Data":"36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465"} Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.346816 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.385149 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67ntk\" (UniqueName: \"kubernetes.io/projected/4b43d8cc-1dca-4c13-a0b7-df1371935186-kube-api-access-67ntk\") pod \"4b43d8cc-1dca-4c13-a0b7-df1371935186\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.385272 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-util\") pod \"4b43d8cc-1dca-4c13-a0b7-df1371935186\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.385363 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-bundle\") pod \"4b43d8cc-1dca-4c13-a0b7-df1371935186\" (UID: \"4b43d8cc-1dca-4c13-a0b7-df1371935186\") " Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.387492 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-bundle" (OuterVolumeSpecName: "bundle") pod "4b43d8cc-1dca-4c13-a0b7-df1371935186" (UID: "4b43d8cc-1dca-4c13-a0b7-df1371935186"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.403033 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-util" (OuterVolumeSpecName: "util") pod "4b43d8cc-1dca-4c13-a0b7-df1371935186" (UID: "4b43d8cc-1dca-4c13-a0b7-df1371935186"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.404610 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b43d8cc-1dca-4c13-a0b7-df1371935186-kube-api-access-67ntk" (OuterVolumeSpecName: "kube-api-access-67ntk") pod "4b43d8cc-1dca-4c13-a0b7-df1371935186" (UID: "4b43d8cc-1dca-4c13-a0b7-df1371935186"). InnerVolumeSpecName "kube-api-access-67ntk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.487675 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67ntk\" (UniqueName: \"kubernetes.io/projected/4b43d8cc-1dca-4c13-a0b7-df1371935186-kube-api-access-67ntk\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.487721 4885 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-util\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:26 crc kubenswrapper[4885]: I0308 19:46:26.487741 4885 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4b43d8cc-1dca-4c13-a0b7-df1371935186-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:27 crc kubenswrapper[4885]: I0308 19:46:27.024692 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2l28" event={"ID":"c4ddb731-1c00-4dea-80e3-9c2d372916e9","Type":"ContainerStarted","Data":"9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2"} Mar 08 19:46:27 crc kubenswrapper[4885]: I0308 19:46:27.027092 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" event={"ID":"4b43d8cc-1dca-4c13-a0b7-df1371935186","Type":"ContainerDied","Data":"443aa3bf665000a32716872e5018e93dfaa6912377ead92036f5a2ce569d0ed3"} Mar 08 19:46:27 crc kubenswrapper[4885]: I0308 19:46:27.027111 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="443aa3bf665000a32716872e5018e93dfaa6912377ead92036f5a2ce569d0ed3" Mar 08 19:46:27 crc kubenswrapper[4885]: I0308 19:46:27.027175 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr" Mar 08 19:46:27 crc kubenswrapper[4885]: I0308 19:46:27.057250 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p2l28" podStartSLOduration=2.573256943 podStartE2EDuration="5.057227828s" podCreationTimestamp="2026-03-08 19:46:22 +0000 UTC" firstStartedPulling="2026-03-08 19:46:23.984729007 +0000 UTC m=+885.380783030" lastFinishedPulling="2026-03-08 19:46:26.468699882 +0000 UTC m=+887.864753915" observedRunningTime="2026-03-08 19:46:27.04640483 +0000 UTC m=+888.442458863" watchObservedRunningTime="2026-03-08 19:46:27.057227828 +0000 UTC m=+888.453281841" Mar 08 19:46:32 crc kubenswrapper[4885]: I0308 19:46:32.819193 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:46:32 crc kubenswrapper[4885]: I0308 19:46:32.819660 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:46:33 crc kubenswrapper[4885]: I0308 19:46:33.174361 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:33 crc kubenswrapper[4885]: I0308 19:46:33.174432 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:34 crc kubenswrapper[4885]: I0308 19:46:34.236737 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p2l28" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="registry-server" probeResult="failure" output=< Mar 08 19:46:34 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 19:46:34 crc kubenswrapper[4885]: > Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.619664 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9"] Mar 08 19:46:38 crc kubenswrapper[4885]: E0308 19:46:38.620171 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerName="extract" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.620186 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerName="extract" Mar 08 19:46:38 crc kubenswrapper[4885]: E0308 19:46:38.620204 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerName="pull" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.620212 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerName="pull" Mar 08 19:46:38 crc kubenswrapper[4885]: E0308 19:46:38.620238 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerName="util" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.620247 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerName="util" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.620368 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b43d8cc-1dca-4c13-a0b7-df1371935186" containerName="extract" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.620790 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.622732 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.623986 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.624000 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-p2cn5" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.623988 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.629068 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.630015 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9"] Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.655407 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-webhook-cert\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.655456 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-apiservice-cert\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.655577 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk7h6\" (UniqueName: \"kubernetes.io/projected/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-kube-api-access-fk7h6\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.757006 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk7h6\" (UniqueName: \"kubernetes.io/projected/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-kube-api-access-fk7h6\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.757135 4885 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-webhook-cert\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.757178 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-apiservice-cert\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.763014 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-apiservice-cert\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.766807 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-webhook-cert\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.771662 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk7h6\" (UniqueName: \"kubernetes.io/projected/ca51bb10-b38d-4e58-9d29-6c6b8922f72e-kube-api-access-fk7h6\") pod \"metallb-operator-controller-manager-6bc5657994-v7mn9\" (UID: \"ca51bb10-b38d-4e58-9d29-6c6b8922f72e\") " pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.853087 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j"] Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.853914 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.856110 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-nnjzn" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.856328 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.858328 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.870541 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j"] Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.938940 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.958914 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdx9v\" (UniqueName: \"kubernetes.io/projected/6ea4545b-278f-43ff-be3c-fc1346b591a1-kube-api-access-vdx9v\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.959290 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6ea4545b-278f-43ff-be3c-fc1346b591a1-webhook-cert\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:38 crc kubenswrapper[4885]: I0308 19:46:38.959312 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6ea4545b-278f-43ff-be3c-fc1346b591a1-apiservice-cert\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.061313 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdx9v\" (UniqueName: \"kubernetes.io/projected/6ea4545b-278f-43ff-be3c-fc1346b591a1-kube-api-access-vdx9v\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.061464 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6ea4545b-278f-43ff-be3c-fc1346b591a1-webhook-cert\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.061506 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6ea4545b-278f-43ff-be3c-fc1346b591a1-apiservice-cert\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.066759 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6ea4545b-278f-43ff-be3c-fc1346b591a1-apiservice-cert\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.067036 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6ea4545b-278f-43ff-be3c-fc1346b591a1-webhook-cert\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " 
pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.092172 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdx9v\" (UniqueName: \"kubernetes.io/projected/6ea4545b-278f-43ff-be3c-fc1346b591a1-kube-api-access-vdx9v\") pod \"metallb-operator-webhook-server-74675b5ddf-p7c2j\" (UID: \"6ea4545b-278f-43ff-be3c-fc1346b591a1\") " pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.167793 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.345841 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9"] Mar 08 19:46:39 crc kubenswrapper[4885]: W0308 19:46:39.356642 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca51bb10_b38d_4e58_9d29_6c6b8922f72e.slice/crio-4274b810292292ea868563154b236b232716fc4104ad17516f38ceb78fa29e9c WatchSource:0}: Error finding container 4274b810292292ea868563154b236b232716fc4104ad17516f38ceb78fa29e9c: Status 404 returned error can't find the container with id 4274b810292292ea868563154b236b232716fc4104ad17516f38ceb78fa29e9c Mar 08 19:46:39 crc kubenswrapper[4885]: I0308 19:46:39.386277 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j"] Mar 08 19:46:40 crc kubenswrapper[4885]: I0308 19:46:40.106358 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" event={"ID":"6ea4545b-278f-43ff-be3c-fc1346b591a1","Type":"ContainerStarted","Data":"5a5644454e0cb65ab10a1601abfaf42ac487a0421086e15d6f95505a84a8b741"} Mar 08 19:46:40 crc kubenswrapper[4885]: I0308 19:46:40.107980 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" event={"ID":"ca51bb10-b38d-4e58-9d29-6c6b8922f72e","Type":"ContainerStarted","Data":"4274b810292292ea868563154b236b232716fc4104ad17516f38ceb78fa29e9c"} Mar 08 19:46:43 crc kubenswrapper[4885]: I0308 19:46:43.246498 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:43 crc kubenswrapper[4885]: I0308 19:46:43.302337 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:43 crc kubenswrapper[4885]: I0308 19:46:43.957316 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p2l28"] Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.139634 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" event={"ID":"6ea4545b-278f-43ff-be3c-fc1346b591a1","Type":"ContainerStarted","Data":"fa61c5941d266f0ede484cbeb0b888dd23222a5f02b756428a17dfbec0c8dadd"} Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.140128 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.145322 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" event={"ID":"ca51bb10-b38d-4e58-9d29-6c6b8922f72e","Type":"ContainerStarted","Data":"c5a21ccc6282e89f02c992eb21583597058be56faa57567ba98547a14429420c"} Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.145520 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.145535 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p2l28" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="registry-server" containerID="cri-o://9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2" gracePeriod=2 Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.167898 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" podStartSLOduration=2.376800235 podStartE2EDuration="7.16786636s" podCreationTimestamp="2026-03-08 19:46:38 +0000 UTC" firstStartedPulling="2026-03-08 19:46:39.395394959 +0000 UTC m=+900.791448982" lastFinishedPulling="2026-03-08 19:46:44.186461084 +0000 UTC m=+905.582515107" observedRunningTime="2026-03-08 19:46:45.163243746 +0000 UTC m=+906.559297799" watchObservedRunningTime="2026-03-08 19:46:45.16786636 +0000 UTC m=+906.563920403" Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.195556 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" podStartSLOduration=2.392141653 podStartE2EDuration="7.195537685s" podCreationTimestamp="2026-03-08 19:46:38 +0000 UTC" firstStartedPulling="2026-03-08 19:46:39.360220513 +0000 UTC m=+900.756274546" lastFinishedPulling="2026-03-08 19:46:44.163616565 +0000 UTC m=+905.559670578" observedRunningTime="2026-03-08 19:46:45.191582961 +0000 UTC m=+906.587636994" watchObservedRunningTime="2026-03-08 19:46:45.195537685 +0000 UTC m=+906.591591708" Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.603939 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.661616 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-catalog-content\") pod \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.661807 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njk5s\" (UniqueName: \"kubernetes.io/projected/c4ddb731-1c00-4dea-80e3-9c2d372916e9-kube-api-access-njk5s\") pod \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.661865 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-utilities\") pod \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\" (UID: \"c4ddb731-1c00-4dea-80e3-9c2d372916e9\") " Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.663011 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-utilities" (OuterVolumeSpecName: "utilities") pod "c4ddb731-1c00-4dea-80e3-9c2d372916e9" (UID: "c4ddb731-1c00-4dea-80e3-9c2d372916e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.668185 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ddb731-1c00-4dea-80e3-9c2d372916e9-kube-api-access-njk5s" (OuterVolumeSpecName: "kube-api-access-njk5s") pod "c4ddb731-1c00-4dea-80e3-9c2d372916e9" (UID: "c4ddb731-1c00-4dea-80e3-9c2d372916e9"). InnerVolumeSpecName "kube-api-access-njk5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.763911 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njk5s\" (UniqueName: \"kubernetes.io/projected/c4ddb731-1c00-4dea-80e3-9c2d372916e9-kube-api-access-njk5s\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.763966 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.801045 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4ddb731-1c00-4dea-80e3-9c2d372916e9" (UID: "c4ddb731-1c00-4dea-80e3-9c2d372916e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.819460 4885 scope.go:117] "RemoveContainer" containerID="6b3edc0ab6930c447e72d1b9e0e05c67fcbcc8c8cb108ba4b449e1f4acc1e00e" Mar 08 19:46:45 crc kubenswrapper[4885]: I0308 19:46:45.864633 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4ddb731-1c00-4dea-80e3-9c2d372916e9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.154424 4885 generic.go:334] "Generic (PLEG): container finished" podID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerID="9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2" exitCode=0 Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.154490 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p2l28" Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.154574 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2l28" event={"ID":"c4ddb731-1c00-4dea-80e3-9c2d372916e9","Type":"ContainerDied","Data":"9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2"} Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.154822 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p2l28" event={"ID":"c4ddb731-1c00-4dea-80e3-9c2d372916e9","Type":"ContainerDied","Data":"09ef166d444cd5a025be9da10424813ad4749985b0415d1dc750e70588de3801"} Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.154870 4885 scope.go:117] "RemoveContainer" containerID="9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2" Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.178045 4885 scope.go:117] "RemoveContainer" containerID="36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465" Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.193097 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p2l28"] Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.206243 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p2l28"] Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.224297 4885 scope.go:117] "RemoveContainer" containerID="04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2" Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.251167 4885 scope.go:117] "RemoveContainer" containerID="9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2" Mar 08 19:46:46 crc kubenswrapper[4885]: E0308 19:46:46.251556 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2\": container with ID starting with 9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2 not found: ID does not exist" containerID="9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2" Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.251588 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2"} err="failed to get container status \"9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2\": rpc error: code = NotFound desc = could not find container 
\"9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2\": container with ID starting with 9b2f2d9b7aecac0f498c6ff3673f9a97fb9f1cb481305148db4ff75aded671b2 not found: ID does not exist" Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.251609 4885 scope.go:117] "RemoveContainer" containerID="36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465" Mar 08 19:46:46 crc kubenswrapper[4885]: E0308 19:46:46.251812 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465\": container with ID starting with 36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465 not found: ID does not exist" containerID="36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465" Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.251833 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465"} err="failed to get container status \"36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465\": rpc error: code = NotFound desc = could not find container \"36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465\": container with ID starting with 36a6129e2f5dfd90e1edf27c5907915fac7c8f8099c6608fb794bfedfcb78465 not found: ID does not exist" Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.251844 4885 scope.go:117] "RemoveContainer" containerID="04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2" Mar 08 19:46:46 crc kubenswrapper[4885]: E0308 19:46:46.252110 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2\": container with ID starting with 04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2 not found: ID does not exist" containerID="04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2" Mar 08 19:46:46 crc kubenswrapper[4885]: I0308 19:46:46.252150 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2"} err="failed to get container status \"04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2\": rpc error: code = NotFound desc = could not find container \"04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2\": container with ID starting with 04331755fc46740de59b20c20d8d7748df0e376d4e9a28d2eee40abf5dd953f2 not found: ID does not exist" Mar 08 19:46:47 crc kubenswrapper[4885]: I0308 19:46:47.381302 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" path="/var/lib/kubelet/pods/c4ddb731-1c00-4dea-80e3-9c2d372916e9/volumes" Mar 08 19:46:59 crc kubenswrapper[4885]: I0308 19:46:59.179778 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-74675b5ddf-p7c2j" Mar 08 19:47:02 crc kubenswrapper[4885]: I0308 19:47:02.818723 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:47:02 crc kubenswrapper[4885]: I0308 19:47:02.819169 4885 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:47:02 crc kubenswrapper[4885]: I0308 19:47:02.819278 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:47:02 crc kubenswrapper[4885]: I0308 19:47:02.820027 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f94b502e469fe218787b8101e45951a2dfe1f5fc0bc5b2cb2e8b55561aeaabb2"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 19:47:02 crc kubenswrapper[4885]: I0308 19:47:02.820118 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://f94b502e469fe218787b8101e45951a2dfe1f5fc0bc5b2cb2e8b55561aeaabb2" gracePeriod=600 Mar 08 19:47:03 crc kubenswrapper[4885]: I0308 19:47:03.268470 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="f94b502e469fe218787b8101e45951a2dfe1f5fc0bc5b2cb2e8b55561aeaabb2" exitCode=0 Mar 08 19:47:03 crc kubenswrapper[4885]: I0308 19:47:03.268515 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"f94b502e469fe218787b8101e45951a2dfe1f5fc0bc5b2cb2e8b55561aeaabb2"} Mar 08 19:47:03 crc kubenswrapper[4885]: I0308 19:47:03.268758 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"e6dd4ce3180e7f84da70c69f276b3e39a0d5b0c2aeeabe5c8a51dafdbeafb374"} Mar 08 19:47:03 crc kubenswrapper[4885]: I0308 19:47:03.268778 4885 scope.go:117] "RemoveContainer" containerID="efa4fb8889b5532d28bec018d7816a61ed0ea017834833ffed5162643372e98f" Mar 08 19:47:18 crc kubenswrapper[4885]: I0308 19:47:18.942684 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6bc5657994-v7mn9" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.884148 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-hq28v"] Mar 08 19:47:19 crc kubenswrapper[4885]: E0308 19:47:19.884453 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="extract-utilities" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.884486 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="extract-utilities" Mar 08 19:47:19 crc kubenswrapper[4885]: E0308 19:47:19.884518 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="extract-content" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.884530 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="extract-content" Mar 08 19:47:19 crc kubenswrapper[4885]: E0308 19:47:19.884544 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="registry-server" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.884555 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="registry-server" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.884736 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ddb731-1c00-4dea-80e3-9c2d372916e9" containerName="registry-server" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.887708 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.892471 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-8q54c" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.893133 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.894235 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg"] Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.894840 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.896696 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.899762 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.912494 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg"] Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944392 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dca42faa-df32-44b5-99e8-109120aa36a1-frr-startup\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944435 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca42faa-df32-44b5-99e8-109120aa36a1-metrics-certs\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944461 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-frr-sockets\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944475 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np6n6\" (UniqueName: \"kubernetes.io/projected/dca42faa-df32-44b5-99e8-109120aa36a1-kube-api-access-np6n6\") pod \"frr-k8s-hq28v\" (UID: 
\"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944495 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt7wq\" (UniqueName: \"kubernetes.io/projected/e76b0259-0d11-4451-b770-4ca5611ce32e-kube-api-access-wt7wq\") pod \"frr-k8s-webhook-server-7f989f654f-cq6xg\" (UID: \"e76b0259-0d11-4451-b770-4ca5611ce32e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944542 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e76b0259-0d11-4451-b770-4ca5611ce32e-cert\") pod \"frr-k8s-webhook-server-7f989f654f-cq6xg\" (UID: \"e76b0259-0d11-4451-b770-4ca5611ce32e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944565 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-frr-conf\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944582 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-metrics\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.944605 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-reloader\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.985487 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-5nclk"] Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.987384 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5nclk" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.991644 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.991894 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-gmznl" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.992062 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.992163 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 08 19:47:19 crc kubenswrapper[4885]: I0308 19:47:19.998800 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-xj2vs"] Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.004152 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.007786 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.038052 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-xj2vs"] Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045603 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/860f2bc3-9bd4-43c5-9400-67293a877c6f-metallb-excludel2\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045647 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e76b0259-0d11-4451-b770-4ca5611ce32e-cert\") pod \"frr-k8s-webhook-server-7f989f654f-cq6xg\" (UID: \"e76b0259-0d11-4451-b770-4ca5611ce32e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045677 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grlhg\" (UniqueName: \"kubernetes.io/projected/860f2bc3-9bd4-43c5-9400-67293a877c6f-kube-api-access-grlhg\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045699 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-frr-conf\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045717 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-metrics\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045741 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-reloader\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045779 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-metrics-certs\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045802 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-cert\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045818 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dca42faa-df32-44b5-99e8-109120aa36a1-frr-startup\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045834 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9brx\" (UniqueName: \"kubernetes.io/projected/6d11a8df-ce5d-404a-b827-822101b061c8-kube-api-access-k9brx\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045851 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca42faa-df32-44b5-99e8-109120aa36a1-metrics-certs\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045871 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.045976 4885 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.046029 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dca42faa-df32-44b5-99e8-109120aa36a1-metrics-certs podName:dca42faa-df32-44b5-99e8-109120aa36a1 nodeName:}" failed. No retries permitted until 2026-03-08 19:47:20.546011571 +0000 UTC m=+941.942065584 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dca42faa-df32-44b5-99e8-109120aa36a1-metrics-certs") pod "frr-k8s-hq28v" (UID: "dca42faa-df32-44b5-99e8-109120aa36a1") : secret "frr-k8s-certs-secret" not found Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.045973 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-np6n6\" (UniqueName: \"kubernetes.io/projected/dca42faa-df32-44b5-99e8-109120aa36a1-kube-api-access-np6n6\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.046177 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-frr-conf\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.046253 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-reloader\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.046298 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-metrics\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.046759 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/dca42faa-df32-44b5-99e8-109120aa36a1-frr-startup\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.047086 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-frr-sockets\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.046906 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/dca42faa-df32-44b5-99e8-109120aa36a1-frr-sockets\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.047144 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt7wq\" (UniqueName: \"kubernetes.io/projected/e76b0259-0d11-4451-b770-4ca5611ce32e-kube-api-access-wt7wq\") pod \"frr-k8s-webhook-server-7f989f654f-cq6xg\" (UID: \"e76b0259-0d11-4451-b770-4ca5611ce32e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.047433 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-metrics-certs\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:20 crc 
kubenswrapper[4885]: I0308 19:47:20.051635 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e76b0259-0d11-4451-b770-4ca5611ce32e-cert\") pod \"frr-k8s-webhook-server-7f989f654f-cq6xg\" (UID: \"e76b0259-0d11-4451-b770-4ca5611ce32e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.078280 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt7wq\" (UniqueName: \"kubernetes.io/projected/e76b0259-0d11-4451-b770-4ca5611ce32e-kube-api-access-wt7wq\") pod \"frr-k8s-webhook-server-7f989f654f-cq6xg\" (UID: \"e76b0259-0d11-4451-b770-4ca5611ce32e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.078939 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-np6n6\" (UniqueName: \"kubernetes.io/projected/dca42faa-df32-44b5-99e8-109120aa36a1-kube-api-access-np6n6\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.148539 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/860f2bc3-9bd4-43c5-9400-67293a877c6f-metallb-excludel2\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.148587 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grlhg\" (UniqueName: \"kubernetes.io/projected/860f2bc3-9bd4-43c5-9400-67293a877c6f-kube-api-access-grlhg\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.148630 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-metrics-certs\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.148651 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-cert\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.148670 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9brx\" (UniqueName: \"kubernetes.io/projected/6d11a8df-ce5d-404a-b827-822101b061c8-kube-api-access-k9brx\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.148712 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.148742 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-metrics-certs\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.148773 4885 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.148843 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-metrics-certs podName:6d11a8df-ce5d-404a-b827-822101b061c8 nodeName:}" failed. No retries permitted until 2026-03-08 19:47:20.648824097 +0000 UTC m=+942.044878120 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-metrics-certs") pod "controller-86ddb6bd46-xj2vs" (UID: "6d11a8df-ce5d-404a-b827-822101b061c8") : secret "controller-certs-secret" not found Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.149257 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/860f2bc3-9bd4-43c5-9400-67293a877c6f-metallb-excludel2\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.149285 4885 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.149310 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist podName:860f2bc3-9bd4-43c5-9400-67293a877c6f nodeName:}" failed. No retries permitted until 2026-03-08 19:47:20.64930259 +0000 UTC m=+942.045356613 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist") pod "speaker-5nclk" (UID: "860f2bc3-9bd4-43c5-9400-67293a877c6f") : secret "metallb-memberlist" not found Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.151242 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-metrics-certs\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.151411 4885 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.161826 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-cert\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.167348 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grlhg\" (UniqueName: \"kubernetes.io/projected/860f2bc3-9bd4-43c5-9400-67293a877c6f-kube-api-access-grlhg\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.167498 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9brx\" (UniqueName: \"kubernetes.io/projected/6d11a8df-ce5d-404a-b827-822101b061c8-kube-api-access-k9brx\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.224151 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.555706 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca42faa-df32-44b5-99e8-109120aa36a1-metrics-certs\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.562175 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dca42faa-df32-44b5-99e8-109120aa36a1-metrics-certs\") pod \"frr-k8s-hq28v\" (UID: \"dca42faa-df32-44b5-99e8-109120aa36a1\") " pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.619727 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg"] Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.627977 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.657010 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-metrics-certs\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.657106 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.657311 4885 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 08 19:47:20 crc kubenswrapper[4885]: E0308 19:47:20.657385 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist podName:860f2bc3-9bd4-43c5-9400-67293a877c6f nodeName:}" failed. No retries permitted until 2026-03-08 19:47:21.657355413 +0000 UTC m=+943.053409446 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist") pod "speaker-5nclk" (UID: "860f2bc3-9bd4-43c5-9400-67293a877c6f") : secret "metallb-memberlist" not found Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.662664 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d11a8df-ce5d-404a-b827-822101b061c8-metrics-certs\") pod \"controller-86ddb6bd46-xj2vs\" (UID: \"6d11a8df-ce5d-404a-b827-822101b061c8\") " pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.814265 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:20 crc kubenswrapper[4885]: I0308 19:47:20.935430 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:21 crc kubenswrapper[4885]: I0308 19:47:21.236352 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-xj2vs"] Mar 08 19:47:21 crc kubenswrapper[4885]: I0308 19:47:21.406110 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" event={"ID":"e76b0259-0d11-4451-b770-4ca5611ce32e","Type":"ContainerStarted","Data":"5e79837bf94d566e49bb58b39e4fa2d9ddddb3b9673d6d4358192998839c5ad8"} Mar 08 19:47:21 crc kubenswrapper[4885]: I0308 19:47:21.408414 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-xj2vs" event={"ID":"6d11a8df-ce5d-404a-b827-822101b061c8","Type":"ContainerStarted","Data":"0255e84f68df5481fa8e5065c4a980ae8f7a2ddd3eb48866a76c63671eb9b18f"} Mar 08 19:47:21 crc kubenswrapper[4885]: I0308 19:47:21.410576 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerStarted","Data":"ea5c4b87c03fc4c39c57a7d50e6b1963821843539a8fca80b64cdedf268fee48"} Mar 08 19:47:21 crc kubenswrapper[4885]: I0308 19:47:21.671775 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:21 crc kubenswrapper[4885]: I0308 19:47:21.681063 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/860f2bc3-9bd4-43c5-9400-67293a877c6f-memberlist\") pod \"speaker-5nclk\" (UID: \"860f2bc3-9bd4-43c5-9400-67293a877c6f\") " pod="metallb-system/speaker-5nclk" Mar 08 19:47:21 crc kubenswrapper[4885]: I0308 19:47:21.800283 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-5nclk" Mar 08 19:47:21 crc kubenswrapper[4885]: W0308 19:47:21.829866 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod860f2bc3_9bd4_43c5_9400_67293a877c6f.slice/crio-261a7b8f04d3d8e551b4df8216380dadf0ea82b61f5a7a7fb76d193327540e11 WatchSource:0}: Error finding container 261a7b8f04d3d8e551b4df8216380dadf0ea82b61f5a7a7fb76d193327540e11: Status 404 returned error can't find the container with id 261a7b8f04d3d8e551b4df8216380dadf0ea82b61f5a7a7fb76d193327540e11 Mar 08 19:47:22 crc kubenswrapper[4885]: I0308 19:47:22.441957 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-xj2vs" event={"ID":"6d11a8df-ce5d-404a-b827-822101b061c8","Type":"ContainerStarted","Data":"ca1dc8c263429c96ed7f0d4d929659bb9b5e74235928a34fd759f947d802a37e"} Mar 08 19:47:22 crc kubenswrapper[4885]: I0308 19:47:22.442013 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-xj2vs" event={"ID":"6d11a8df-ce5d-404a-b827-822101b061c8","Type":"ContainerStarted","Data":"ce97e51918924c94c787223eaf8ff957bf30e42af44b6cdd35dea62a60f75a93"} Mar 08 19:47:22 crc kubenswrapper[4885]: I0308 19:47:22.442051 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:22 crc kubenswrapper[4885]: I0308 19:47:22.445397 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5nclk" event={"ID":"860f2bc3-9bd4-43c5-9400-67293a877c6f","Type":"ContainerStarted","Data":"a8130f753791d0cc0b174b7605993009fcb0fa02b52f04ee66b350799ae716d5"} Mar 08 19:47:22 crc kubenswrapper[4885]: I0308 19:47:22.445441 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5nclk" event={"ID":"860f2bc3-9bd4-43c5-9400-67293a877c6f","Type":"ContainerStarted","Data":"261a7b8f04d3d8e551b4df8216380dadf0ea82b61f5a7a7fb76d193327540e11"} Mar 08 19:47:22 crc kubenswrapper[4885]: I0308 19:47:22.462014 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-xj2vs" podStartSLOduration=3.461996927 podStartE2EDuration="3.461996927s" podCreationTimestamp="2026-03-08 19:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:47:22.460050965 +0000 UTC m=+943.856104978" watchObservedRunningTime="2026-03-08 19:47:22.461996927 +0000 UTC m=+943.858050950" Mar 08 19:47:23 crc kubenswrapper[4885]: I0308 19:47:23.453244 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5nclk" event={"ID":"860f2bc3-9bd4-43c5-9400-67293a877c6f","Type":"ContainerStarted","Data":"58b5727749bb1effcca8bf3b9b3b818ecf8742ae681d8b8b7c533b3ffc0a8214"} Mar 08 19:47:23 crc kubenswrapper[4885]: I0308 19:47:23.473307 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-5nclk" podStartSLOduration=4.473287764 podStartE2EDuration="4.473287764s" podCreationTimestamp="2026-03-08 19:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:47:23.466950315 +0000 UTC m=+944.863004338" watchObservedRunningTime="2026-03-08 19:47:23.473287764 +0000 UTC m=+944.869341777" Mar 08 19:47:24 crc kubenswrapper[4885]: I0308 19:47:24.462200 4885 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-5nclk" Mar 08 19:47:28 crc kubenswrapper[4885]: I0308 19:47:28.490671 4885 generic.go:334] "Generic (PLEG): container finished" podID="dca42faa-df32-44b5-99e8-109120aa36a1" containerID="76efcaf109695f1158fd41ebd4ca91e20931270214257e11b962dd8c87b3ae0e" exitCode=0 Mar 08 19:47:28 crc kubenswrapper[4885]: I0308 19:47:28.490783 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerDied","Data":"76efcaf109695f1158fd41ebd4ca91e20931270214257e11b962dd8c87b3ae0e"} Mar 08 19:47:28 crc kubenswrapper[4885]: I0308 19:47:28.494372 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" event={"ID":"e76b0259-0d11-4451-b770-4ca5611ce32e","Type":"ContainerStarted","Data":"3ac7cb82323bfaee2f9985c6c374fdc9aaae7282b3ffa4a29c875273d7e45495"} Mar 08 19:47:28 crc kubenswrapper[4885]: I0308 19:47:28.494582 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" Mar 08 19:47:28 crc kubenswrapper[4885]: I0308 19:47:28.564796 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" podStartSLOduration=2.487230511 podStartE2EDuration="9.564766163s" podCreationTimestamp="2026-03-08 19:47:19 +0000 UTC" firstStartedPulling="2026-03-08 19:47:20.627691583 +0000 UTC m=+942.023745616" lastFinishedPulling="2026-03-08 19:47:27.705227215 +0000 UTC m=+949.101281268" observedRunningTime="2026-03-08 19:47:28.553794441 +0000 UTC m=+949.949848494" watchObservedRunningTime="2026-03-08 19:47:28.564766163 +0000 UTC m=+949.960820226" Mar 08 19:47:29 crc kubenswrapper[4885]: I0308 19:47:29.504037 4885 generic.go:334] "Generic (PLEG): container finished" podID="dca42faa-df32-44b5-99e8-109120aa36a1" containerID="1d041f1c1898be24726f2ddc9872df824959a0536e7c1621811da9e12f86f007" exitCode=0 Mar 08 19:47:29 crc kubenswrapper[4885]: I0308 19:47:29.504135 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerDied","Data":"1d041f1c1898be24726f2ddc9872df824959a0536e7c1621811da9e12f86f007"} Mar 08 19:47:30 crc kubenswrapper[4885]: I0308 19:47:30.514663 4885 generic.go:334] "Generic (PLEG): container finished" podID="dca42faa-df32-44b5-99e8-109120aa36a1" containerID="f0a83d4c5ffbdd02c07df0d57d046fe2cb38681ba58744360a06633b6678ddc7" exitCode=0 Mar 08 19:47:30 crc kubenswrapper[4885]: I0308 19:47:30.514772 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerDied","Data":"f0a83d4c5ffbdd02c07df0d57d046fe2cb38681ba58744360a06633b6678ddc7"} Mar 08 19:47:31 crc kubenswrapper[4885]: I0308 19:47:31.528701 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerStarted","Data":"e43431589bddbd576a18965d13e24f51aa05e878b1b4d6351b86a3b257d4fe5e"} Mar 08 19:47:31 crc kubenswrapper[4885]: I0308 19:47:31.529062 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerStarted","Data":"ba601f4d7aa0f9f436a5837c93cb62947192f2e21fb9c00ec686129ebec5c31f"} Mar 08 19:47:31 crc 
kubenswrapper[4885]: I0308 19:47:31.529085 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerStarted","Data":"1e2d71c2217281200b1510b4f8a085bf2348e7cd278d485145f7f45c51c1ac56"} Mar 08 19:47:31 crc kubenswrapper[4885]: I0308 19:47:31.529105 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerStarted","Data":"f676e00218c4c463b862070415b45a4fd7030b76ce4f96565cc6162d8317103f"} Mar 08 19:47:31 crc kubenswrapper[4885]: I0308 19:47:31.529121 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerStarted","Data":"4a3729bde3dd2f058b64fea8a20289be12e31438a2767e84c7fe256b1dae9e05"} Mar 08 19:47:32 crc kubenswrapper[4885]: I0308 19:47:32.542957 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hq28v" event={"ID":"dca42faa-df32-44b5-99e8-109120aa36a1","Type":"ContainerStarted","Data":"dc13dca89cbe4b763e3f991f24bccdec1d3da629401113525eb14a91175de89a"} Mar 08 19:47:32 crc kubenswrapper[4885]: I0308 19:47:32.543243 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:32 crc kubenswrapper[4885]: I0308 19:47:32.572559 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-hq28v" podStartSLOduration=6.878159604 podStartE2EDuration="13.572536768s" podCreationTimestamp="2026-03-08 19:47:19 +0000 UTC" firstStartedPulling="2026-03-08 19:47:20.985892767 +0000 UTC m=+942.381946830" lastFinishedPulling="2026-03-08 19:47:27.680269971 +0000 UTC m=+949.076323994" observedRunningTime="2026-03-08 19:47:32.569523947 +0000 UTC m=+953.965578010" watchObservedRunningTime="2026-03-08 19:47:32.572536768 +0000 UTC m=+953.968590821" Mar 08 19:47:35 crc kubenswrapper[4885]: I0308 19:47:35.814753 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:35 crc kubenswrapper[4885]: I0308 19:47:35.865325 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.066034 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tp6kf"] Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.070545 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.076797 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tp6kf"] Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.091583 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-utilities\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.091681 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-catalog-content\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.091790 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpnqt\" (UniqueName: \"kubernetes.io/projected/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-kube-api-access-zpnqt\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.193136 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-catalog-content\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.193300 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpnqt\" (UniqueName: \"kubernetes.io/projected/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-kube-api-access-zpnqt\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.193373 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-utilities\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.193628 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-catalog-content\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.193842 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-utilities\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.216521 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zpnqt\" (UniqueName: \"kubernetes.io/projected/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-kube-api-access-zpnqt\") pod \"redhat-marketplace-tp6kf\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.402188 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:36 crc kubenswrapper[4885]: I0308 19:47:36.694818 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tp6kf"] Mar 08 19:47:37 crc kubenswrapper[4885]: I0308 19:47:37.587598 4885 generic.go:334] "Generic (PLEG): container finished" podID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerID="18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa" exitCode=0 Mar 08 19:47:37 crc kubenswrapper[4885]: I0308 19:47:37.587677 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tp6kf" event={"ID":"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0","Type":"ContainerDied","Data":"18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa"} Mar 08 19:47:37 crc kubenswrapper[4885]: I0308 19:47:37.587892 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tp6kf" event={"ID":"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0","Type":"ContainerStarted","Data":"e808f25e95174e48eb13df958aaeaad16e03d9d613887ec3abec24654bd6c920"} Mar 08 19:47:38 crc kubenswrapper[4885]: I0308 19:47:38.596331 4885 generic.go:334] "Generic (PLEG): container finished" podID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerID="f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5" exitCode=0 Mar 08 19:47:38 crc kubenswrapper[4885]: I0308 19:47:38.596372 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tp6kf" event={"ID":"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0","Type":"ContainerDied","Data":"f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5"} Mar 08 19:47:39 crc kubenswrapper[4885]: I0308 19:47:39.608478 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tp6kf" event={"ID":"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0","Type":"ContainerStarted","Data":"793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d"} Mar 08 19:47:39 crc kubenswrapper[4885]: I0308 19:47:39.648241 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tp6kf" podStartSLOduration=2.183537445 podStartE2EDuration="3.648094097s" podCreationTimestamp="2026-03-08 19:47:36 +0000 UTC" firstStartedPulling="2026-03-08 19:47:37.589825193 +0000 UTC m=+958.985879246" lastFinishedPulling="2026-03-08 19:47:39.054381835 +0000 UTC m=+960.450435898" observedRunningTime="2026-03-08 19:47:39.643041593 +0000 UTC m=+961.039095626" watchObservedRunningTime="2026-03-08 19:47:39.648094097 +0000 UTC m=+961.044148160" Mar 08 19:47:40 crc kubenswrapper[4885]: I0308 19:47:40.230363 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-cq6xg" Mar 08 19:47:40 crc kubenswrapper[4885]: I0308 19:47:40.817221 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-hq28v" Mar 08 19:47:40 crc kubenswrapper[4885]: I0308 19:47:40.941488 4885 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-xj2vs" Mar 08 19:47:41 crc kubenswrapper[4885]: I0308 19:47:41.804897 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-5nclk" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.479752 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w"] Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.481076 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.483534 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.502064 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcrnw\" (UniqueName: \"kubernetes.io/projected/a5bcb33a-118a-438a-86f5-467399e36ddb-kube-api-access-zcrnw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.502140 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.502199 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.507557 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w"] Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.603422 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcrnw\" (UniqueName: \"kubernetes.io/projected/a5bcb33a-118a-438a-86f5-467399e36ddb-kube-api-access-zcrnw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.603593 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.603704 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.604463 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.604527 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.645454 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcrnw\" (UniqueName: \"kubernetes.io/projected/a5bcb33a-118a-438a-86f5-467399e36ddb-kube-api-access-zcrnw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:43 crc kubenswrapper[4885]: I0308 19:47:43.815966 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:44 crc kubenswrapper[4885]: I0308 19:47:44.146675 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w"] Mar 08 19:47:44 crc kubenswrapper[4885]: W0308 19:47:44.152693 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5bcb33a_118a_438a_86f5_467399e36ddb.slice/crio-ef485315c5dbabdb36ca01d10a9939fc314f7802a7468b99ea64ba7870407115 WatchSource:0}: Error finding container ef485315c5dbabdb36ca01d10a9939fc314f7802a7468b99ea64ba7870407115: Status 404 returned error can't find the container with id ef485315c5dbabdb36ca01d10a9939fc314f7802a7468b99ea64ba7870407115 Mar 08 19:47:44 crc kubenswrapper[4885]: I0308 19:47:44.658185 4885 generic.go:334] "Generic (PLEG): container finished" podID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerID="ada4fe3f29671378ed74f7751b862f298d82e4bfe3218bda000ee6b7d051019e" exitCode=0 Mar 08 19:47:44 crc kubenswrapper[4885]: I0308 19:47:44.658240 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" event={"ID":"a5bcb33a-118a-438a-86f5-467399e36ddb","Type":"ContainerDied","Data":"ada4fe3f29671378ed74f7751b862f298d82e4bfe3218bda000ee6b7d051019e"} Mar 08 19:47:44 crc kubenswrapper[4885]: I0308 19:47:44.658273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" event={"ID":"a5bcb33a-118a-438a-86f5-467399e36ddb","Type":"ContainerStarted","Data":"ef485315c5dbabdb36ca01d10a9939fc314f7802a7468b99ea64ba7870407115"} Mar 08 19:47:46 crc kubenswrapper[4885]: I0308 19:47:46.403779 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:46 crc kubenswrapper[4885]: I0308 19:47:46.404066 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:46 crc kubenswrapper[4885]: I0308 19:47:46.467777 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:46 crc kubenswrapper[4885]: I0308 19:47:46.713083 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:48 crc kubenswrapper[4885]: I0308 19:47:48.690209 4885 generic.go:334] "Generic (PLEG): container finished" podID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerID="6a17b11e7389ca9fe5329be47b679e399b863f15c11231cd63043f842d203a31" exitCode=0 Mar 08 19:47:48 crc kubenswrapper[4885]: I0308 19:47:48.690307 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" event={"ID":"a5bcb33a-118a-438a-86f5-467399e36ddb","Type":"ContainerDied","Data":"6a17b11e7389ca9fe5329be47b679e399b863f15c11231cd63043f842d203a31"} Mar 08 19:47:48 crc kubenswrapper[4885]: I0308 19:47:48.833590 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tp6kf"] Mar 08 19:47:48 crc kubenswrapper[4885]: I0308 19:47:48.833963 4885 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-tp6kf" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="registry-server" containerID="cri-o://793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d" gracePeriod=2 Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.359289 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.499423 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpnqt\" (UniqueName: \"kubernetes.io/projected/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-kube-api-access-zpnqt\") pod \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.499512 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-utilities\") pod \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.499641 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-catalog-content\") pod \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\" (UID: \"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0\") " Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.501416 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-utilities" (OuterVolumeSpecName: "utilities") pod "fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" (UID: "fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.505832 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-kube-api-access-zpnqt" (OuterVolumeSpecName: "kube-api-access-zpnqt") pod "fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" (UID: "fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0"). InnerVolumeSpecName "kube-api-access-zpnqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.547701 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" (UID: "fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.601378 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpnqt\" (UniqueName: \"kubernetes.io/projected/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-kube-api-access-zpnqt\") on node \"crc\" DevicePath \"\"" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.601434 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.601452 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.702896 4885 generic.go:334] "Generic (PLEG): container finished" podID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerID="793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d" exitCode=0 Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.703058 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tp6kf" event={"ID":"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0","Type":"ContainerDied","Data":"793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d"} Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.703134 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tp6kf" event={"ID":"fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0","Type":"ContainerDied","Data":"e808f25e95174e48eb13df958aaeaad16e03d9d613887ec3abec24654bd6c920"} Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.703189 4885 scope.go:117] "RemoveContainer" containerID="793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.703646 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tp6kf" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.709091 4885 generic.go:334] "Generic (PLEG): container finished" podID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerID="34f80d156a9821d5b0ff15771041d9780c29ee9ec79de322687e5239864c0cb2" exitCode=0 Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.709156 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" event={"ID":"a5bcb33a-118a-438a-86f5-467399e36ddb","Type":"ContainerDied","Data":"34f80d156a9821d5b0ff15771041d9780c29ee9ec79de322687e5239864c0cb2"} Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.746743 4885 scope.go:117] "RemoveContainer" containerID="f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.776211 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tp6kf"] Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.781766 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tp6kf"] Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.798204 4885 scope.go:117] "RemoveContainer" containerID="18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.817611 4885 scope.go:117] "RemoveContainer" containerID="793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d" Mar 08 19:47:49 crc kubenswrapper[4885]: E0308 19:47:49.818222 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d\": container with ID starting with 793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d not found: ID does not exist" containerID="793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.818273 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d"} err="failed to get container status \"793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d\": rpc error: code = NotFound desc = could not find container \"793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d\": container with ID starting with 793d3e2634dcd42434ba8c8ea863be9941edea2901e5598870bfdacfc4600f4d not found: ID does not exist" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.818307 4885 scope.go:117] "RemoveContainer" containerID="f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5" Mar 08 19:47:49 crc kubenswrapper[4885]: E0308 19:47:49.818797 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5\": container with ID starting with f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5 not found: ID does not exist" containerID="f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.819041 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5"} err="failed to get container status 
\"f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5\": rpc error: code = NotFound desc = could not find container \"f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5\": container with ID starting with f457f4c7cdbd2edc0c19b0d5131345b15785dfb68f32097db59357bd0691ded5 not found: ID does not exist" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.819214 4885 scope.go:117] "RemoveContainer" containerID="18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa" Mar 08 19:47:49 crc kubenswrapper[4885]: E0308 19:47:49.819565 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa\": container with ID starting with 18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa not found: ID does not exist" containerID="18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa" Mar 08 19:47:49 crc kubenswrapper[4885]: I0308 19:47:49.819587 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa"} err="failed to get container status \"18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa\": rpc error: code = NotFound desc = could not find container \"18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa\": container with ID starting with 18a91d53d440f66af5a6e95afd0d8676e91a705e7d40bd94bdd07aa51b3023fa not found: ID does not exist" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.073513 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.230196 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-bundle\") pod \"a5bcb33a-118a-438a-86f5-467399e36ddb\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.230266 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-util\") pod \"a5bcb33a-118a-438a-86f5-467399e36ddb\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.230356 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcrnw\" (UniqueName: \"kubernetes.io/projected/a5bcb33a-118a-438a-86f5-467399e36ddb-kube-api-access-zcrnw\") pod \"a5bcb33a-118a-438a-86f5-467399e36ddb\" (UID: \"a5bcb33a-118a-438a-86f5-467399e36ddb\") " Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.231469 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-bundle" (OuterVolumeSpecName: "bundle") pod "a5bcb33a-118a-438a-86f5-467399e36ddb" (UID: "a5bcb33a-118a-438a-86f5-467399e36ddb"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.237271 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5bcb33a-118a-438a-86f5-467399e36ddb-kube-api-access-zcrnw" (OuterVolumeSpecName: "kube-api-access-zcrnw") pod "a5bcb33a-118a-438a-86f5-467399e36ddb" (UID: "a5bcb33a-118a-438a-86f5-467399e36ddb"). InnerVolumeSpecName "kube-api-access-zcrnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.240613 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-util" (OuterVolumeSpecName: "util") pod "a5bcb33a-118a-438a-86f5-467399e36ddb" (UID: "a5bcb33a-118a-438a-86f5-467399e36ddb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.332643 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcrnw\" (UniqueName: \"kubernetes.io/projected/a5bcb33a-118a-438a-86f5-467399e36ddb-kube-api-access-zcrnw\") on node \"crc\" DevicePath \"\"" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.332705 4885 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.332725 4885 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5bcb33a-118a-438a-86f5-467399e36ddb-util\") on node \"crc\" DevicePath \"\"" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.381124 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" path="/var/lib/kubelet/pods/fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0/volumes" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.731019 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" event={"ID":"a5bcb33a-118a-438a-86f5-467399e36ddb","Type":"ContainerDied","Data":"ef485315c5dbabdb36ca01d10a9939fc314f7802a7468b99ea64ba7870407115"} Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.731094 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef485315c5dbabdb36ca01d10a9939fc314f7802a7468b99ea64ba7870407115" Mar 08 19:47:51 crc kubenswrapper[4885]: I0308 19:47:51.731131 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.490688 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2"] Mar 08 19:47:54 crc kubenswrapper[4885]: E0308 19:47:54.491313 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="registry-server" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491333 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="registry-server" Mar 08 19:47:54 crc kubenswrapper[4885]: E0308 19:47:54.491348 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="extract-content" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491356 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="extract-content" Mar 08 19:47:54 crc kubenswrapper[4885]: E0308 19:47:54.491363 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerName="extract" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491371 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerName="extract" Mar 08 19:47:54 crc kubenswrapper[4885]: E0308 19:47:54.491381 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="extract-utilities" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491391 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="extract-utilities" Mar 08 19:47:54 crc kubenswrapper[4885]: E0308 19:47:54.491402 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerName="util" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491409 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerName="util" Mar 08 19:47:54 crc kubenswrapper[4885]: E0308 19:47:54.491425 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerName="pull" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491433 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerName="pull" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491560 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc2018b1-ba4e-4fde-90ab-f5df58a7d4d0" containerName="registry-server" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.491577 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5bcb33a-118a-438a-86f5-467399e36ddb" containerName="extract" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.492072 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.494943 4885 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-mh9vk" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.495059 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.508281 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.525948 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2"] Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.672033 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr2gb\" (UniqueName: \"kubernetes.io/projected/a393ebaa-f427-44ff-965d-c1c65ab661ba-kube-api-access-rr2gb\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qkxr2\" (UID: \"a393ebaa-f427-44ff-965d-c1c65ab661ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.672085 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a393ebaa-f427-44ff-965d-c1c65ab661ba-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qkxr2\" (UID: \"a393ebaa-f427-44ff-965d-c1c65ab661ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.773860 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr2gb\" (UniqueName: \"kubernetes.io/projected/a393ebaa-f427-44ff-965d-c1c65ab661ba-kube-api-access-rr2gb\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qkxr2\" (UID: \"a393ebaa-f427-44ff-965d-c1c65ab661ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.774019 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a393ebaa-f427-44ff-965d-c1c65ab661ba-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qkxr2\" (UID: \"a393ebaa-f427-44ff-965d-c1c65ab661ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.774535 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a393ebaa-f427-44ff-965d-c1c65ab661ba-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qkxr2\" (UID: \"a393ebaa-f427-44ff-965d-c1c65ab661ba\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.801813 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr2gb\" (UniqueName: \"kubernetes.io/projected/a393ebaa-f427-44ff-965d-c1c65ab661ba-kube-api-access-rr2gb\") pod \"cert-manager-operator-controller-manager-66c8bdd694-qkxr2\" (UID: \"a393ebaa-f427-44ff-965d-c1c65ab661ba\") " 
pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:54 crc kubenswrapper[4885]: I0308 19:47:54.818612 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" Mar 08 19:47:55 crc kubenswrapper[4885]: I0308 19:47:55.267236 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2"] Mar 08 19:47:55 crc kubenswrapper[4885]: W0308 19:47:55.277146 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda393ebaa_f427_44ff_965d_c1c65ab661ba.slice/crio-7cd56b1530ea2b03662fb4fce8d73ff574ac7f9a43e721982ac177d576a5291a WatchSource:0}: Error finding container 7cd56b1530ea2b03662fb4fce8d73ff574ac7f9a43e721982ac177d576a5291a: Status 404 returned error can't find the container with id 7cd56b1530ea2b03662fb4fce8d73ff574ac7f9a43e721982ac177d576a5291a Mar 08 19:47:55 crc kubenswrapper[4885]: I0308 19:47:55.755479 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" event={"ID":"a393ebaa-f427-44ff-965d-c1c65ab661ba","Type":"ContainerStarted","Data":"7cd56b1530ea2b03662fb4fce8d73ff574ac7f9a43e721982ac177d576a5291a"} Mar 08 19:47:58 crc kubenswrapper[4885]: I0308 19:47:58.787595 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" event={"ID":"a393ebaa-f427-44ff-965d-c1c65ab661ba","Type":"ContainerStarted","Data":"d91f8f7282045a6c1b3a077232b0d4c4e2e38f556f39912f0ba6add662d2a0ac"} Mar 08 19:47:58 crc kubenswrapper[4885]: I0308 19:47:58.821275 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-qkxr2" podStartSLOduration=1.593387292 podStartE2EDuration="4.821248827s" podCreationTimestamp="2026-03-08 19:47:54 +0000 UTC" firstStartedPulling="2026-03-08 19:47:55.280328839 +0000 UTC m=+976.676382882" lastFinishedPulling="2026-03-08 19:47:58.508190354 +0000 UTC m=+979.904244417" observedRunningTime="2026-03-08 19:47:58.812729101 +0000 UTC m=+980.208783124" watchObservedRunningTime="2026-03-08 19:47:58.821248827 +0000 UTC m=+980.217302880" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.157506 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549988-f7hdz"] Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.158191 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549988-f7hdz" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.160870 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.161603 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.166182 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549988-f7hdz"] Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.170171 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.348737 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxw5h\" (UniqueName: \"kubernetes.io/projected/a1daba97-3389-4e45-8a6c-bf910619f315-kube-api-access-rxw5h\") pod \"auto-csr-approver-29549988-f7hdz\" (UID: \"a1daba97-3389-4e45-8a6c-bf910619f315\") " pod="openshift-infra/auto-csr-approver-29549988-f7hdz" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.450021 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxw5h\" (UniqueName: \"kubernetes.io/projected/a1daba97-3389-4e45-8a6c-bf910619f315-kube-api-access-rxw5h\") pod \"auto-csr-approver-29549988-f7hdz\" (UID: \"a1daba97-3389-4e45-8a6c-bf910619f315\") " pod="openshift-infra/auto-csr-approver-29549988-f7hdz" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.474972 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxw5h\" (UniqueName: \"kubernetes.io/projected/a1daba97-3389-4e45-8a6c-bf910619f315-kube-api-access-rxw5h\") pod \"auto-csr-approver-29549988-f7hdz\" (UID: \"a1daba97-3389-4e45-8a6c-bf910619f315\") " pod="openshift-infra/auto-csr-approver-29549988-f7hdz" Mar 08 19:48:00 crc kubenswrapper[4885]: I0308 19:48:00.771658 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549988-f7hdz" Mar 08 19:48:01 crc kubenswrapper[4885]: I0308 19:48:01.084176 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549988-f7hdz"] Mar 08 19:48:01 crc kubenswrapper[4885]: I0308 19:48:01.806714 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549988-f7hdz" event={"ID":"a1daba97-3389-4e45-8a6c-bf910619f315","Type":"ContainerStarted","Data":"a2a0006e3cbad50961fd02041119490cc7d99f33f80e8b9e1e8e7b7e685822b2"} Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.406538 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-kgm5k"] Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.407577 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.409796 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.410072 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.412878 4885 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-s2vj9" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.426553 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-kgm5k"] Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.577963 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de1b5c94-7518-46c5-af4a-2b692d23b3b7-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-kgm5k\" (UID: \"de1b5c94-7518-46c5-af4a-2b692d23b3b7\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.578061 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-556fr\" (UniqueName: \"kubernetes.io/projected/de1b5c94-7518-46c5-af4a-2b692d23b3b7-kube-api-access-556fr\") pod \"cert-manager-webhook-6888856db4-kgm5k\" (UID: \"de1b5c94-7518-46c5-af4a-2b692d23b3b7\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.644497 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-fbnvg"] Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.645153 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.647911 4885 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-8nw7z" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.662189 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-fbnvg"] Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.680174 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de1b5c94-7518-46c5-af4a-2b692d23b3b7-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-kgm5k\" (UID: \"de1b5c94-7518-46c5-af4a-2b692d23b3b7\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.680251 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-556fr\" (UniqueName: \"kubernetes.io/projected/de1b5c94-7518-46c5-af4a-2b692d23b3b7-kube-api-access-556fr\") pod \"cert-manager-webhook-6888856db4-kgm5k\" (UID: \"de1b5c94-7518-46c5-af4a-2b692d23b3b7\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.700969 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-556fr\" (UniqueName: \"kubernetes.io/projected/de1b5c94-7518-46c5-af4a-2b692d23b3b7-kube-api-access-556fr\") pod \"cert-manager-webhook-6888856db4-kgm5k\" (UID: \"de1b5c94-7518-46c5-af4a-2b692d23b3b7\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.701641 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de1b5c94-7518-46c5-af4a-2b692d23b3b7-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-kgm5k\" (UID: \"de1b5c94-7518-46c5-af4a-2b692d23b3b7\") " pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.730660 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.781316 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qm7s\" (UniqueName: \"kubernetes.io/projected/d62feb91-9474-41c0-b79c-93f3f6dd830b-kube-api-access-7qm7s\") pod \"cert-manager-cainjector-5545bd876-fbnvg\" (UID: \"d62feb91-9474-41c0-b79c-93f3f6dd830b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.781583 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d62feb91-9474-41c0-b79c-93f3f6dd830b-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-fbnvg\" (UID: \"d62feb91-9474-41c0-b79c-93f3f6dd830b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.821468 4885 generic.go:334] "Generic (PLEG): container finished" podID="a1daba97-3389-4e45-8a6c-bf910619f315" containerID="027375be475663b68fa34275cf933a5f73118e3902051a04110bd2c7ec89a43e" exitCode=0 Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.821526 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549988-f7hdz" event={"ID":"a1daba97-3389-4e45-8a6c-bf910619f315","Type":"ContainerDied","Data":"027375be475663b68fa34275cf933a5f73118e3902051a04110bd2c7ec89a43e"} Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.883230 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d62feb91-9474-41c0-b79c-93f3f6dd830b-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-fbnvg\" (UID: \"d62feb91-9474-41c0-b79c-93f3f6dd830b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.883333 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qm7s\" (UniqueName: \"kubernetes.io/projected/d62feb91-9474-41c0-b79c-93f3f6dd830b-kube-api-access-7qm7s\") pod \"cert-manager-cainjector-5545bd876-fbnvg\" (UID: \"d62feb91-9474-41c0-b79c-93f3f6dd830b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.901222 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d62feb91-9474-41c0-b79c-93f3f6dd830b-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-fbnvg\" (UID: \"d62feb91-9474-41c0-b79c-93f3f6dd830b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.902175 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qm7s\" (UniqueName: \"kubernetes.io/projected/d62feb91-9474-41c0-b79c-93f3f6dd830b-kube-api-access-7qm7s\") pod \"cert-manager-cainjector-5545bd876-fbnvg\" (UID: \"d62feb91-9474-41c0-b79c-93f3f6dd830b\") " pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg" Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.947396 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-kgm5k"] Mar 08 19:48:02 crc kubenswrapper[4885]: I0308 19:48:02.957269 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg" Mar 08 19:48:03 crc kubenswrapper[4885]: I0308 19:48:03.353590 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-fbnvg"] Mar 08 19:48:03 crc kubenswrapper[4885]: W0308 19:48:03.360670 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd62feb91_9474_41c0_b79c_93f3f6dd830b.slice/crio-40cf97615bee3872071ecda7b71b8a59d28db984ed1adbfee464e1e3073f50fa WatchSource:0}: Error finding container 40cf97615bee3872071ecda7b71b8a59d28db984ed1adbfee464e1e3073f50fa: Status 404 returned error can't find the container with id 40cf97615bee3872071ecda7b71b8a59d28db984ed1adbfee464e1e3073f50fa Mar 08 19:48:03 crc kubenswrapper[4885]: I0308 19:48:03.836195 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" event={"ID":"de1b5c94-7518-46c5-af4a-2b692d23b3b7","Type":"ContainerStarted","Data":"1ee95e3c9eeb42138759c5e8f5f4e7b9f1c317fbf5599a74999a390cfa925578"} Mar 08 19:48:03 crc kubenswrapper[4885]: I0308 19:48:03.839493 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg" event={"ID":"d62feb91-9474-41c0-b79c-93f3f6dd830b","Type":"ContainerStarted","Data":"40cf97615bee3872071ecda7b71b8a59d28db984ed1adbfee464e1e3073f50fa"} Mar 08 19:48:04 crc kubenswrapper[4885]: I0308 19:48:04.143387 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549988-f7hdz" Mar 08 19:48:04 crc kubenswrapper[4885]: I0308 19:48:04.201110 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxw5h\" (UniqueName: \"kubernetes.io/projected/a1daba97-3389-4e45-8a6c-bf910619f315-kube-api-access-rxw5h\") pod \"a1daba97-3389-4e45-8a6c-bf910619f315\" (UID: \"a1daba97-3389-4e45-8a6c-bf910619f315\") " Mar 08 19:48:04 crc kubenswrapper[4885]: I0308 19:48:04.206086 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1daba97-3389-4e45-8a6c-bf910619f315-kube-api-access-rxw5h" (OuterVolumeSpecName: "kube-api-access-rxw5h") pod "a1daba97-3389-4e45-8a6c-bf910619f315" (UID: "a1daba97-3389-4e45-8a6c-bf910619f315"). InnerVolumeSpecName "kube-api-access-rxw5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:48:04 crc kubenswrapper[4885]: I0308 19:48:04.302883 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxw5h\" (UniqueName: \"kubernetes.io/projected/a1daba97-3389-4e45-8a6c-bf910619f315-kube-api-access-rxw5h\") on node \"crc\" DevicePath \"\"" Mar 08 19:48:04 crc kubenswrapper[4885]: I0308 19:48:04.848611 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549988-f7hdz" event={"ID":"a1daba97-3389-4e45-8a6c-bf910619f315","Type":"ContainerDied","Data":"a2a0006e3cbad50961fd02041119490cc7d99f33f80e8b9e1e8e7b7e685822b2"} Mar 08 19:48:04 crc kubenswrapper[4885]: I0308 19:48:04.848644 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549988-f7hdz" Mar 08 19:48:04 crc kubenswrapper[4885]: I0308 19:48:04.848650 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2a0006e3cbad50961fd02041119490cc7d99f33f80e8b9e1e8e7b7e685822b2" Mar 08 19:48:05 crc kubenswrapper[4885]: I0308 19:48:05.193352 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549982-jnpc6"] Mar 08 19:48:05 crc kubenswrapper[4885]: I0308 19:48:05.197789 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549982-jnpc6"] Mar 08 19:48:05 crc kubenswrapper[4885]: I0308 19:48:05.374246 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30a90e18-1089-40ae-a5f0-f43b1d252129" path="/var/lib/kubelet/pods/30a90e18-1089-40ae-a5f0-f43b1d252129/volumes" Mar 08 19:48:07 crc kubenswrapper[4885]: I0308 19:48:07.895466 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg" event={"ID":"d62feb91-9474-41c0-b79c-93f3f6dd830b","Type":"ContainerStarted","Data":"942328ef56ffd155036408f96b9736bc5c67207a26dc77713fffc4c57a453745"} Mar 08 19:48:07 crc kubenswrapper[4885]: I0308 19:48:07.897683 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" event={"ID":"de1b5c94-7518-46c5-af4a-2b692d23b3b7","Type":"ContainerStarted","Data":"6f96863a032c2582c3555288d36bfc786629f773c77ae05858ca77a1bbb00362"} Mar 08 19:48:07 crc kubenswrapper[4885]: I0308 19:48:07.897838 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" Mar 08 19:48:07 crc kubenswrapper[4885]: I0308 19:48:07.926180 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-fbnvg" podStartSLOduration=2.05786414 podStartE2EDuration="5.926157382s" podCreationTimestamp="2026-03-08 19:48:02 +0000 UTC" firstStartedPulling="2026-03-08 19:48:03.36206761 +0000 UTC m=+984.758121633" lastFinishedPulling="2026-03-08 19:48:07.230360862 +0000 UTC m=+988.626414875" observedRunningTime="2026-03-08 19:48:07.921000384 +0000 UTC m=+989.317054417" watchObservedRunningTime="2026-03-08 19:48:07.926157382 +0000 UTC m=+989.322211415" Mar 08 19:48:07 crc kubenswrapper[4885]: I0308 19:48:07.945329 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" podStartSLOduration=1.6945314580000002 podStartE2EDuration="5.945314341s" podCreationTimestamp="2026-03-08 19:48:02 +0000 UTC" firstStartedPulling="2026-03-08 19:48:02.953668219 +0000 UTC m=+984.349722252" lastFinishedPulling="2026-03-08 19:48:07.204451112 +0000 UTC m=+988.600505135" observedRunningTime="2026-03-08 19:48:07.942282471 +0000 UTC m=+989.338336504" watchObservedRunningTime="2026-03-08 19:48:07.945314341 +0000 UTC m=+989.341368374" Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.525524 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2v5mz"] Mar 08 19:48:10 crc kubenswrapper[4885]: E0308 19:48:10.526185 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1daba97-3389-4e45-8a6c-bf910619f315" containerName="oc" Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.526206 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1daba97-3389-4e45-8a6c-bf910619f315" 
containerName="oc" Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.526396 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1daba97-3389-4e45-8a6c-bf910619f315" containerName="oc" Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.527804 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.561503 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2v5mz"] Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.689131 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlhwm\" (UniqueName: \"kubernetes.io/projected/a3dca148-e7cb-4675-918c-e773447bdcf4-kube-api-access-dlhwm\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.689208 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-catalog-content\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.689368 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-utilities\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.790688 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-utilities\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.791070 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlhwm\" (UniqueName: \"kubernetes.io/projected/a3dca148-e7cb-4675-918c-e773447bdcf4-kube-api-access-dlhwm\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.791232 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-catalog-content\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.791682 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-utilities\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.791936 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-catalog-content\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.817456 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlhwm\" (UniqueName: \"kubernetes.io/projected/a3dca148-e7cb-4675-918c-e773447bdcf4-kube-api-access-dlhwm\") pod \"community-operators-2v5mz\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:10 crc kubenswrapper[4885]: I0308 19:48:10.906624 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:11 crc kubenswrapper[4885]: I0308 19:48:11.347781 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2v5mz"] Mar 08 19:48:11 crc kubenswrapper[4885]: I0308 19:48:11.936466 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v5mz" event={"ID":"a3dca148-e7cb-4675-918c-e773447bdcf4","Type":"ContainerStarted","Data":"0a73aa511408aa95a3e655b73c27028296781e4764c3f2dfa59ddabd59e8bddb"} Mar 08 19:48:12 crc kubenswrapper[4885]: I0308 19:48:12.735698 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-kgm5k" Mar 08 19:48:12 crc kubenswrapper[4885]: I0308 19:48:12.947079 4885 generic.go:334] "Generic (PLEG): container finished" podID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerID="465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2" exitCode=0 Mar 08 19:48:12 crc kubenswrapper[4885]: I0308 19:48:12.947141 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v5mz" event={"ID":"a3dca148-e7cb-4675-918c-e773447bdcf4","Type":"ContainerDied","Data":"465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2"} Mar 08 19:48:13 crc kubenswrapper[4885]: I0308 19:48:13.957115 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v5mz" event={"ID":"a3dca148-e7cb-4675-918c-e773447bdcf4","Type":"ContainerStarted","Data":"3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3"} Mar 08 19:48:14 crc kubenswrapper[4885]: I0308 19:48:14.964392 4885 generic.go:334] "Generic (PLEG): container finished" podID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerID="3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3" exitCode=0 Mar 08 19:48:14 crc kubenswrapper[4885]: I0308 19:48:14.964439 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v5mz" event={"ID":"a3dca148-e7cb-4675-918c-e773447bdcf4","Type":"ContainerDied","Data":"3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3"} Mar 08 19:48:15 crc kubenswrapper[4885]: I0308 19:48:15.979132 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v5mz" event={"ID":"a3dca148-e7cb-4675-918c-e773447bdcf4","Type":"ContainerStarted","Data":"919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99"} Mar 08 19:48:16 crc kubenswrapper[4885]: I0308 19:48:16.005178 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2v5mz" podStartSLOduration=3.370326781 
podStartE2EDuration="6.005158062s" podCreationTimestamp="2026-03-08 19:48:10 +0000 UTC" firstStartedPulling="2026-03-08 19:48:12.950238879 +0000 UTC m=+994.346292932" lastFinishedPulling="2026-03-08 19:48:15.58507018 +0000 UTC m=+996.981124213" observedRunningTime="2026-03-08 19:48:16.001981346 +0000 UTC m=+997.398035399" watchObservedRunningTime="2026-03-08 19:48:16.005158062 +0000 UTC m=+997.401212095" Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.009735 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-8wbq2"] Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.011749 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-8wbq2" Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.014826 4885 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-r5mnt" Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.027690 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-8wbq2"] Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.075120 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zghcz\" (UniqueName: \"kubernetes.io/projected/6da97aa0-4c69-414f-8fda-23403d2346e5-kube-api-access-zghcz\") pod \"cert-manager-545d4d4674-8wbq2\" (UID: \"6da97aa0-4c69-414f-8fda-23403d2346e5\") " pod="cert-manager/cert-manager-545d4d4674-8wbq2" Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.075457 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6da97aa0-4c69-414f-8fda-23403d2346e5-bound-sa-token\") pod \"cert-manager-545d4d4674-8wbq2\" (UID: \"6da97aa0-4c69-414f-8fda-23403d2346e5\") " pod="cert-manager/cert-manager-545d4d4674-8wbq2" Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.177592 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zghcz\" (UniqueName: \"kubernetes.io/projected/6da97aa0-4c69-414f-8fda-23403d2346e5-kube-api-access-zghcz\") pod \"cert-manager-545d4d4674-8wbq2\" (UID: \"6da97aa0-4c69-414f-8fda-23403d2346e5\") " pod="cert-manager/cert-manager-545d4d4674-8wbq2" Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.177991 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6da97aa0-4c69-414f-8fda-23403d2346e5-bound-sa-token\") pod \"cert-manager-545d4d4674-8wbq2\" (UID: \"6da97aa0-4c69-414f-8fda-23403d2346e5\") " pod="cert-manager/cert-manager-545d4d4674-8wbq2" Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.200506 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6da97aa0-4c69-414f-8fda-23403d2346e5-bound-sa-token\") pod \"cert-manager-545d4d4674-8wbq2\" (UID: \"6da97aa0-4c69-414f-8fda-23403d2346e5\") " pod="cert-manager/cert-manager-545d4d4674-8wbq2" Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.206785 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zghcz\" (UniqueName: \"kubernetes.io/projected/6da97aa0-4c69-414f-8fda-23403d2346e5-kube-api-access-zghcz\") pod \"cert-manager-545d4d4674-8wbq2\" (UID: \"6da97aa0-4c69-414f-8fda-23403d2346e5\") " pod="cert-manager/cert-manager-545d4d4674-8wbq2" Mar 08 19:48:19 crc 
kubenswrapper[4885]: I0308 19:48:19.341302 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-8wbq2" Mar 08 19:48:19 crc kubenswrapper[4885]: I0308 19:48:19.860679 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-8wbq2"] Mar 08 19:48:20 crc kubenswrapper[4885]: I0308 19:48:20.024489 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-8wbq2" event={"ID":"6da97aa0-4c69-414f-8fda-23403d2346e5","Type":"ContainerStarted","Data":"6fbea6e70aed64f786da673b7e230c3abec1c21d00085441fc32925adc52e19d"} Mar 08 19:48:20 crc kubenswrapper[4885]: I0308 19:48:20.907424 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:20 crc kubenswrapper[4885]: I0308 19:48:20.907729 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:20 crc kubenswrapper[4885]: I0308 19:48:20.978814 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:21 crc kubenswrapper[4885]: I0308 19:48:21.035413 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-8wbq2" event={"ID":"6da97aa0-4c69-414f-8fda-23403d2346e5","Type":"ContainerStarted","Data":"38e60205adb96f6813b47ff0dfa9d44933c7a553a8ea2f867b06bee478816eda"} Mar 08 19:48:21 crc kubenswrapper[4885]: I0308 19:48:21.055115 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-8wbq2" podStartSLOduration=3.055095735 podStartE2EDuration="3.055095735s" podCreationTimestamp="2026-03-08 19:48:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:48:21.054362756 +0000 UTC m=+1002.450416819" watchObservedRunningTime="2026-03-08 19:48:21.055095735 +0000 UTC m=+1002.451149768" Mar 08 19:48:21 crc kubenswrapper[4885]: I0308 19:48:21.100250 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:21 crc kubenswrapper[4885]: I0308 19:48:21.225977 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2v5mz"] Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.049441 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2v5mz" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="registry-server" containerID="cri-o://919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99" gracePeriod=2 Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.478445 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.560688 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlhwm\" (UniqueName: \"kubernetes.io/projected/a3dca148-e7cb-4675-918c-e773447bdcf4-kube-api-access-dlhwm\") pod \"a3dca148-e7cb-4675-918c-e773447bdcf4\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.560837 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-utilities\") pod \"a3dca148-e7cb-4675-918c-e773447bdcf4\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.561061 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-catalog-content\") pod \"a3dca148-e7cb-4675-918c-e773447bdcf4\" (UID: \"a3dca148-e7cb-4675-918c-e773447bdcf4\") " Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.562009 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-utilities" (OuterVolumeSpecName: "utilities") pod "a3dca148-e7cb-4675-918c-e773447bdcf4" (UID: "a3dca148-e7cb-4675-918c-e773447bdcf4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.570227 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3dca148-e7cb-4675-918c-e773447bdcf4-kube-api-access-dlhwm" (OuterVolumeSpecName: "kube-api-access-dlhwm") pod "a3dca148-e7cb-4675-918c-e773447bdcf4" (UID: "a3dca148-e7cb-4675-918c-e773447bdcf4"). InnerVolumeSpecName "kube-api-access-dlhwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.635846 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3dca148-e7cb-4675-918c-e773447bdcf4" (UID: "a3dca148-e7cb-4675-918c-e773447bdcf4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.663049 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.663121 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlhwm\" (UniqueName: \"kubernetes.io/projected/a3dca148-e7cb-4675-918c-e773447bdcf4-kube-api-access-dlhwm\") on node \"crc\" DevicePath \"\"" Mar 08 19:48:23 crc kubenswrapper[4885]: I0308 19:48:23.663154 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3dca148-e7cb-4675-918c-e773447bdcf4-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.062252 4885 generic.go:334] "Generic (PLEG): container finished" podID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerID="919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99" exitCode=0 Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.062315 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v5mz" event={"ID":"a3dca148-e7cb-4675-918c-e773447bdcf4","Type":"ContainerDied","Data":"919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99"} Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.062352 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2v5mz" event={"ID":"a3dca148-e7cb-4675-918c-e773447bdcf4","Type":"ContainerDied","Data":"0a73aa511408aa95a3e655b73c27028296781e4764c3f2dfa59ddabd59e8bddb"} Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.062389 4885 scope.go:117] "RemoveContainer" containerID="919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99" Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.062548 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2v5mz" Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.091991 4885 scope.go:117] "RemoveContainer" containerID="3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3" Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.116049 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2v5mz"] Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.122321 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2v5mz"] Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.136593 4885 scope.go:117] "RemoveContainer" containerID="465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2" Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.161246 4885 scope.go:117] "RemoveContainer" containerID="919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99" Mar 08 19:48:24 crc kubenswrapper[4885]: E0308 19:48:24.161760 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99\": container with ID starting with 919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99 not found: ID does not exist" containerID="919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99" Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.161809 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99"} err="failed to get container status \"919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99\": rpc error: code = NotFound desc = could not find container \"919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99\": container with ID starting with 919ad4914fb8edbdae047d6693f44b769548abe4d24674872a4b801f6746dd99 not found: ID does not exist" Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.161844 4885 scope.go:117] "RemoveContainer" containerID="3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3" Mar 08 19:48:24 crc kubenswrapper[4885]: E0308 19:48:24.162434 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3\": container with ID starting with 3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3 not found: ID does not exist" containerID="3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3" Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.162507 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3"} err="failed to get container status \"3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3\": rpc error: code = NotFound desc = could not find container \"3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3\": container with ID starting with 3cfd49cc27d376eee5de1da79ca35c169a11090348bed35e46a5423563dc8cb3 not found: ID does not exist" Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.162554 4885 scope.go:117] "RemoveContainer" containerID="465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2" Mar 08 19:48:24 crc kubenswrapper[4885]: E0308 19:48:24.163025 4885 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2\": container with ID starting with 465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2 not found: ID does not exist" containerID="465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2" Mar 08 19:48:24 crc kubenswrapper[4885]: I0308 19:48:24.163071 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2"} err="failed to get container status \"465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2\": rpc error: code = NotFound desc = could not find container \"465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2\": container with ID starting with 465e9cefdcd0c596bc00bca3099c59b2b881bc6bf60e9dd1b18be2677e84b5a2 not found: ID does not exist" Mar 08 19:48:25 crc kubenswrapper[4885]: I0308 19:48:25.380983 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" path="/var/lib/kubelet/pods/a3dca148-e7cb-4675-918c-e773447bdcf4/volumes" Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.749446 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-5267w"] Mar 08 19:48:26 crc kubenswrapper[4885]: E0308 19:48:26.749778 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="registry-server" Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.749800 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="registry-server" Mar 08 19:48:26 crc kubenswrapper[4885]: E0308 19:48:26.749821 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="extract-utilities" Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.749833 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="extract-utilities" Mar 08 19:48:26 crc kubenswrapper[4885]: E0308 19:48:26.749860 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="extract-content" Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.749871 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="extract-content" Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.750084 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3dca148-e7cb-4675-918c-e773447bdcf4" containerName="registry-server" Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.750656 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5267w" Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.753129 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-trhqw" Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.753190 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.753240 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.769869 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5267w"] Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.808256 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4htp\" (UniqueName: \"kubernetes.io/projected/fa9e3b2e-4be2-420d-a812-37c7eddfcb1f-kube-api-access-l4htp\") pod \"openstack-operator-index-5267w\" (UID: \"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f\") " pod="openstack-operators/openstack-operator-index-5267w" Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.909328 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4htp\" (UniqueName: \"kubernetes.io/projected/fa9e3b2e-4be2-420d-a812-37c7eddfcb1f-kube-api-access-l4htp\") pod \"openstack-operator-index-5267w\" (UID: \"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f\") " pod="openstack-operators/openstack-operator-index-5267w" Mar 08 19:48:26 crc kubenswrapper[4885]: I0308 19:48:26.933370 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4htp\" (UniqueName: \"kubernetes.io/projected/fa9e3b2e-4be2-420d-a812-37c7eddfcb1f-kube-api-access-l4htp\") pod \"openstack-operator-index-5267w\" (UID: \"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f\") " pod="openstack-operators/openstack-operator-index-5267w" Mar 08 19:48:27 crc kubenswrapper[4885]: I0308 19:48:27.069025 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5267w" Mar 08 19:48:27 crc kubenswrapper[4885]: I0308 19:48:27.291213 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-5267w"] Mar 08 19:48:28 crc kubenswrapper[4885]: I0308 19:48:28.100941 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5267w" event={"ID":"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f","Type":"ContainerStarted","Data":"3430220374f71b013dec40d23d384e699c7dd793743ffe3164939afc082b9453"} Mar 08 19:48:29 crc kubenswrapper[4885]: I0308 19:48:29.111766 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5267w" event={"ID":"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f","Type":"ContainerStarted","Data":"1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802"} Mar 08 19:48:29 crc kubenswrapper[4885]: I0308 19:48:29.141722 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-5267w" podStartSLOduration=2.231652542 podStartE2EDuration="3.141692345s" podCreationTimestamp="2026-03-08 19:48:26 +0000 UTC" firstStartedPulling="2026-03-08 19:48:27.301617268 +0000 UTC m=+1008.697671301" lastFinishedPulling="2026-03-08 19:48:28.211657041 +0000 UTC m=+1009.607711104" observedRunningTime="2026-03-08 19:48:29.132396877 +0000 UTC m=+1010.528450930" watchObservedRunningTime="2026-03-08 19:48:29.141692345 +0000 UTC m=+1010.537746398" Mar 08 19:48:30 crc kubenswrapper[4885]: I0308 19:48:30.829780 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5267w"] Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.127650 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-5267w" podUID="fa9e3b2e-4be2-420d-a812-37c7eddfcb1f" containerName="registry-server" containerID="cri-o://1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802" gracePeriod=2 Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.444028 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-w4b99"] Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.445190 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.483587 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-w4b99"] Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.592552 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gcc5\" (UniqueName: \"kubernetes.io/projected/024a1da8-dfa6-4cdc-a5ec-12b9ce56969a-kube-api-access-8gcc5\") pod \"openstack-operator-index-w4b99\" (UID: \"024a1da8-dfa6-4cdc-a5ec-12b9ce56969a\") " pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.593811 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5267w" Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.694087 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4htp\" (UniqueName: \"kubernetes.io/projected/fa9e3b2e-4be2-420d-a812-37c7eddfcb1f-kube-api-access-l4htp\") pod \"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f\" (UID: \"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f\") " Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.694371 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gcc5\" (UniqueName: \"kubernetes.io/projected/024a1da8-dfa6-4cdc-a5ec-12b9ce56969a-kube-api-access-8gcc5\") pod \"openstack-operator-index-w4b99\" (UID: \"024a1da8-dfa6-4cdc-a5ec-12b9ce56969a\") " pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.703366 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9e3b2e-4be2-420d-a812-37c7eddfcb1f-kube-api-access-l4htp" (OuterVolumeSpecName: "kube-api-access-l4htp") pod "fa9e3b2e-4be2-420d-a812-37c7eddfcb1f" (UID: "fa9e3b2e-4be2-420d-a812-37c7eddfcb1f"). InnerVolumeSpecName "kube-api-access-l4htp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.732672 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gcc5\" (UniqueName: \"kubernetes.io/projected/024a1da8-dfa6-4cdc-a5ec-12b9ce56969a-kube-api-access-8gcc5\") pod \"openstack-operator-index-w4b99\" (UID: \"024a1da8-dfa6-4cdc-a5ec-12b9ce56969a\") " pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.795016 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:31 crc kubenswrapper[4885]: I0308 19:48:31.795424 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4htp\" (UniqueName: \"kubernetes.io/projected/fa9e3b2e-4be2-420d-a812-37c7eddfcb1f-kube-api-access-l4htp\") on node \"crc\" DevicePath \"\"" Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.139238 4885 generic.go:334] "Generic (PLEG): container finished" podID="fa9e3b2e-4be2-420d-a812-37c7eddfcb1f" containerID="1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802" exitCode=0 Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.139348 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-5267w" Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.139382 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5267w" event={"ID":"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f","Type":"ContainerDied","Data":"1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802"} Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.139669 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-5267w" event={"ID":"fa9e3b2e-4be2-420d-a812-37c7eddfcb1f","Type":"ContainerDied","Data":"3430220374f71b013dec40d23d384e699c7dd793743ffe3164939afc082b9453"} Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.139705 4885 scope.go:117] "RemoveContainer" containerID="1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802" Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.178309 4885 scope.go:117] "RemoveContainer" containerID="1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802" Mar 08 19:48:32 crc kubenswrapper[4885]: E0308 19:48:32.178809 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802\": container with ID starting with 1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802 not found: ID does not exist" containerID="1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802" Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.178863 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802"} err="failed to get container status \"1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802\": rpc error: code = NotFound desc = could not find container \"1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802\": container with ID starting with 1e449330acb23a47bec9a9abdd78abedf2960d76a72b1f0beaf511168b539802 not found: ID does not exist" Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.179855 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-5267w"] Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.184977 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-5267w"] Mar 08 19:48:32 crc kubenswrapper[4885]: I0308 19:48:32.232036 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-w4b99"] Mar 08 19:48:32 crc kubenswrapper[4885]: W0308 19:48:32.234518 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod024a1da8_dfa6_4cdc_a5ec_12b9ce56969a.slice/crio-ce9807f0dd2e63a708f2672e011177f85e964d2ce50f73a3ab3ca4b8400a2f08 WatchSource:0}: Error finding container ce9807f0dd2e63a708f2672e011177f85e964d2ce50f73a3ab3ca4b8400a2f08: Status 404 returned error can't find the container with id ce9807f0dd2e63a708f2672e011177f85e964d2ce50f73a3ab3ca4b8400a2f08 Mar 08 19:48:33 crc kubenswrapper[4885]: I0308 19:48:33.151116 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w4b99" event={"ID":"024a1da8-dfa6-4cdc-a5ec-12b9ce56969a","Type":"ContainerStarted","Data":"0d115ef311167762f2e12c7320218c342afcea251defcdda70b43918aff8c331"} Mar 08 
19:48:33 crc kubenswrapper[4885]: I0308 19:48:33.151428 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-w4b99" event={"ID":"024a1da8-dfa6-4cdc-a5ec-12b9ce56969a","Type":"ContainerStarted","Data":"ce9807f0dd2e63a708f2672e011177f85e964d2ce50f73a3ab3ca4b8400a2f08"} Mar 08 19:48:33 crc kubenswrapper[4885]: I0308 19:48:33.178899 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-w4b99" podStartSLOduration=1.7407426209999999 podStartE2EDuration="2.178871302s" podCreationTimestamp="2026-03-08 19:48:31 +0000 UTC" firstStartedPulling="2026-03-08 19:48:32.240033894 +0000 UTC m=+1013.636087927" lastFinishedPulling="2026-03-08 19:48:32.678162545 +0000 UTC m=+1014.074216608" observedRunningTime="2026-03-08 19:48:33.170058818 +0000 UTC m=+1014.566112881" watchObservedRunningTime="2026-03-08 19:48:33.178871302 +0000 UTC m=+1014.574925365" Mar 08 19:48:33 crc kubenswrapper[4885]: I0308 19:48:33.381541 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9e3b2e-4be2-420d-a812-37c7eddfcb1f" path="/var/lib/kubelet/pods/fa9e3b2e-4be2-420d-a812-37c7eddfcb1f/volumes" Mar 08 19:48:41 crc kubenswrapper[4885]: I0308 19:48:41.796205 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:41 crc kubenswrapper[4885]: I0308 19:48:41.796886 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:41 crc kubenswrapper[4885]: I0308 19:48:41.833355 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:42 crc kubenswrapper[4885]: I0308 19:48:42.276062 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-w4b99" Mar 08 19:48:45 crc kubenswrapper[4885]: I0308 19:48:45.924261 4885 scope.go:117] "RemoveContainer" containerID="c050624aad83fb4c435f0fa087d6fe3ddc6c1b029b5c8f9e354ed5228ef2d3fa" Mar 08 19:48:48 crc kubenswrapper[4885]: I0308 19:48:48.988573 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb"] Mar 08 19:48:48 crc kubenswrapper[4885]: E0308 19:48:48.989557 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9e3b2e-4be2-420d-a812-37c7eddfcb1f" containerName="registry-server" Mar 08 19:48:48 crc kubenswrapper[4885]: I0308 19:48:48.989581 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9e3b2e-4be2-420d-a812-37c7eddfcb1f" containerName="registry-server" Mar 08 19:48:48 crc kubenswrapper[4885]: I0308 19:48:48.989754 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9e3b2e-4be2-420d-a812-37c7eddfcb1f" containerName="registry-server" Mar 08 19:48:48 crc kubenswrapper[4885]: I0308 19:48:48.991018 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:48 crc kubenswrapper[4885]: I0308 19:48:48.993340 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-wvm7p" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.016612 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb"] Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.095634 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.095913 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtqnh\" (UniqueName: \"kubernetes.io/projected/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-kube-api-access-jtqnh\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.096061 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.197690 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.197872 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtqnh\" (UniqueName: \"kubernetes.io/projected/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-kube-api-access-jtqnh\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.197951 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.198473 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.198590 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.236434 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtqnh\" (UniqueName: \"kubernetes.io/projected/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-kube-api-access-jtqnh\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.307462 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:49 crc kubenswrapper[4885]: I0308 19:48:49.760942 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb"] Mar 08 19:48:50 crc kubenswrapper[4885]: I0308 19:48:50.296278 4885 generic.go:334] "Generic (PLEG): container finished" podID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerID="e6ee16a25650b52efd83ec25e72bba06d68b2f1bd78837a8a0af9244697e46f5" exitCode=0 Mar 08 19:48:50 crc kubenswrapper[4885]: I0308 19:48:50.296345 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" event={"ID":"5fb2cd81-437a-46be-93b5-b96ec94b1d1c","Type":"ContainerDied","Data":"e6ee16a25650b52efd83ec25e72bba06d68b2f1bd78837a8a0af9244697e46f5"} Mar 08 19:48:50 crc kubenswrapper[4885]: I0308 19:48:50.296690 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" event={"ID":"5fb2cd81-437a-46be-93b5-b96ec94b1d1c","Type":"ContainerStarted","Data":"76747c71d15f5f8e83176ea2ced98abd16cd10232d0e50cf6db9a200b49a6fc1"} Mar 08 19:48:52 crc kubenswrapper[4885]: I0308 19:48:52.323822 4885 generic.go:334] "Generic (PLEG): container finished" podID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerID="722a49d08a61a9f585e665550df0274c058721ad95c30abe5744d3106e0a637f" exitCode=0 Mar 08 19:48:52 crc kubenswrapper[4885]: I0308 19:48:52.323908 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" event={"ID":"5fb2cd81-437a-46be-93b5-b96ec94b1d1c","Type":"ContainerDied","Data":"722a49d08a61a9f585e665550df0274c058721ad95c30abe5744d3106e0a637f"} Mar 08 19:48:53 crc kubenswrapper[4885]: I0308 19:48:53.336811 4885 generic.go:334] "Generic (PLEG): container finished" podID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerID="6f1f51906cfdbcd3ba2235908cd64569180d6b7ce50833fc3b3b9f7d32adb160" exitCode=0 Mar 08 19:48:53 crc kubenswrapper[4885]: I0308 19:48:53.336999 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" event={"ID":"5fb2cd81-437a-46be-93b5-b96ec94b1d1c","Type":"ContainerDied","Data":"6f1f51906cfdbcd3ba2235908cd64569180d6b7ce50833fc3b3b9f7d32adb160"} Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.683989 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.798435 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-util\") pod \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.798550 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtqnh\" (UniqueName: \"kubernetes.io/projected/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-kube-api-access-jtqnh\") pod \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.798709 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-bundle\") pod \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\" (UID: \"5fb2cd81-437a-46be-93b5-b96ec94b1d1c\") " Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.799722 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-bundle" (OuterVolumeSpecName: "bundle") pod "5fb2cd81-437a-46be-93b5-b96ec94b1d1c" (UID: "5fb2cd81-437a-46be-93b5-b96ec94b1d1c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.806349 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-kube-api-access-jtqnh" (OuterVolumeSpecName: "kube-api-access-jtqnh") pod "5fb2cd81-437a-46be-93b5-b96ec94b1d1c" (UID: "5fb2cd81-437a-46be-93b5-b96ec94b1d1c"). InnerVolumeSpecName "kube-api-access-jtqnh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.831062 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-util" (OuterVolumeSpecName: "util") pod "5fb2cd81-437a-46be-93b5-b96ec94b1d1c" (UID: "5fb2cd81-437a-46be-93b5-b96ec94b1d1c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.900828 4885 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.900881 4885 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-util\") on node \"crc\" DevicePath \"\"" Mar 08 19:48:54 crc kubenswrapper[4885]: I0308 19:48:54.900900 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtqnh\" (UniqueName: \"kubernetes.io/projected/5fb2cd81-437a-46be-93b5-b96ec94b1d1c-kube-api-access-jtqnh\") on node \"crc\" DevicePath \"\"" Mar 08 19:48:55 crc kubenswrapper[4885]: I0308 19:48:55.357843 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" event={"ID":"5fb2cd81-437a-46be-93b5-b96ec94b1d1c","Type":"ContainerDied","Data":"76747c71d15f5f8e83176ea2ced98abd16cd10232d0e50cf6db9a200b49a6fc1"} Mar 08 19:48:55 crc kubenswrapper[4885]: I0308 19:48:55.357885 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76747c71d15f5f8e83176ea2ced98abd16cd10232d0e50cf6db9a200b49a6fc1" Mar 08 19:48:55 crc kubenswrapper[4885]: I0308 19:48:55.357953 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.947399 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b8gsx"] Mar 08 19:48:58 crc kubenswrapper[4885]: E0308 19:48:58.948084 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerName="util" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.948106 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerName="util" Mar 08 19:48:58 crc kubenswrapper[4885]: E0308 19:48:58.948126 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerName="pull" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.948138 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerName="pull" Mar 08 19:48:58 crc kubenswrapper[4885]: E0308 19:48:58.948160 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerName="extract" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.948173 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerName="extract" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.948379 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fb2cd81-437a-46be-93b5-b96ec94b1d1c" containerName="extract" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.949852 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.963872 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-utilities\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.964195 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vggb\" (UniqueName: \"kubernetes.io/projected/949c8a20-4064-489d-b823-8eb76415df83-kube-api-access-7vggb\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.964258 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-catalog-content\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:58 crc kubenswrapper[4885]: I0308 19:48:58.972474 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b8gsx"] Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.065870 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vggb\" (UniqueName: \"kubernetes.io/projected/949c8a20-4064-489d-b823-8eb76415df83-kube-api-access-7vggb\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.066203 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-catalog-content\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.066401 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-utilities\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.066658 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-catalog-content\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.067006 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-utilities\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.088582 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7vggb\" (UniqueName: \"kubernetes.io/projected/949c8a20-4064-489d-b823-8eb76415df83-kube-api-access-7vggb\") pod \"certified-operators-b8gsx\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.284632 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:48:59 crc kubenswrapper[4885]: I0308 19:48:59.559605 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b8gsx"] Mar 08 19:49:00 crc kubenswrapper[4885]: I0308 19:49:00.395297 4885 generic.go:334] "Generic (PLEG): container finished" podID="949c8a20-4064-489d-b823-8eb76415df83" containerID="c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037" exitCode=0 Mar 08 19:49:00 crc kubenswrapper[4885]: I0308 19:49:00.395526 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8gsx" event={"ID":"949c8a20-4064-489d-b823-8eb76415df83","Type":"ContainerDied","Data":"c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037"} Mar 08 19:49:00 crc kubenswrapper[4885]: I0308 19:49:00.395548 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8gsx" event={"ID":"949c8a20-4064-489d-b823-8eb76415df83","Type":"ContainerStarted","Data":"dc27c8a1889ec091ab9edee5fb97f87ce1898f17ea1459b60bfccc24e32126ec"} Mar 08 19:49:01 crc kubenswrapper[4885]: I0308 19:49:01.414091 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8gsx" event={"ID":"949c8a20-4064-489d-b823-8eb76415df83","Type":"ContainerStarted","Data":"7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832"} Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.003698 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4"] Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.004471 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.006070 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-s4sdf" Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.028890 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4"] Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.125856 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n74vq\" (UniqueName: \"kubernetes.io/projected/9acb4d66-3a49-42b7-bd78-4d904f080c50-kube-api-access-n74vq\") pod \"openstack-operator-controller-init-6f44f7b99f-l5vj4\" (UID: \"9acb4d66-3a49-42b7-bd78-4d904f080c50\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.227547 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n74vq\" (UniqueName: \"kubernetes.io/projected/9acb4d66-3a49-42b7-bd78-4d904f080c50-kube-api-access-n74vq\") pod \"openstack-operator-controller-init-6f44f7b99f-l5vj4\" (UID: \"9acb4d66-3a49-42b7-bd78-4d904f080c50\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.247771 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n74vq\" (UniqueName: \"kubernetes.io/projected/9acb4d66-3a49-42b7-bd78-4d904f080c50-kube-api-access-n74vq\") pod \"openstack-operator-controller-init-6f44f7b99f-l5vj4\" (UID: \"9acb4d66-3a49-42b7-bd78-4d904f080c50\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.320549 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.426475 4885 generic.go:334] "Generic (PLEG): container finished" podID="949c8a20-4064-489d-b823-8eb76415df83" containerID="7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832" exitCode=0 Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.426520 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8gsx" event={"ID":"949c8a20-4064-489d-b823-8eb76415df83","Type":"ContainerDied","Data":"7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832"} Mar 08 19:49:02 crc kubenswrapper[4885]: I0308 19:49:02.621146 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4"] Mar 08 19:49:02 crc kubenswrapper[4885]: W0308 19:49:02.627109 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9acb4d66_3a49_42b7_bd78_4d904f080c50.slice/crio-f9b21a88d5091abe7f044beb42c1a1ae8b4392eace7f8be9ed365ed1da32a218 WatchSource:0}: Error finding container f9b21a88d5091abe7f044beb42c1a1ae8b4392eace7f8be9ed365ed1da32a218: Status 404 returned error can't find the container with id f9b21a88d5091abe7f044beb42c1a1ae8b4392eace7f8be9ed365ed1da32a218 Mar 08 19:49:03 crc kubenswrapper[4885]: I0308 19:49:03.442814 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8gsx" event={"ID":"949c8a20-4064-489d-b823-8eb76415df83","Type":"ContainerStarted","Data":"531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2"} Mar 08 19:49:03 crc kubenswrapper[4885]: I0308 19:49:03.444220 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" event={"ID":"9acb4d66-3a49-42b7-bd78-4d904f080c50","Type":"ContainerStarted","Data":"f9b21a88d5091abe7f044beb42c1a1ae8b4392eace7f8be9ed365ed1da32a218"} Mar 08 19:49:03 crc kubenswrapper[4885]: I0308 19:49:03.465751 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b8gsx" podStartSLOduration=3.028008525 podStartE2EDuration="5.465730899s" podCreationTimestamp="2026-03-08 19:48:58 +0000 UTC" firstStartedPulling="2026-03-08 19:49:00.396842375 +0000 UTC m=+1041.792896398" lastFinishedPulling="2026-03-08 19:49:02.834564749 +0000 UTC m=+1044.230618772" observedRunningTime="2026-03-08 19:49:03.461145506 +0000 UTC m=+1044.857199549" watchObservedRunningTime="2026-03-08 19:49:03.465730899 +0000 UTC m=+1044.861784932" Mar 08 19:49:07 crc kubenswrapper[4885]: I0308 19:49:07.475311 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" event={"ID":"9acb4d66-3a49-42b7-bd78-4d904f080c50","Type":"ContainerStarted","Data":"9d4a02bf406ef164aebb4bd1746fa43fc8883eecb3eff198f8b96f01188e9923"} Mar 08 19:49:07 crc kubenswrapper[4885]: I0308 19:49:07.476126 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" Mar 08 19:49:07 crc kubenswrapper[4885]: I0308 19:49:07.542819 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" podStartSLOduration=2.048639298 
podStartE2EDuration="6.542798848s" podCreationTimestamp="2026-03-08 19:49:01 +0000 UTC" firstStartedPulling="2026-03-08 19:49:02.628942216 +0000 UTC m=+1044.024996239" lastFinishedPulling="2026-03-08 19:49:07.123101746 +0000 UTC m=+1048.519155789" observedRunningTime="2026-03-08 19:49:07.53991634 +0000 UTC m=+1048.935970363" watchObservedRunningTime="2026-03-08 19:49:07.542798848 +0000 UTC m=+1048.938852881" Mar 08 19:49:09 crc kubenswrapper[4885]: I0308 19:49:09.285639 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:49:09 crc kubenswrapper[4885]: I0308 19:49:09.285713 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:49:09 crc kubenswrapper[4885]: I0308 19:49:09.381346 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:49:09 crc kubenswrapper[4885]: I0308 19:49:09.557155 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:49:10 crc kubenswrapper[4885]: I0308 19:49:10.314203 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b8gsx"] Mar 08 19:49:11 crc kubenswrapper[4885]: I0308 19:49:11.512700 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b8gsx" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="registry-server" containerID="cri-o://531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2" gracePeriod=2 Mar 08 19:49:11 crc kubenswrapper[4885]: I0308 19:49:11.926858 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:49:11 crc kubenswrapper[4885]: I0308 19:49:11.965778 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-utilities\") pod \"949c8a20-4064-489d-b823-8eb76415df83\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " Mar 08 19:49:11 crc kubenswrapper[4885]: I0308 19:49:11.965857 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vggb\" (UniqueName: \"kubernetes.io/projected/949c8a20-4064-489d-b823-8eb76415df83-kube-api-access-7vggb\") pod \"949c8a20-4064-489d-b823-8eb76415df83\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " Mar 08 19:49:11 crc kubenswrapper[4885]: I0308 19:49:11.965906 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-catalog-content\") pod \"949c8a20-4064-489d-b823-8eb76415df83\" (UID: \"949c8a20-4064-489d-b823-8eb76415df83\") " Mar 08 19:49:11 crc kubenswrapper[4885]: I0308 19:49:11.967174 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-utilities" (OuterVolumeSpecName: "utilities") pod "949c8a20-4064-489d-b823-8eb76415df83" (UID: "949c8a20-4064-489d-b823-8eb76415df83"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:49:11 crc kubenswrapper[4885]: I0308 19:49:11.974864 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949c8a20-4064-489d-b823-8eb76415df83-kube-api-access-7vggb" (OuterVolumeSpecName: "kube-api-access-7vggb") pod "949c8a20-4064-489d-b823-8eb76415df83" (UID: "949c8a20-4064-489d-b823-8eb76415df83"). InnerVolumeSpecName "kube-api-access-7vggb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.067716 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.067757 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vggb\" (UniqueName: \"kubernetes.io/projected/949c8a20-4064-489d-b823-8eb76415df83-kube-api-access-7vggb\") on node \"crc\" DevicePath \"\"" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.224996 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "949c8a20-4064-489d-b823-8eb76415df83" (UID: "949c8a20-4064-489d-b823-8eb76415df83"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.269777 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/949c8a20-4064-489d-b823-8eb76415df83-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.324130 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-l5vj4" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.522641 4885 generic.go:334] "Generic (PLEG): container finished" podID="949c8a20-4064-489d-b823-8eb76415df83" containerID="531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2" exitCode=0 Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.522706 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8gsx" event={"ID":"949c8a20-4064-489d-b823-8eb76415df83","Type":"ContainerDied","Data":"531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2"} Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.522748 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b8gsx" event={"ID":"949c8a20-4064-489d-b823-8eb76415df83","Type":"ContainerDied","Data":"dc27c8a1889ec091ab9edee5fb97f87ce1898f17ea1459b60bfccc24e32126ec"} Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.522778 4885 scope.go:117] "RemoveContainer" containerID="531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.522994 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b8gsx" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.541795 4885 scope.go:117] "RemoveContainer" containerID="7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.560327 4885 scope.go:117] "RemoveContainer" containerID="c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.585625 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b8gsx"] Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.586940 4885 scope.go:117] "RemoveContainer" containerID="531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2" Mar 08 19:49:12 crc kubenswrapper[4885]: E0308 19:49:12.589593 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2\": container with ID starting with 531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2 not found: ID does not exist" containerID="531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.589643 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2"} err="failed to get container status \"531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2\": rpc error: code = NotFound desc = could not find container \"531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2\": container with ID starting with 531344d342ad38d25cc2648eff50c4952ffa7c03d4732847ad4ff3689db1ded2 not found: ID does not exist" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.589674 4885 scope.go:117] "RemoveContainer" containerID="7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832" Mar 08 19:49:12 crc kubenswrapper[4885]: E0308 19:49:12.589974 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832\": container with ID starting with 7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832 not found: ID does not exist" containerID="7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.590003 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832"} err="failed to get container status \"7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832\": rpc error: code = NotFound desc = could not find container \"7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832\": container with ID starting with 7f2720842bbd7012bcec716155692acf755c146e844a7584a343350c4aa0d832 not found: ID does not exist" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.590021 4885 scope.go:117] "RemoveContainer" containerID="c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037" Mar 08 19:49:12 crc kubenswrapper[4885]: E0308 19:49:12.590312 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037\": container with ID starting with 
c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037 not found: ID does not exist" containerID="c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.590369 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037"} err="failed to get container status \"c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037\": rpc error: code = NotFound desc = could not find container \"c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037\": container with ID starting with c9ba77a2975b8d1dacf71a07e83e4a65c9c6eca26cb88bfeacf4d76227b4c037 not found: ID does not exist" Mar 08 19:49:12 crc kubenswrapper[4885]: I0308 19:49:12.590386 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b8gsx"] Mar 08 19:49:13 crc kubenswrapper[4885]: I0308 19:49:13.382372 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949c8a20-4064-489d-b823-8eb76415df83" path="/var/lib/kubelet/pods/949c8a20-4064-489d-b823-8eb76415df83/volumes" Mar 08 19:49:32 crc kubenswrapper[4885]: I0308 19:49:32.838667 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:49:32 crc kubenswrapper[4885]: I0308 19:49:32.839667 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.719131 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5"] Mar 08 19:49:49 crc kubenswrapper[4885]: E0308 19:49:49.719754 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="extract-content" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.719766 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="extract-content" Mar 08 19:49:49 crc kubenswrapper[4885]: E0308 19:49:49.719782 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="registry-server" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.719788 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="registry-server" Mar 08 19:49:49 crc kubenswrapper[4885]: E0308 19:49:49.719803 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="extract-utilities" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.719810 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="extract-utilities" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.719915 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="949c8a20-4064-489d-b823-8eb76415df83" containerName="registry-server" Mar 08 19:49:49 crc 
kubenswrapper[4885]: I0308 19:49:49.720297 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.722415 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-fwd6q" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.727322 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.728222 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.733792 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-cbj2v" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.743908 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.745202 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.747350 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-4bgw5" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.748155 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.752006 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.757347 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.777417 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.780115 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.784502 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-p6wb8" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.793651 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.839380 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn8vs\" (UniqueName: \"kubernetes.io/projected/45c29030-0945-4655-b035-d75e8bf0f818-kube-api-access-jn8vs\") pod \"designate-operator-controller-manager-5d87c9d997-sbrjr\" (UID: \"45c29030-0945-4655-b035-d75e8bf0f818\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.839457 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gzp5\" (UniqueName: \"kubernetes.io/projected/5f89ecdd-60c3-4da6-b185-1f044d8ffc46-kube-api-access-5gzp5\") pod \"cinder-operator-controller-manager-55d77d7b5c-f9jr4\" (UID: \"5f89ecdd-60c3-4da6-b185-1f044d8ffc46\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.839492 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45mlh\" (UniqueName: \"kubernetes.io/projected/92716f38-db4c-41d9-962d-f3cc2669a7fb-kube-api-access-45mlh\") pod \"barbican-operator-controller-manager-6db6876945-rplg5\" (UID: \"92716f38-db4c-41d9-962d-f3cc2669a7fb\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.858999 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.860023 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.865261 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-2cmxp" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.868984 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.870024 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.871499 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-25k8p" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.878912 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.892407 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.900258 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.901842 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.904213 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-2jvqx" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.904369 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.921123 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.932574 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.933354 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.935876 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-vmjjc" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.938987 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.940149 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn8vs\" (UniqueName: \"kubernetes.io/projected/45c29030-0945-4655-b035-d75e8bf0f818-kube-api-access-jn8vs\") pod \"designate-operator-controller-manager-5d87c9d997-sbrjr\" (UID: \"45c29030-0945-4655-b035-d75e8bf0f818\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.940208 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gzp5\" (UniqueName: \"kubernetes.io/projected/5f89ecdd-60c3-4da6-b185-1f044d8ffc46-kube-api-access-5gzp5\") pod \"cinder-operator-controller-manager-55d77d7b5c-f9jr4\" (UID: \"5f89ecdd-60c3-4da6-b185-1f044d8ffc46\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.940261 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45mlh\" (UniqueName: \"kubernetes.io/projected/92716f38-db4c-41d9-962d-f3cc2669a7fb-kube-api-access-45mlh\") pod \"barbican-operator-controller-manager-6db6876945-rplg5\" (UID: \"92716f38-db4c-41d9-962d-f3cc2669a7fb\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.940297 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4429\" (UniqueName: \"kubernetes.io/projected/69dc5eb7-1c2e-4fbb-a220-2129df60ffb3-kube-api-access-q4429\") pod \"glance-operator-controller-manager-64db6967f8-4hstb\" (UID: \"69dc5eb7-1c2e-4fbb-a220-2129df60ffb3\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.940342 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjqkw\" (UniqueName: \"kubernetes.io/projected/4742ab81-6c6d-43c8-8025-6a656b8c40dc-kube-api-access-sjqkw\") pod \"horizon-operator-controller-manager-78bc7f9bd9-xplpw\" (UID: \"4742ab81-6c6d-43c8-8025-6a656b8c40dc\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.944994 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.946031 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.949382 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-fspv5" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.955993 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.961291 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.966366 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gzp5\" (UniqueName: \"kubernetes.io/projected/5f89ecdd-60c3-4da6-b185-1f044d8ffc46-kube-api-access-5gzp5\") pod \"cinder-operator-controller-manager-55d77d7b5c-f9jr4\" (UID: \"5f89ecdd-60c3-4da6-b185-1f044d8ffc46\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.971736 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.971773 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.972032 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.973196 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.976779 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-kwttv" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.976859 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45mlh\" (UniqueName: \"kubernetes.io/projected/92716f38-db4c-41d9-962d-f3cc2669a7fb-kube-api-access-45mlh\") pod \"barbican-operator-controller-manager-6db6876945-rplg5\" (UID: \"92716f38-db4c-41d9-962d-f3cc2669a7fb\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.977036 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-44779" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.977412 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn8vs\" (UniqueName: \"kubernetes.io/projected/45c29030-0945-4655-b035-d75e8bf0f818-kube-api-access-jn8vs\") pod \"designate-operator-controller-manager-5d87c9d997-sbrjr\" (UID: \"45c29030-0945-4655-b035-d75e8bf0f818\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.981287 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f"] Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.991381 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.995472 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-c6pxs" Mar 08 19:49:49 crc kubenswrapper[4885]: I0308 19:49:49.997223 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.008602 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.042514 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.042582 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7x9b\" (UniqueName: \"kubernetes.io/projected/d5770638-6059-4ce5-b401-84b0155589a3-kube-api-access-s7x9b\") pod \"heat-operator-controller-manager-cf99c678f-n88vz\" (UID: \"d5770638-6059-4ce5-b401-84b0155589a3\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.042609 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k8nk\" (UniqueName: 
\"kubernetes.io/projected/9fc40f07-4706-4008-b86e-e73a2f2ab620-kube-api-access-4k8nk\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.042640 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjqkw\" (UniqueName: \"kubernetes.io/projected/4742ab81-6c6d-43c8-8025-6a656b8c40dc-kube-api-access-sjqkw\") pod \"horizon-operator-controller-manager-78bc7f9bd9-xplpw\" (UID: \"4742ab81-6c6d-43c8-8025-6a656b8c40dc\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.042707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pl2s\" (UniqueName: \"kubernetes.io/projected/157555d5-ca64-49f8-8849-cd763c83feda-kube-api-access-6pl2s\") pod \"keystone-operator-controller-manager-7c789f89c6-hlpjf\" (UID: \"157555d5-ca64-49f8-8849-cd763c83feda\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.042738 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25s88\" (UniqueName: \"kubernetes.io/projected/7180efa7-8d93-436e-8de2-78fe5c173843-kube-api-access-25s88\") pod \"ironic-operator-controller-manager-545456dc4-nclkr\" (UID: \"7180efa7-8d93-436e-8de2-78fe5c173843\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.042777 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4429\" (UniqueName: \"kubernetes.io/projected/69dc5eb7-1c2e-4fbb-a220-2129df60ffb3-kube-api-access-q4429\") pod \"glance-operator-controller-manager-64db6967f8-4hstb\" (UID: \"69dc5eb7-1c2e-4fbb-a220-2129df60ffb3\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.047865 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.060591 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.064979 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.065746 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4429\" (UniqueName: \"kubernetes.io/projected/69dc5eb7-1c2e-4fbb-a220-2129df60ffb3-kube-api-access-q4429\") pod \"glance-operator-controller-manager-64db6967f8-4hstb\" (UID: \"69dc5eb7-1c2e-4fbb-a220-2129df60ffb3\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.068069 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-ptvkv" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.070956 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjqkw\" (UniqueName: \"kubernetes.io/projected/4742ab81-6c6d-43c8-8025-6a656b8c40dc-kube-api-access-sjqkw\") pod \"horizon-operator-controller-manager-78bc7f9bd9-xplpw\" (UID: \"4742ab81-6c6d-43c8-8025-6a656b8c40dc\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.073554 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.087124 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.087530 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.091041 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.091928 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.093767 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-4w2rv" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.095356 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.096249 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.100113 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.100545 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-hlblr" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.105491 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.110131 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.114951 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.116181 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.118287 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.121590 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-28nm4" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.130582 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.131782 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.137248 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-xnzlb" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.139623 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.144532 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.145486 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7x9b\" (UniqueName: \"kubernetes.io/projected/d5770638-6059-4ce5-b401-84b0155589a3-kube-api-access-s7x9b\") pod \"heat-operator-controller-manager-cf99c678f-n88vz\" (UID: \"d5770638-6059-4ce5-b401-84b0155589a3\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.145536 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k8nk\" (UniqueName: \"kubernetes.io/projected/9fc40f07-4706-4008-b86e-e73a2f2ab620-kube-api-access-4k8nk\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.145600 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rlkq\" (UniqueName: \"kubernetes.io/projected/27aa3877-54cd-414d-80a0-ab20a68ed535-kube-api-access-6rlkq\") pod \"manila-operator-controller-manager-67d996989d-q5hfb\" (UID: \"27aa3877-54cd-414d-80a0-ab20a68ed535\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.145650 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5stv\" (UniqueName: \"kubernetes.io/projected/8f363429-f2b7-468c-b74b-ef14ebfab90e-kube-api-access-j5stv\") pod \"mariadb-operator-controller-manager-7b6bfb6475-2hsgc\" (UID: \"8f363429-f2b7-468c-b74b-ef14ebfab90e\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.145673 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pl2s\" (UniqueName: \"kubernetes.io/projected/157555d5-ca64-49f8-8849-cd763c83feda-kube-api-access-6pl2s\") pod \"keystone-operator-controller-manager-7c789f89c6-hlpjf\" (UID: \"157555d5-ca64-49f8-8849-cd763c83feda\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.145696 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25s88\" (UniqueName: \"kubernetes.io/projected/7180efa7-8d93-436e-8de2-78fe5c173843-kube-api-access-25s88\") pod \"ironic-operator-controller-manager-545456dc4-nclkr\" (UID: \"7180efa7-8d93-436e-8de2-78fe5c173843\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 
19:49:50.145751 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l76dz\" (UniqueName: \"kubernetes.io/projected/392750e0-9d71-418d-89b0-ec10f33ec505-kube-api-access-l76dz\") pod \"neutron-operator-controller-manager-54688575f-p8r6f\" (UID: \"392750e0-9d71-418d-89b0-ec10f33ec505\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.145807 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.146068 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.146120 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert podName:9fc40f07-4706-4008-b86e-e73a2f2ab620 nodeName:}" failed. No retries permitted until 2026-03-08 19:49:50.646105704 +0000 UTC m=+1092.042159727 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert") pod "infra-operator-controller-manager-f7fcc58b9-vf24d" (UID: "9fc40f07-4706-4008-b86e-e73a2f2ab620") : secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.147723 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctx6x\" (UniqueName: \"kubernetes.io/projected/7c05f3ed-fe8f-47db-b596-8b90b96c295c-kube-api-access-ctx6x\") pod \"nova-operator-controller-manager-74b6b5dc96-7vtx7\" (UID: \"7c05f3ed-fe8f-47db-b596-8b90b96c295c\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.176403 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k8nk\" (UniqueName: \"kubernetes.io/projected/9fc40f07-4706-4008-b86e-e73a2f2ab620-kube-api-access-4k8nk\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.177590 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7x9b\" (UniqueName: \"kubernetes.io/projected/d5770638-6059-4ce5-b401-84b0155589a3-kube-api-access-s7x9b\") pod \"heat-operator-controller-manager-cf99c678f-n88vz\" (UID: \"d5770638-6059-4ce5-b401-84b0155589a3\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.179428 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pl2s\" (UniqueName: \"kubernetes.io/projected/157555d5-ca64-49f8-8849-cd763c83feda-kube-api-access-6pl2s\") pod \"keystone-operator-controller-manager-7c789f89c6-hlpjf\" (UID: \"157555d5-ca64-49f8-8849-cd763c83feda\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" Mar 08 19:49:50 
crc kubenswrapper[4885]: I0308 19:49:50.186512 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25s88\" (UniqueName: \"kubernetes.io/projected/7180efa7-8d93-436e-8de2-78fe5c173843-kube-api-access-25s88\") pod \"ironic-operator-controller-manager-545456dc4-nclkr\" (UID: \"7180efa7-8d93-436e-8de2-78fe5c173843\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.187450 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.192132 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.194363 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.198119 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-brs4j" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.198945 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.231627 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248607 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l76dz\" (UniqueName: \"kubernetes.io/projected/392750e0-9d71-418d-89b0-ec10f33ec505-kube-api-access-l76dz\") pod \"neutron-operator-controller-manager-54688575f-p8r6f\" (UID: \"392750e0-9d71-418d-89b0-ec10f33ec505\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248677 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctx6x\" (UniqueName: \"kubernetes.io/projected/7c05f3ed-fe8f-47db-b596-8b90b96c295c-kube-api-access-ctx6x\") pod \"nova-operator-controller-manager-74b6b5dc96-7vtx7\" (UID: \"7c05f3ed-fe8f-47db-b596-8b90b96c295c\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248705 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248737 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7snbb\" (UniqueName: \"kubernetes.io/projected/bbb8966a-e61f-427d-af2a-0fdab2348d03-kube-api-access-7snbb\") pod \"octavia-operator-controller-manager-5d86c7ddb7-k4r6w\" (UID: \"bbb8966a-e61f-427d-af2a-0fdab2348d03\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" Mar 08 19:49:50 crc 
kubenswrapper[4885]: I0308 19:49:50.248760 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtkpt\" (UniqueName: \"kubernetes.io/projected/8d086566-6154-4ddd-8028-a9c203cfec11-kube-api-access-mtkpt\") pod \"placement-operator-controller-manager-648564c9fc-4gfw2\" (UID: \"8d086566-6154-4ddd-8028-a9c203cfec11\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248782 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnnlf\" (UniqueName: \"kubernetes.io/projected/d8de7df0-2dea-4d3c-a02e-57bfabade82f-kube-api-access-vnnlf\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248807 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rlkq\" (UniqueName: \"kubernetes.io/projected/27aa3877-54cd-414d-80a0-ab20a68ed535-kube-api-access-6rlkq\") pod \"manila-operator-controller-manager-67d996989d-q5hfb\" (UID: \"27aa3877-54cd-414d-80a0-ab20a68ed535\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248828 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt8f7\" (UniqueName: \"kubernetes.io/projected/44fbac8d-d81f-4c03-9555-ef33551d478d-kube-api-access-kt8f7\") pod \"ovn-operator-controller-manager-75684d597f-wdrfh\" (UID: \"44fbac8d-d81f-4c03-9555-ef33551d478d\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.248845 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5stv\" (UniqueName: \"kubernetes.io/projected/8f363429-f2b7-468c-b74b-ef14ebfab90e-kube-api-access-j5stv\") pod \"mariadb-operator-controller-manager-7b6bfb6475-2hsgc\" (UID: \"8f363429-f2b7-468c-b74b-ef14ebfab90e\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.249384 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.266879 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5stv\" (UniqueName: \"kubernetes.io/projected/8f363429-f2b7-468c-b74b-ef14ebfab90e-kube-api-access-j5stv\") pod \"mariadb-operator-controller-manager-7b6bfb6475-2hsgc\" (UID: \"8f363429-f2b7-468c-b74b-ef14ebfab90e\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.267316 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.268181 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.268489 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l76dz\" (UniqueName: \"kubernetes.io/projected/392750e0-9d71-418d-89b0-ec10f33ec505-kube-api-access-l76dz\") pod \"neutron-operator-controller-manager-54688575f-p8r6f\" (UID: \"392750e0-9d71-418d-89b0-ec10f33ec505\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.269037 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.270148 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctx6x\" (UniqueName: \"kubernetes.io/projected/7c05f3ed-fe8f-47db-b596-8b90b96c295c-kube-api-access-ctx6x\") pod \"nova-operator-controller-manager-74b6b5dc96-7vtx7\" (UID: \"7c05f3ed-fe8f-47db-b596-8b90b96c295c\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.272482 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rlkq\" (UniqueName: \"kubernetes.io/projected/27aa3877-54cd-414d-80a0-ab20a68ed535-kube-api-access-6rlkq\") pod \"manila-operator-controller-manager-67d996989d-q5hfb\" (UID: \"27aa3877-54cd-414d-80a0-ab20a68ed535\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.276009 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-c2b9s" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.281632 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.339356 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.341154 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.343052 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.345512 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-4cwrh" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.349734 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk6ft\" (UniqueName: \"kubernetes.io/projected/d9580392-741e-406b-b72d-91aa945f65c2-kube-api-access-tk6ft\") pod \"swift-operator-controller-manager-9b9ff9f4d-7hgld\" (UID: \"d9580392-741e-406b-b72d-91aa945f65c2\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.349782 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkgj6\" (UniqueName: \"kubernetes.io/projected/c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b-kube-api-access-vkgj6\") pod \"telemetry-operator-controller-manager-5fdb694969-7mghs\" (UID: \"c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.349827 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.349858 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7snbb\" (UniqueName: \"kubernetes.io/projected/bbb8966a-e61f-427d-af2a-0fdab2348d03-kube-api-access-7snbb\") pod \"octavia-operator-controller-manager-5d86c7ddb7-k4r6w\" (UID: \"bbb8966a-e61f-427d-af2a-0fdab2348d03\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.349941 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtkpt\" (UniqueName: \"kubernetes.io/projected/8d086566-6154-4ddd-8028-a9c203cfec11-kube-api-access-mtkpt\") pod \"placement-operator-controller-manager-648564c9fc-4gfw2\" (UID: \"8d086566-6154-4ddd-8028-a9c203cfec11\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.349988 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnnlf\" (UniqueName: \"kubernetes.io/projected/d8de7df0-2dea-4d3c-a02e-57bfabade82f-kube-api-access-vnnlf\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.350030 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt8f7\" (UniqueName: \"kubernetes.io/projected/44fbac8d-d81f-4c03-9555-ef33551d478d-kube-api-access-kt8f7\") pod \"ovn-operator-controller-manager-75684d597f-wdrfh\" (UID: \"44fbac8d-d81f-4c03-9555-ef33551d478d\") " 
pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.350123 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.350171 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert podName:d8de7df0-2dea-4d3c-a02e-57bfabade82f nodeName:}" failed. No retries permitted until 2026-03-08 19:49:50.850156466 +0000 UTC m=+1092.246210479 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" (UID: "d8de7df0-2dea-4d3c-a02e-57bfabade82f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.356099 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.383462 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtkpt\" (UniqueName: \"kubernetes.io/projected/8d086566-6154-4ddd-8028-a9c203cfec11-kube-api-access-mtkpt\") pod \"placement-operator-controller-manager-648564c9fc-4gfw2\" (UID: \"8d086566-6154-4ddd-8028-a9c203cfec11\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.387142 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.393177 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7snbb\" (UniqueName: \"kubernetes.io/projected/bbb8966a-e61f-427d-af2a-0fdab2348d03-kube-api-access-7snbb\") pod \"octavia-operator-controller-manager-5d86c7ddb7-k4r6w\" (UID: \"bbb8966a-e61f-427d-af2a-0fdab2348d03\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.396010 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt8f7\" (UniqueName: \"kubernetes.io/projected/44fbac8d-d81f-4c03-9555-ef33551d478d-kube-api-access-kt8f7\") pod \"ovn-operator-controller-manager-75684d597f-wdrfh\" (UID: \"44fbac8d-d81f-4c03-9555-ef33551d478d\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.406848 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnnlf\" (UniqueName: \"kubernetes.io/projected/d8de7df0-2dea-4d3c-a02e-57bfabade82f-kube-api-access-vnnlf\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.426838 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.427696 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.439533 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.447970 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5hdn5" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.448211 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.452763 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk6ft\" (UniqueName: \"kubernetes.io/projected/d9580392-741e-406b-b72d-91aa945f65c2-kube-api-access-tk6ft\") pod \"swift-operator-controller-manager-9b9ff9f4d-7hgld\" (UID: \"d9580392-741e-406b-b72d-91aa945f65c2\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.454134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkgj6\" (UniqueName: \"kubernetes.io/projected/c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b-kube-api-access-vkgj6\") pod \"telemetry-operator-controller-manager-5fdb694969-7mghs\" (UID: \"c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.454293 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrzrw\" (UniqueName: \"kubernetes.io/projected/ea5acc0f-2ad8-46d5-80a2-502e2900fdd6-kube-api-access-mrzrw\") pod \"test-operator-controller-manager-55b5ff4dbb-xf4hm\" (UID: \"ea5acc0f-2ad8-46d5-80a2-502e2900fdd6\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.471553 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.483351 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.488739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk6ft\" (UniqueName: \"kubernetes.io/projected/d9580392-741e-406b-b72d-91aa945f65c2-kube-api-access-tk6ft\") pod \"swift-operator-controller-manager-9b9ff9f4d-7hgld\" (UID: \"d9580392-741e-406b-b72d-91aa945f65c2\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.496077 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkgj6\" (UniqueName: \"kubernetes.io/projected/c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b-kube-api-access-vkgj6\") pod \"telemetry-operator-controller-manager-5fdb694969-7mghs\" (UID: \"c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.503636 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.510617 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.511543 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.518192 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.532662 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.533141 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.538772 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-qzlfh" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.538911 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.539026 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.557411 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkl5q\" (UniqueName: \"kubernetes.io/projected/d5136d34-82a8-47c5-9d7d-09e0206587e8-kube-api-access-mkl5q\") pod \"watcher-operator-controller-manager-bccc79885-66zgf\" (UID: \"d5136d34-82a8-47c5-9d7d-09e0206587e8\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.557574 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrzrw\" (UniqueName: \"kubernetes.io/projected/ea5acc0f-2ad8-46d5-80a2-502e2900fdd6-kube-api-access-mrzrw\") pod \"test-operator-controller-manager-55b5ff4dbb-xf4hm\" (UID: \"ea5acc0f-2ad8-46d5-80a2-502e2900fdd6\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.561908 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.563204 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.573302 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-dxr5z" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.583439 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrzrw\" (UniqueName: \"kubernetes.io/projected/ea5acc0f-2ad8-46d5-80a2-502e2900fdd6-kube-api-access-mrzrw\") pod \"test-operator-controller-manager-55b5ff4dbb-xf4hm\" (UID: \"ea5acc0f-2ad8-46d5-80a2-502e2900fdd6\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.586419 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.601860 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.616554 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.658488 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkl5q\" (UniqueName: \"kubernetes.io/projected/d5136d34-82a8-47c5-9d7d-09e0206587e8-kube-api-access-mkl5q\") pod \"watcher-operator-controller-manager-bccc79885-66zgf\" (UID: \"d5136d34-82a8-47c5-9d7d-09e0206587e8\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.658540 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.658598 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.658620 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w56j6\" (UniqueName: \"kubernetes.io/projected/deedb14e-007e-44eb-bd52-85bbc12d0bec-kube-api-access-w56j6\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.658648 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snn8c\" (UniqueName: \"kubernetes.io/projected/a8caa87f-832f-4436-beaa-aaa505de3bac-kube-api-access-snn8c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pd9b2\" (UID: \"a8caa87f-832f-4436-beaa-aaa505de3bac\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.658690 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.659082 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.659125 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert podName:9fc40f07-4706-4008-b86e-e73a2f2ab620 nodeName:}" failed. 
No retries permitted until 2026-03-08 19:49:51.659110669 +0000 UTC m=+1093.055164692 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert") pod "infra-operator-controller-manager-f7fcc58b9-vf24d" (UID: "9fc40f07-4706-4008-b86e-e73a2f2ab620") : secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.666577 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.675272 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.682128 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkl5q\" (UniqueName: \"kubernetes.io/projected/d5136d34-82a8-47c5-9d7d-09e0206587e8-kube-api-access-mkl5q\") pod \"watcher-operator-controller-manager-bccc79885-66zgf\" (UID: \"d5136d34-82a8-47c5-9d7d-09e0206587e8\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.759721 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.759838 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.761300 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w56j6\" (UniqueName: \"kubernetes.io/projected/deedb14e-007e-44eb-bd52-85bbc12d0bec-kube-api-access-w56j6\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.761357 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snn8c\" (UniqueName: \"kubernetes.io/projected/a8caa87f-832f-4436-beaa-aaa505de3bac-kube-api-access-snn8c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pd9b2\" (UID: \"a8caa87f-832f-4436-beaa-aaa505de3bac\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.761941 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.761985 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. 
No retries permitted until 2026-03-08 19:49:51.261971437 +0000 UTC m=+1092.658025460 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "metrics-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.762183 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.762209 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:49:51.262201273 +0000 UTC m=+1092.658255296 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.776982 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.782451 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w56j6\" (UniqueName: \"kubernetes.io/projected/deedb14e-007e-44eb-bd52-85bbc12d0bec-kube-api-access-w56j6\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.790547 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snn8c\" (UniqueName: \"kubernetes.io/projected/a8caa87f-832f-4436-beaa-aaa505de3bac-kube-api-access-snn8c\") pod \"rabbitmq-cluster-operator-manager-668c99d594-pd9b2\" (UID: \"a8caa87f-832f-4436-beaa-aaa505de3bac\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.806995 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr"] Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.832004 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" event={"ID":"5f89ecdd-60c3-4da6-b185-1f044d8ffc46","Type":"ContainerStarted","Data":"48b0f86ce995df16a2a3fc037110d4df83435b585f5cda5e3cc460093a361871"} Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.843806 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" event={"ID":"92716f38-db4c-41d9-962d-f3cc2669a7fb","Type":"ContainerStarted","Data":"756492bc66b59da92f7b4c56ebdb96f7b5fa6f316e89c45f1ccfe8d28edcee87"} Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.862521 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod 
\"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.862750 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: E0308 19:49:50.862846 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert podName:d8de7df0-2dea-4d3c-a02e-57bfabade82f nodeName:}" failed. No retries permitted until 2026-03-08 19:49:51.862827771 +0000 UTC m=+1093.258881794 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" (UID: "d8de7df0-2dea-4d3c-a02e-57bfabade82f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:50 crc kubenswrapper[4885]: I0308 19:49:50.892724 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.051513 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.071243 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr"] Mar 08 19:49:51 crc kubenswrapper[4885]: W0308 19:49:51.076479 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7180efa7_8d93_436e_8de2_78fe5c173843.slice/crio-b5e60bda31677dfba1c37e87695e7d278e8bb1997364df8f0293c04a4f8e8464 WatchSource:0}: Error finding container b5e60bda31677dfba1c37e87695e7d278e8bb1997364df8f0293c04a4f8e8464: Status 404 returned error can't find the container with id b5e60bda31677dfba1c37e87695e7d278e8bb1997364df8f0293c04a4f8e8464 Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.094638 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz"] Mar 08 19:49:51 crc kubenswrapper[4885]: W0308 19:49:51.109449 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4742ab81_6c6d_43c8_8025_6a656b8c40dc.slice/crio-c0fe8f1a3bd237f888c8a31e50768006cb9b167663fc9864c11e258af15ea6f0 WatchSource:0}: Error finding container c0fe8f1a3bd237f888c8a31e50768006cb9b167663fc9864c11e258af15ea6f0: Status 404 returned error can't find the container with id c0fe8f1a3bd237f888c8a31e50768006cb9b167663fc9864c11e258af15ea6f0 Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.113317 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.123838 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.199345 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f"] Mar 08 19:49:51 crc kubenswrapper[4885]: W0308 19:49:51.208771 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c05f3ed_fe8f_47db_b596_8b90b96c295c.slice/crio-5d7f4b3a38d9f8fe5ccceced349c318df86e0d9116ec2742b501047606e209f9 WatchSource:0}: Error finding container 5d7f4b3a38d9f8fe5ccceced349c318df86e0d9116ec2742b501047606e209f9: Status 404 returned error can't find the container with id 5d7f4b3a38d9f8fe5ccceced349c318df86e0d9116ec2742b501047606e209f9 Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.210906 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.227343 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.239561 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc"] Mar 08 19:49:51 crc kubenswrapper[4885]: W0308 19:49:51.239963 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27aa3877_54cd_414d_80a0_ab20a68ed535.slice/crio-ec3415b20908b61c4dd355713fa0c046f81a65a76b0c71483605479ade138ddc WatchSource:0}: Error finding container ec3415b20908b61c4dd355713fa0c046f81a65a76b0c71483605479ade138ddc: Status 404 returned error can't find the container with id ec3415b20908b61c4dd355713fa0c046f81a65a76b0c71483605479ade138ddc Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.268948 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.269014 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.269253 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.269309 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:49:52.269291111 +0000 UTC m=+1093.665345144 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "webhook-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.269682 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.269725 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:49:52.269713962 +0000 UTC m=+1093.665767985 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "metrics-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.380857 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.380892 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.384661 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w"] Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.388874 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mrzrw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-55b5ff4dbb-xf4hm_openstack-operators(ea5acc0f-2ad8-46d5-80a2-502e2900fdd6): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.390039 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" podUID="ea5acc0f-2ad8-46d5-80a2-502e2900fdd6" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.398394 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7snbb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5d86c7ddb7-k4r6w_openstack-operators(bbb8966a-e61f-427d-af2a-0fdab2348d03): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.399855 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" podUID="bbb8966a-e61f-427d-af2a-0fdab2348d03" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.400516 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.409219 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs"] Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.423374 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh"] Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.428863 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mtkpt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-648564c9fc-4gfw2_openstack-operators(8d086566-6154-4ddd-8028-a9c203cfec11): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.430151 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" podUID="8d086566-6154-4ddd-8028-a9c203cfec11" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.431437 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kt8f7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-75684d597f-wdrfh_openstack-operators(44fbac8d-d81f-4c03-9555-ef33551d478d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.432759 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" podUID="44fbac8d-d81f-4c03-9555-ef33551d478d" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.511902 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2"] Mar 08 19:49:51 crc kubenswrapper[4885]: W0308 19:49:51.519577 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8caa87f_832f_4436_beaa_aaa505de3bac.slice/crio-1c3793df69403e4f28322a43bec14126c61384bb5fca6a1036a199b548db3ae1 WatchSource:0}: Error finding container 1c3793df69403e4f28322a43bec14126c61384bb5fca6a1036a199b548db3ae1: Status 404 returned error can't find the container with id 1c3793df69403e4f28322a43bec14126c61384bb5fca6a1036a199b548db3ae1 Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.520059 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf"] Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.522911 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-snn8c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-pd9b2_openstack-operators(a8caa87f-832f-4436-beaa-aaa505de3bac): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.524854 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" podUID="a8caa87f-832f-4436-beaa-aaa505de3bac" Mar 08 19:49:51 crc kubenswrapper[4885]: W0308 19:49:51.526310 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5136d34_82a8_47c5_9d7d_09e0206587e8.slice/crio-5fc4ee9939196cbb8192e2aefc421bfb5ee60de0012d2d0f07dd61b87edd41f2 WatchSource:0}: Error finding container 5fc4ee9939196cbb8192e2aefc421bfb5ee60de0012d2d0f07dd61b87edd41f2: Status 404 returned error can't find the container with id 5fc4ee9939196cbb8192e2aefc421bfb5ee60de0012d2d0f07dd61b87edd41f2 Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.528728 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mkl5q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-66zgf_openstack-operators(d5136d34-82a8-47c5-9d7d-09e0206587e8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.530203 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" podUID="d5136d34-82a8-47c5-9d7d-09e0206587e8" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.673729 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.673942 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.674047 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert podName:9fc40f07-4706-4008-b86e-e73a2f2ab620 nodeName:}" failed. No retries permitted until 2026-03-08 19:49:53.674029223 +0000 UTC m=+1095.070083246 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert") pod "infra-operator-controller-manager-f7fcc58b9-vf24d" (UID: "9fc40f07-4706-4008-b86e-e73a2f2ab620") : secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.862457 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" event={"ID":"d5136d34-82a8-47c5-9d7d-09e0206587e8","Type":"ContainerStarted","Data":"5fc4ee9939196cbb8192e2aefc421bfb5ee60de0012d2d0f07dd61b87edd41f2"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.863974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" event={"ID":"7c05f3ed-fe8f-47db-b596-8b90b96c295c","Type":"ContainerStarted","Data":"5d7f4b3a38d9f8fe5ccceced349c318df86e0d9116ec2742b501047606e209f9"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.876903 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.877143 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" podUID="d5136d34-82a8-47c5-9d7d-09e0206587e8" Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.878035 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.878218 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert podName:d8de7df0-2dea-4d3c-a02e-57bfabade82f nodeName:}" failed. No retries permitted until 2026-03-08 19:49:53.878201797 +0000 UTC m=+1095.274255820 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" (UID: "d8de7df0-2dea-4d3c-a02e-57bfabade82f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.878601 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" event={"ID":"27aa3877-54cd-414d-80a0-ab20a68ed535","Type":"ContainerStarted","Data":"ec3415b20908b61c4dd355713fa0c046f81a65a76b0c71483605479ade138ddc"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.879668 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" event={"ID":"d5770638-6059-4ce5-b401-84b0155589a3","Type":"ContainerStarted","Data":"00f2e860692473c9453bb342db6fb5e86415cef335b39e93b7cd6e8d51ea80ea"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.881836 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" event={"ID":"bbb8966a-e61f-427d-af2a-0fdab2348d03","Type":"ContainerStarted","Data":"7de47ba1e2ce934c6011ab0166bb5689c1ea770d87a8cccf125ef3004a8c9db8"} Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.882892 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" podUID="bbb8966a-e61f-427d-af2a-0fdab2348d03" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.883764 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" event={"ID":"a8caa87f-832f-4436-beaa-aaa505de3bac","Type":"ContainerStarted","Data":"1c3793df69403e4f28322a43bec14126c61384bb5fca6a1036a199b548db3ae1"} Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.888207 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" podUID="a8caa87f-832f-4436-beaa-aaa505de3bac" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.890721 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" event={"ID":"8d086566-6154-4ddd-8028-a9c203cfec11","Type":"ContainerStarted","Data":"4fc553a4c42a95d7fcb95abe6357e3c0699d49e15dfe1a6b2c7a44f1ed212a6a"} Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.895675 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" podUID="8d086566-6154-4ddd-8028-a9c203cfec11" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.895685 
4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" event={"ID":"157555d5-ca64-49f8-8849-cd763c83feda","Type":"ContainerStarted","Data":"55b8e268ae419efc6beb90b074b226ced0641fd94c30416a4f9e9b6518087557"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.897517 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" event={"ID":"4742ab81-6c6d-43c8-8025-6a656b8c40dc","Type":"ContainerStarted","Data":"c0fe8f1a3bd237f888c8a31e50768006cb9b167663fc9864c11e258af15ea6f0"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.918864 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" event={"ID":"ea5acc0f-2ad8-46d5-80a2-502e2900fdd6","Type":"ContainerStarted","Data":"6f39c495421440ea87ad5b93b507d32dd20fb01def89d8d8ae657ed805fb2786"} Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.935139 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" podUID="ea5acc0f-2ad8-46d5-80a2-502e2900fdd6" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.938544 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" event={"ID":"44fbac8d-d81f-4c03-9555-ef33551d478d","Type":"ContainerStarted","Data":"1ad56fbe65ece63433e9568e838f9491891381c7bad7489a040a9232220f6bef"} Mar 08 19:49:51 crc kubenswrapper[4885]: E0308 19:49:51.944334 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" podUID="44fbac8d-d81f-4c03-9555-ef33551d478d" Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.946124 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" event={"ID":"45c29030-0945-4655-b035-d75e8bf0f818","Type":"ContainerStarted","Data":"dc28f5b1f2bea8e9a119ba3f574760ac3ab69da97cef38ddebaf640a31c93cbd"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.985307 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" event={"ID":"c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b","Type":"ContainerStarted","Data":"ef7e432e55054de16167cb7df63e6b9c98f51a80335795e4e8c80fdcc02bc95b"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.988652 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" event={"ID":"7180efa7-8d93-436e-8de2-78fe5c173843","Type":"ContainerStarted","Data":"b5e60bda31677dfba1c37e87695e7d278e8bb1997364df8f0293c04a4f8e8464"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.989598 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" 
event={"ID":"392750e0-9d71-418d-89b0-ec10f33ec505","Type":"ContainerStarted","Data":"e42f2ec6f62fe167702edf74845cf83c2fde632481abbcdf07e044947854e598"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.990325 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" event={"ID":"d9580392-741e-406b-b72d-91aa945f65c2","Type":"ContainerStarted","Data":"a2a924fea82282c74f5dda0c1eccc6dee4431859ceceb7b009efb2c4df26e91e"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.991029 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" event={"ID":"8f363429-f2b7-468c-b74b-ef14ebfab90e","Type":"ContainerStarted","Data":"d4e9495a6c0d1aa363282f9491bf343890738fe0e57586495f940e24e4a743da"} Mar 08 19:49:51 crc kubenswrapper[4885]: I0308 19:49:51.992735 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" event={"ID":"69dc5eb7-1c2e-4fbb-a220-2129df60ffb3","Type":"ContainerStarted","Data":"c62e59e3dd7a0d4f390165a5b9d055d4cc6b5e34d4082535e927defcaae047af"} Mar 08 19:49:52 crc kubenswrapper[4885]: I0308 19:49:52.296774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:52 crc kubenswrapper[4885]: I0308 19:49:52.296842 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:52 crc kubenswrapper[4885]: E0308 19:49:52.296993 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 19:49:52 crc kubenswrapper[4885]: E0308 19:49:52.296992 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 19:49:52 crc kubenswrapper[4885]: E0308 19:49:52.297045 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:49:54.297030286 +0000 UTC m=+1095.693084299 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "webhook-server-cert" not found Mar 08 19:49:52 crc kubenswrapper[4885]: E0308 19:49:52.297058 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:49:54.297051656 +0000 UTC m=+1095.693105679 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "metrics-server-cert" not found Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.026151 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:9f73c84a9581b5739d8da333c7b64403d7b7ca284b22c624d0effe07f3d2819c\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" podUID="44fbac8d-d81f-4c03-9555-ef33551d478d" Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.026515 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" podUID="8d086566-6154-4ddd-8028-a9c203cfec11" Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.026559 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" podUID="bbb8966a-e61f-427d-af2a-0fdab2348d03" Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.026594 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" podUID="a8caa87f-832f-4436-beaa-aaa505de3bac" Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.026629 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" podUID="d5136d34-82a8-47c5-9d7d-09e0206587e8" Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.026662 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" podUID="ea5acc0f-2ad8-46d5-80a2-502e2900fdd6" Mar 08 19:49:53 crc kubenswrapper[4885]: I0308 19:49:53.722286 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" 
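Note on the retry pattern in the entries above and below: the repeated MountVolume.SetUp failures for the "webhook-certs", "metrics-certs" and operator "cert" volumes occur because the referenced Secrets (webhook-server-cert, metrics-server-cert, infra-operator-webhook-server-cert and openstack-baremetal-operator-webhook-server-cert in the openstack-operators namespace) do not exist yet, so the kubelet simply re-queues each mount with a doubling delay; the durationBeforeRetry values grow 1s, 2s, 4s, 8s and then 16s as the log continues. The Go sketch below is a simplified model of that doubling-with-cap behaviour, not the kubelet's actual nestedpendingoperations code, and the two-minute cap is an assumption chosen for illustration only.

    package main

    import (
        "fmt"
        "time"
    )

    // nextDelay doubles the previous retry delay up to maxDelay, reproducing
    // the 1s, 2s, 4s, 8s, 16s durationBeforeRetry sequence seen in this log.
    // Illustrative sketch only; not the kubelet implementation.
    func nextDelay(current, maxDelay time.Duration) time.Duration {
        if current == 0 {
            return time.Second // first retry after 1s
        }
        next := 2 * current
        if next > maxDelay {
            return maxDelay
        }
        return next
    }

    func main() {
        d := time.Duration(0)
        for attempt := 1; attempt <= 6; attempt++ {
            d = nextDelay(d, 2*time.Minute)
            fmt.Printf("attempt %d: retry in %s\n", attempt, d)
        }
        // Prints delays of 1s, 2s, 4s, 8s, 16s, 32s ...
    }

The interleaved ErrImagePull "pull QPS exceeded" entries are a separate, self-resolving condition (the kubelet's registry image-pull rate limit); those pods fall back to ImagePullBackOff and are retried. The mount failures clear as soon as the missing Secrets are created, which the later "MountVolume.SetUp succeeded" entries for the cert, webhook-certs and metrics-certs volumes confirm.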
Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.722507 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.722726 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert podName:9fc40f07-4706-4008-b86e-e73a2f2ab620 nodeName:}" failed. No retries permitted until 2026-03-08 19:49:57.722709263 +0000 UTC m=+1099.118763286 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert") pod "infra-operator-controller-manager-f7fcc58b9-vf24d" (UID: "9fc40f07-4706-4008-b86e-e73a2f2ab620") : secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:53 crc kubenswrapper[4885]: I0308 19:49:53.931590 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.932041 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:53 crc kubenswrapper[4885]: E0308 19:49:53.932126 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert podName:d8de7df0-2dea-4d3c-a02e-57bfabade82f nodeName:}" failed. No retries permitted until 2026-03-08 19:49:57.932086355 +0000 UTC m=+1099.328140378 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" (UID: "d8de7df0-2dea-4d3c-a02e-57bfabade82f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:54 crc kubenswrapper[4885]: I0308 19:49:54.337983 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:54 crc kubenswrapper[4885]: I0308 19:49:54.338067 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:54 crc kubenswrapper[4885]: E0308 19:49:54.338156 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 19:49:54 crc kubenswrapper[4885]: E0308 19:49:54.338228 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:49:58.338209305 +0000 UTC m=+1099.734263328 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "metrics-server-cert" not found Mar 08 19:49:54 crc kubenswrapper[4885]: E0308 19:49:54.338222 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 19:49:54 crc kubenswrapper[4885]: E0308 19:49:54.338299 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:49:58.338281317 +0000 UTC m=+1099.734335340 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "webhook-server-cert" not found Mar 08 19:49:57 crc kubenswrapper[4885]: I0308 19:49:57.799270 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:49:57 crc kubenswrapper[4885]: E0308 19:49:57.800116 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:57 crc kubenswrapper[4885]: E0308 19:49:57.800381 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert podName:9fc40f07-4706-4008-b86e-e73a2f2ab620 nodeName:}" failed. No retries permitted until 2026-03-08 19:50:05.800355706 +0000 UTC m=+1107.196409729 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert") pod "infra-operator-controller-manager-f7fcc58b9-vf24d" (UID: "9fc40f07-4706-4008-b86e-e73a2f2ab620") : secret "infra-operator-webhook-server-cert" not found Mar 08 19:49:58 crc kubenswrapper[4885]: I0308 19:49:58.003214 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:49:58 crc kubenswrapper[4885]: E0308 19:49:58.003356 4885 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:58 crc kubenswrapper[4885]: E0308 19:49:58.003588 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert podName:d8de7df0-2dea-4d3c-a02e-57bfabade82f nodeName:}" failed. No retries permitted until 2026-03-08 19:50:06.003574205 +0000 UTC m=+1107.399628228 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" (UID: "d8de7df0-2dea-4d3c-a02e-57bfabade82f") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 08 19:49:58 crc kubenswrapper[4885]: I0308 19:49:58.411448 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:58 crc kubenswrapper[4885]: E0308 19:49:58.411958 4885 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 08 19:49:58 crc kubenswrapper[4885]: E0308 19:49:58.412046 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:50:06.412027227 +0000 UTC m=+1107.808081250 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "webhook-server-cert" not found Mar 08 19:49:58 crc kubenswrapper[4885]: I0308 19:49:58.412971 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:49:58 crc kubenswrapper[4885]: E0308 19:49:58.413078 4885 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 08 19:49:58 crc kubenswrapper[4885]: E0308 19:49:58.413117 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs podName:deedb14e-007e-44eb-bd52-85bbc12d0bec nodeName:}" failed. No retries permitted until 2026-03-08 19:50:06.413107616 +0000 UTC m=+1107.809161639 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-pzg95" (UID: "deedb14e-007e-44eb-bd52-85bbc12d0bec") : secret "metrics-server-cert" not found Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.146776 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549990-8n4qs"] Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.148772 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549990-8n4qs" Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.157856 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549990-8n4qs"] Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.184697 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.184954 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.185429 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.341954 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqrg7\" (UniqueName: \"kubernetes.io/projected/d8bff80c-e537-4de5-8a05-85ee81004c30-kube-api-access-mqrg7\") pod \"auto-csr-approver-29549990-8n4qs\" (UID: \"d8bff80c-e537-4de5-8a05-85ee81004c30\") " pod="openshift-infra/auto-csr-approver-29549990-8n4qs" Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.444022 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqrg7\" (UniqueName: \"kubernetes.io/projected/d8bff80c-e537-4de5-8a05-85ee81004c30-kube-api-access-mqrg7\") pod \"auto-csr-approver-29549990-8n4qs\" (UID: \"d8bff80c-e537-4de5-8a05-85ee81004c30\") " pod="openshift-infra/auto-csr-approver-29549990-8n4qs" Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.486636 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqrg7\" (UniqueName: \"kubernetes.io/projected/d8bff80c-e537-4de5-8a05-85ee81004c30-kube-api-access-mqrg7\") pod \"auto-csr-approver-29549990-8n4qs\" (UID: \"d8bff80c-e537-4de5-8a05-85ee81004c30\") " pod="openshift-infra/auto-csr-approver-29549990-8n4qs" Mar 08 19:50:00 crc kubenswrapper[4885]: I0308 19:50:00.512753 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549990-8n4qs" Mar 08 19:50:02 crc kubenswrapper[4885]: I0308 19:50:02.818015 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:50:02 crc kubenswrapper[4885]: I0308 19:50:02.818424 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:50:04 crc kubenswrapper[4885]: E0308 19:50:04.770329 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:b242403a27609ac87a0ed3a7dd788aceaf8f3da3620981cf5e000d56862d77a4" Mar 08 19:50:04 crc kubenswrapper[4885]: E0308 19:50:04.770688 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:b242403a27609ac87a0ed3a7dd788aceaf8f3da3620981cf5e000d56862d77a4,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l76dz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
neutron-operator-controller-manager-54688575f-p8r6f_openstack-operators(392750e0-9d71-418d-89b0-ec10f33ec505): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:50:04 crc kubenswrapper[4885]: E0308 19:50:04.772488 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" podUID="392750e0-9d71-418d-89b0-ec10f33ec505" Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.134283 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:b242403a27609ac87a0ed3a7dd788aceaf8f3da3620981cf5e000d56862d77a4\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" podUID="392750e0-9d71-418d-89b0-ec10f33ec505" Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.289813 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84" Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.290002 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ctx6x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-74b6b5dc96-7vtx7_openstack-operators(7c05f3ed-fe8f-47db-b596-8b90b96c295c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.292057 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" podUID="7c05f3ed-fe8f-47db-b596-8b90b96c295c" Mar 08 19:50:05 crc kubenswrapper[4885]: I0308 19:50:05.834682 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.834866 4885 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.834909 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert podName:9fc40f07-4706-4008-b86e-e73a2f2ab620 nodeName:}" failed. No retries permitted until 2026-03-08 19:50:21.834896361 +0000 UTC m=+1123.230950384 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert") pod "infra-operator-controller-manager-f7fcc58b9-vf24d" (UID: "9fc40f07-4706-4008-b86e-e73a2f2ab620") : secret "infra-operator-webhook-server-cert" not found Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.857605 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c" Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.858295 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6pl2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7c789f89c6-hlpjf_openstack-operators(157555d5-ca64-49f8-8849-cd763c83feda): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:50:05 crc kubenswrapper[4885]: E0308 19:50:05.859673 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" 
podUID="157555d5-ca64-49f8-8849-cd763c83feda" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.037215 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.047091 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d8de7df0-2dea-4d3c-a02e-57bfabade82f-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7\" (UID: \"d8de7df0-2dea-4d3c-a02e-57bfabade82f\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.079717 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.141735 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" event={"ID":"69dc5eb7-1c2e-4fbb-a220-2129df60ffb3","Type":"ContainerStarted","Data":"4250c26dd3c57856cdfdfa6f65fcfcfa2196a6c66abbbfef44e518244929cd2b"} Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.142143 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" Mar 08 19:50:06 crc kubenswrapper[4885]: E0308 19:50:06.143748 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" podUID="157555d5-ca64-49f8-8849-cd763c83feda" Mar 08 19:50:06 crc kubenswrapper[4885]: E0308 19:50:06.147247 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:172f24bd4603ac3498536a8a2c8fffb07cf9113dd52bc132778ea0aa275c6b84\\\"\"" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" podUID="7c05f3ed-fe8f-47db-b596-8b90b96c295c" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.168663 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" podStartSLOduration=2.453742179 podStartE2EDuration="17.168634024s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.114662734 +0000 UTC m=+1092.510716757" lastFinishedPulling="2026-03-08 19:50:05.829554579 +0000 UTC m=+1107.225608602" observedRunningTime="2026-03-08 19:50:06.159739257 +0000 UTC m=+1107.555793280" watchObservedRunningTime="2026-03-08 19:50:06.168634024 +0000 UTC m=+1107.564688047" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.232486 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549990-8n4qs"] Mar 08 19:50:06 crc kubenswrapper[4885]: W0308 19:50:06.285145 
4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8bff80c_e537_4de5_8a05_85ee81004c30.slice/crio-98430ed7c05544a1065f6eb94c43d341d1135559ca82def3f00aac4b1fc8f128 WatchSource:0}: Error finding container 98430ed7c05544a1065f6eb94c43d341d1135559ca82def3f00aac4b1fc8f128: Status 404 returned error can't find the container with id 98430ed7c05544a1065f6eb94c43d341d1135559ca82def3f00aac4b1fc8f128 Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.442008 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.442063 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.446301 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.446346 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/deedb14e-007e-44eb-bd52-85bbc12d0bec-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-pzg95\" (UID: \"deedb14e-007e-44eb-bd52-85bbc12d0bec\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.463611 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.636355 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7"] Mar 08 19:50:06 crc kubenswrapper[4885]: I0308 19:50:06.922203 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95"] Mar 08 19:50:06 crc kubenswrapper[4885]: W0308 19:50:06.952332 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddeedb14e_007e_44eb_bd52_85bbc12d0bec.slice/crio-774b15b33cb8d3fd049dc8dda6d7a0394047309bcc920150a475de194f68ca35 WatchSource:0}: Error finding container 774b15b33cb8d3fd049dc8dda6d7a0394047309bcc920150a475de194f68ca35: Status 404 returned error can't find the container with id 774b15b33cb8d3fd049dc8dda6d7a0394047309bcc920150a475de194f68ca35 Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.152170 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" event={"ID":"45c29030-0945-4655-b035-d75e8bf0f818","Type":"ContainerStarted","Data":"ca4a046cee7106430c39b1f13340ce02a94034306d8aab1be4b4f4319a59ec0e"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.152321 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.157315 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" event={"ID":"d9580392-741e-406b-b72d-91aa945f65c2","Type":"ContainerStarted","Data":"2412e484cdbe2a57c60390700ee7e9fdcc59955083ced3ab2b57c998c4dfd2c9"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.157439 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.182908 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" event={"ID":"92716f38-db4c-41d9-962d-f3cc2669a7fb","Type":"ContainerStarted","Data":"91c281577b94a62718ec146ee4d85edcbe9f7a1cffc7e836d845780ef133c30f"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.183548 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.196672 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" event={"ID":"8f363429-f2b7-468c-b74b-ef14ebfab90e","Type":"ContainerStarted","Data":"301a2cad9852f68f554a8b1c758196640d8e623c6c0f3a301a797432b2d3eaeb"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.196960 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.203154 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" 
event={"ID":"5f89ecdd-60c3-4da6-b185-1f044d8ffc46","Type":"ContainerStarted","Data":"0905f998224063cca144cd43e3cb12e75cd9da94647b0cf7fa1f916dbdf97ad2"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.203423 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.208159 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" event={"ID":"c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b","Type":"ContainerStarted","Data":"20b30d97bd2bf4f62cc70d27b97d644abd4014bc18bf06f94a1c01d79b05cd86"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.208540 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.210212 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" event={"ID":"deedb14e-007e-44eb-bd52-85bbc12d0bec","Type":"ContainerStarted","Data":"774b15b33cb8d3fd049dc8dda6d7a0394047309bcc920150a475de194f68ca35"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.214053 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" podStartSLOduration=3.265030453 podStartE2EDuration="18.214036779s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:50.882614458 +0000 UTC m=+1092.278668481" lastFinishedPulling="2026-03-08 19:50:05.831620794 +0000 UTC m=+1107.227674807" observedRunningTime="2026-03-08 19:50:07.183790274 +0000 UTC m=+1108.579844297" watchObservedRunningTime="2026-03-08 19:50:07.214036779 +0000 UTC m=+1108.610090802" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.217310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549990-8n4qs" event={"ID":"d8bff80c-e537-4de5-8a05-85ee81004c30","Type":"ContainerStarted","Data":"98430ed7c05544a1065f6eb94c43d341d1135559ca82def3f00aac4b1fc8f128"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.218420 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" podStartSLOduration=2.994140502 podStartE2EDuration="18.218410546s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:50.607885625 +0000 UTC m=+1092.003939648" lastFinishedPulling="2026-03-08 19:50:05.832155669 +0000 UTC m=+1107.228209692" observedRunningTime="2026-03-08 19:50:07.21743229 +0000 UTC m=+1108.613486303" watchObservedRunningTime="2026-03-08 19:50:07.218410546 +0000 UTC m=+1108.614464569" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.228149 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" event={"ID":"d8de7df0-2dea-4d3c-a02e-57bfabade82f","Type":"ContainerStarted","Data":"84939956a7b32f2fe9fcbefc882d3c0bb4a64aea0fc0983b6dba16506bb63f3e"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.231395 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" podStartSLOduration=2.770019285 
podStartE2EDuration="17.231376691s" podCreationTimestamp="2026-03-08 19:49:50 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.370036932 +0000 UTC m=+1092.766090955" lastFinishedPulling="2026-03-08 19:50:05.831394298 +0000 UTC m=+1107.227448361" observedRunningTime="2026-03-08 19:50:07.231118914 +0000 UTC m=+1108.627172937" watchObservedRunningTime="2026-03-08 19:50:07.231376691 +0000 UTC m=+1108.627430714" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.234364 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" event={"ID":"27aa3877-54cd-414d-80a0-ab20a68ed535","Type":"ContainerStarted","Data":"3e8829a44679263cce90c1b420fd6e82935b6f4a3feb0a8d1b56c38352b9192f"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.235312 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.238365 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" event={"ID":"4742ab81-6c6d-43c8-8025-6a656b8c40dc","Type":"ContainerStarted","Data":"c8ee5357d25f841988d89f06bddc6c262675e7c53a4eab8ed89bf0fad9d9f489"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.238565 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.241271 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" event={"ID":"7180efa7-8d93-436e-8de2-78fe5c173843","Type":"ContainerStarted","Data":"4c0ae4b4ed1923b8ba336f1eeedd2c1bed8e648c010a94653c97bf8bde677286"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.241299 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.252354 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" podStartSLOduration=3.115566675 podStartE2EDuration="18.252338899s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:50.695345434 +0000 UTC m=+1092.091399457" lastFinishedPulling="2026-03-08 19:50:05.832117658 +0000 UTC m=+1107.228171681" observedRunningTime="2026-03-08 19:50:07.251418624 +0000 UTC m=+1108.647472647" watchObservedRunningTime="2026-03-08 19:50:07.252338899 +0000 UTC m=+1108.648392922" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.253709 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" event={"ID":"d5770638-6059-4ce5-b401-84b0155589a3","Type":"ContainerStarted","Data":"eed205ae70bf3237ab7511b8c025102e57d642a73384a727791f61d42998ce1b"} Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.254030 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.273472 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" podStartSLOduration=3.692422999 
podStartE2EDuration="18.273453771s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.250676914 +0000 UTC m=+1092.646730937" lastFinishedPulling="2026-03-08 19:50:05.831707686 +0000 UTC m=+1107.227761709" observedRunningTime="2026-03-08 19:50:07.273163494 +0000 UTC m=+1108.669217517" watchObservedRunningTime="2026-03-08 19:50:07.273453771 +0000 UTC m=+1108.669507794" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.309555 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" podStartSLOduration=2.893514292 podStartE2EDuration="17.309539242s" podCreationTimestamp="2026-03-08 19:49:50 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.420267978 +0000 UTC m=+1092.816322001" lastFinishedPulling="2026-03-08 19:50:05.836292928 +0000 UTC m=+1107.232346951" observedRunningTime="2026-03-08 19:50:07.306208362 +0000 UTC m=+1108.702262385" watchObservedRunningTime="2026-03-08 19:50:07.309539242 +0000 UTC m=+1108.705593265" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.328234 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" podStartSLOduration=3.748882433 podStartE2EDuration="18.328217389s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.250185542 +0000 UTC m=+1092.646239565" lastFinishedPulling="2026-03-08 19:50:05.829520498 +0000 UTC m=+1107.225574521" observedRunningTime="2026-03-08 19:50:07.325876176 +0000 UTC m=+1108.721930199" watchObservedRunningTime="2026-03-08 19:50:07.328217389 +0000 UTC m=+1108.724271412" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.340045 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" podStartSLOduration=3.617150966 podStartE2EDuration="18.340030123s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.108358727 +0000 UTC m=+1092.504412740" lastFinishedPulling="2026-03-08 19:50:05.831237874 +0000 UTC m=+1107.227291897" observedRunningTime="2026-03-08 19:50:07.338458301 +0000 UTC m=+1108.734512324" watchObservedRunningTime="2026-03-08 19:50:07.340030123 +0000 UTC m=+1108.736084146" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.353567 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" podStartSLOduration=3.622902409 podStartE2EDuration="18.353549873s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.101819173 +0000 UTC m=+1092.497873196" lastFinishedPulling="2026-03-08 19:50:05.832466637 +0000 UTC m=+1107.228520660" observedRunningTime="2026-03-08 19:50:07.35005044 +0000 UTC m=+1108.746104463" watchObservedRunningTime="2026-03-08 19:50:07.353549873 +0000 UTC m=+1108.749603886" Mar 08 19:50:07 crc kubenswrapper[4885]: I0308 19:50:07.391058 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" podStartSLOduration=3.673538278 podStartE2EDuration="18.391037771s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.114144211 +0000 UTC m=+1092.510198234" lastFinishedPulling="2026-03-08 19:50:05.831643704 +0000 UTC m=+1107.227697727" 
observedRunningTime="2026-03-08 19:50:07.37034501 +0000 UTC m=+1108.766399033" watchObservedRunningTime="2026-03-08 19:50:07.391037771 +0000 UTC m=+1108.787091794" Mar 08 19:50:11 crc kubenswrapper[4885]: I0308 19:50:11.288773 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" event={"ID":"deedb14e-007e-44eb-bd52-85bbc12d0bec","Type":"ContainerStarted","Data":"1de6408aa86d1456ffb2743f6af0c1c9552a4390f1967d3b4009755f799b5445"} Mar 08 19:50:11 crc kubenswrapper[4885]: I0308 19:50:11.289179 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:50:11 crc kubenswrapper[4885]: I0308 19:50:11.329093 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" podStartSLOduration=21.329070968 podStartE2EDuration="21.329070968s" podCreationTimestamp="2026-03-08 19:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:50:11.327210719 +0000 UTC m=+1112.723264772" watchObservedRunningTime="2026-03-08 19:50:11.329070968 +0000 UTC m=+1112.725125021" Mar 08 19:50:12 crc kubenswrapper[4885]: I0308 19:50:12.298732 4885 generic.go:334] "Generic (PLEG): container finished" podID="d8bff80c-e537-4de5-8a05-85ee81004c30" containerID="626657923ce6ed6491828eb9e3d29e03cb9ceee45223fbdf56fc2006030e8b1d" exitCode=0 Mar 08 19:50:12 crc kubenswrapper[4885]: I0308 19:50:12.298788 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549990-8n4qs" event={"ID":"d8bff80c-e537-4de5-8a05-85ee81004c30","Type":"ContainerDied","Data":"626657923ce6ed6491828eb9e3d29e03cb9ceee45223fbdf56fc2006030e8b1d"} Mar 08 19:50:14 crc kubenswrapper[4885]: I0308 19:50:14.200444 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549990-8n4qs" Mar 08 19:50:14 crc kubenswrapper[4885]: I0308 19:50:14.277024 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqrg7\" (UniqueName: \"kubernetes.io/projected/d8bff80c-e537-4de5-8a05-85ee81004c30-kube-api-access-mqrg7\") pod \"d8bff80c-e537-4de5-8a05-85ee81004c30\" (UID: \"d8bff80c-e537-4de5-8a05-85ee81004c30\") " Mar 08 19:50:14 crc kubenswrapper[4885]: I0308 19:50:14.292810 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8bff80c-e537-4de5-8a05-85ee81004c30-kube-api-access-mqrg7" (OuterVolumeSpecName: "kube-api-access-mqrg7") pod "d8bff80c-e537-4de5-8a05-85ee81004c30" (UID: "d8bff80c-e537-4de5-8a05-85ee81004c30"). InnerVolumeSpecName "kube-api-access-mqrg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:50:14 crc kubenswrapper[4885]: I0308 19:50:14.317263 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549990-8n4qs" event={"ID":"d8bff80c-e537-4de5-8a05-85ee81004c30","Type":"ContainerDied","Data":"98430ed7c05544a1065f6eb94c43d341d1135559ca82def3f00aac4b1fc8f128"} Mar 08 19:50:14 crc kubenswrapper[4885]: I0308 19:50:14.317299 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98430ed7c05544a1065f6eb94c43d341d1135559ca82def3f00aac4b1fc8f128" Mar 08 19:50:14 crc kubenswrapper[4885]: I0308 19:50:14.317312 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549990-8n4qs" Mar 08 19:50:14 crc kubenswrapper[4885]: I0308 19:50:14.378407 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqrg7\" (UniqueName: \"kubernetes.io/projected/d8bff80c-e537-4de5-8a05-85ee81004c30-kube-api-access-mqrg7\") on node \"crc\" DevicePath \"\"" Mar 08 19:50:15 crc kubenswrapper[4885]: I0308 19:50:15.257143 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549984-4fjvc"] Mar 08 19:50:15 crc kubenswrapper[4885]: I0308 19:50:15.263505 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549984-4fjvc"] Mar 08 19:50:15 crc kubenswrapper[4885]: I0308 19:50:15.380471 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="383ac947-e1b1-4f15-98a6-69fcc60e0ac1" path="/var/lib/kubelet/pods/383ac947-e1b1-4f15-98a6-69fcc60e0ac1/volumes" Mar 08 19:50:16 crc kubenswrapper[4885]: I0308 19:50:16.472581 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-pzg95" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.344197 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" event={"ID":"8d086566-6154-4ddd-8028-a9c203cfec11","Type":"ContainerStarted","Data":"f591fa6f264923752890fa4ae44758ccfaf811d915bddeca71453b99c70a74b1"} Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.344368 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.345764 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" event={"ID":"d5136d34-82a8-47c5-9d7d-09e0206587e8","Type":"ContainerStarted","Data":"66f62958fa94c95d1652a06955146a918f1d708b09f0e339559030c8efdaa0d2"} Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.345982 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.347216 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" event={"ID":"d8de7df0-2dea-4d3c-a02e-57bfabade82f","Type":"ContainerStarted","Data":"ca291ac12d55f85066620edd4ef44f4ffcf918f74c5aec4f42789ab398b6b7ae"} Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.347360 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.348661 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" event={"ID":"bbb8966a-e61f-427d-af2a-0fdab2348d03","Type":"ContainerStarted","Data":"e4e20c6f22b6ee9b2d15293ce56766064b5dff883644ad78be143abdc41d3f8a"} Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.348850 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.350064 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" event={"ID":"ea5acc0f-2ad8-46d5-80a2-502e2900fdd6","Type":"ContainerStarted","Data":"c747dd03cf5be763fba30f34527af7a9aac5b8d21b4c9a2eea79a607c54c209d"} Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.350248 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.351710 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" event={"ID":"44fbac8d-d81f-4c03-9555-ef33551d478d","Type":"ContainerStarted","Data":"ff26c7b17bbe62f33310071f4c7349db2b732469daac3772bd36334e8b626c7c"} Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.351867 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.353084 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" event={"ID":"a8caa87f-832f-4436-beaa-aaa505de3bac","Type":"ContainerStarted","Data":"670157d30b97301a66a6648804293834912a5b9ecaa797a3887928d8d7c13573"} Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.364874 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" podStartSLOduration=3.619793026 podStartE2EDuration="28.364854692s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.42819655 +0000 UTC m=+1092.824250573" lastFinishedPulling="2026-03-08 19:50:16.173258216 +0000 UTC m=+1117.569312239" observedRunningTime="2026-03-08 19:50:17.364251847 +0000 UTC m=+1118.760305880" watchObservedRunningTime="2026-03-08 19:50:17.364854692 +0000 UTC m=+1118.760908725" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.398878 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-pd9b2" podStartSLOduration=2.80175534 podStartE2EDuration="27.398856418s" podCreationTimestamp="2026-03-08 19:49:50 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.522723436 +0000 UTC m=+1092.918777459" lastFinishedPulling="2026-03-08 19:50:16.119824514 +0000 UTC m=+1117.515878537" observedRunningTime="2026-03-08 19:50:17.386324274 +0000 UTC m=+1118.782378327" watchObservedRunningTime="2026-03-08 19:50:17.398856418 +0000 UTC m=+1118.794910441" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.428268 4885 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" podStartSLOduration=19.510534828 podStartE2EDuration="28.42824979s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:50:06.692051736 +0000 UTC m=+1108.088105759" lastFinishedPulling="2026-03-08 19:50:15.609766698 +0000 UTC m=+1117.005820721" observedRunningTime="2026-03-08 19:50:17.421618774 +0000 UTC m=+1118.817672797" watchObservedRunningTime="2026-03-08 19:50:17.42824979 +0000 UTC m=+1118.824303833" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.445349 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" podStartSLOduration=3.759896765 podStartE2EDuration="28.445329175s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.43122995 +0000 UTC m=+1092.827283973" lastFinishedPulling="2026-03-08 19:50:16.11666235 +0000 UTC m=+1117.512716383" observedRunningTime="2026-03-08 19:50:17.440380663 +0000 UTC m=+1118.836434686" watchObservedRunningTime="2026-03-08 19:50:17.445329175 +0000 UTC m=+1118.841383198" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.452766 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" podStartSLOduration=2.853608609 podStartE2EDuration="27.452751892s" podCreationTimestamp="2026-03-08 19:49:50 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.528632953 +0000 UTC m=+1092.924686976" lastFinishedPulling="2026-03-08 19:50:16.127776236 +0000 UTC m=+1117.523830259" observedRunningTime="2026-03-08 19:50:17.451651223 +0000 UTC m=+1118.847705246" watchObservedRunningTime="2026-03-08 19:50:17.452751892 +0000 UTC m=+1118.848805915" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.476806 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" podStartSLOduration=2.7148285960000003 podStartE2EDuration="27.476792582s" podCreationTimestamp="2026-03-08 19:49:50 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.388723269 +0000 UTC m=+1092.784777292" lastFinishedPulling="2026-03-08 19:50:16.150687245 +0000 UTC m=+1117.546741278" observedRunningTime="2026-03-08 19:50:17.471752738 +0000 UTC m=+1118.867806751" watchObservedRunningTime="2026-03-08 19:50:17.476792582 +0000 UTC m=+1118.872846605" Mar 08 19:50:17 crc kubenswrapper[4885]: I0308 19:50:17.488598 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" podStartSLOduration=3.754988474 podStartE2EDuration="28.488582516s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.398267693 +0000 UTC m=+1092.794321706" lastFinishedPulling="2026-03-08 19:50:16.131861715 +0000 UTC m=+1117.527915748" observedRunningTime="2026-03-08 19:50:17.487378594 +0000 UTC m=+1118.883432617" watchObservedRunningTime="2026-03-08 19:50:17.488582516 +0000 UTC m=+1118.884636539" Mar 08 19:50:18 crc kubenswrapper[4885]: I0308 19:50:18.363814 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" event={"ID":"392750e0-9d71-418d-89b0-ec10f33ec505","Type":"ContainerStarted","Data":"17bfd1b3d6efaebabb3eb82c3390d6a76923e13ad648aa637d5eb220eb4f8ad5"} Mar 08 19:50:18 crc 
kubenswrapper[4885]: I0308 19:50:18.364239 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" Mar 08 19:50:18 crc kubenswrapper[4885]: I0308 19:50:18.365605 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" event={"ID":"157555d5-ca64-49f8-8849-cd763c83feda","Type":"ContainerStarted","Data":"3d054c295deed751d93f3a8878dc5591a1d6cc8c70fe4d9f2643ee5936e02417"} Mar 08 19:50:18 crc kubenswrapper[4885]: I0308 19:50:18.411705 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" podStartSLOduration=2.793261297 podStartE2EDuration="29.411677496s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.209971951 +0000 UTC m=+1092.606025974" lastFinishedPulling="2026-03-08 19:50:17.82838815 +0000 UTC m=+1119.224442173" observedRunningTime="2026-03-08 19:50:18.407133945 +0000 UTC m=+1119.803187968" watchObservedRunningTime="2026-03-08 19:50:18.411677496 +0000 UTC m=+1119.807731539" Mar 08 19:50:18 crc kubenswrapper[4885]: I0308 19:50:18.435674 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" podStartSLOduration=2.708377757 podStartE2EDuration="29.435655974s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.050946969 +0000 UTC m=+1092.447000992" lastFinishedPulling="2026-03-08 19:50:17.778225186 +0000 UTC m=+1119.174279209" observedRunningTime="2026-03-08 19:50:18.427809015 +0000 UTC m=+1119.823863038" watchObservedRunningTime="2026-03-08 19:50:18.435655974 +0000 UTC m=+1119.831710017" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.053250 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-rplg5" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.079023 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-f9jr4" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.094033 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-sbrjr" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.131482 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-4hstb" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.190272 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-xplpw" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.206436 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-n88vz" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.251959 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-nclkr" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.267942 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.343583 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-q5hfb" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.358884 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-2hsgc" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.535916 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-7hgld" Mar 08 19:50:20 crc kubenswrapper[4885]: I0308 19:50:20.605290 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-7mghs" Mar 08 19:50:21 crc kubenswrapper[4885]: I0308 19:50:21.389648 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" event={"ID":"7c05f3ed-fe8f-47db-b596-8b90b96c295c","Type":"ContainerStarted","Data":"5c6ba672b42f96c4f3d5a647278db02dfc5a68988906e88606385ec526358e27"} Mar 08 19:50:21 crc kubenswrapper[4885]: I0308 19:50:21.390764 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" Mar 08 19:50:21 crc kubenswrapper[4885]: I0308 19:50:21.423819 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" podStartSLOduration=2.831358272 podStartE2EDuration="32.423790219s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:49:51.210916427 +0000 UTC m=+1092.606970460" lastFinishedPulling="2026-03-08 19:50:20.803348374 +0000 UTC m=+1122.199402407" observedRunningTime="2026-03-08 19:50:21.417926593 +0000 UTC m=+1122.813980646" watchObservedRunningTime="2026-03-08 19:50:21.423790219 +0000 UTC m=+1122.819844282" Mar 08 19:50:21 crc kubenswrapper[4885]: I0308 19:50:21.896197 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:50:21 crc kubenswrapper[4885]: I0308 19:50:21.910288 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fc40f07-4706-4008-b86e-e73a2f2ab620-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-vf24d\" (UID: \"9fc40f07-4706-4008-b86e-e73a2f2ab620\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:50:22 crc kubenswrapper[4885]: I0308 19:50:22.025502 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:50:22 crc kubenswrapper[4885]: I0308 19:50:22.323740 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d"] Mar 08 19:50:22 crc kubenswrapper[4885]: W0308 19:50:22.328446 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fc40f07_4706_4008_b86e_e73a2f2ab620.slice/crio-0cffcd70b6546c7dd43e5f1462d417d3ef5559069b1d2f0f03594331922d891d WatchSource:0}: Error finding container 0cffcd70b6546c7dd43e5f1462d417d3ef5559069b1d2f0f03594331922d891d: Status 404 returned error can't find the container with id 0cffcd70b6546c7dd43e5f1462d417d3ef5559069b1d2f0f03594331922d891d Mar 08 19:50:22 crc kubenswrapper[4885]: I0308 19:50:22.398563 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" event={"ID":"9fc40f07-4706-4008-b86e-e73a2f2ab620","Type":"ContainerStarted","Data":"0cffcd70b6546c7dd43e5f1462d417d3ef5559069b1d2f0f03594331922d891d"} Mar 08 19:50:25 crc kubenswrapper[4885]: I0308 19:50:25.425903 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" event={"ID":"9fc40f07-4706-4008-b86e-e73a2f2ab620","Type":"ContainerStarted","Data":"0951ee5f7648c1b7f86e13f2f59910707cb8c3155eca01ccbb648fa4dec6b6c0"} Mar 08 19:50:25 crc kubenswrapper[4885]: I0308 19:50:25.426629 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:50:25 crc kubenswrapper[4885]: I0308 19:50:25.452725 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" podStartSLOduration=33.774405279 podStartE2EDuration="36.452704836s" podCreationTimestamp="2026-03-08 19:49:49 +0000 UTC" firstStartedPulling="2026-03-08 19:50:22.331459139 +0000 UTC m=+1123.727513172" lastFinishedPulling="2026-03-08 19:50:25.009758706 +0000 UTC m=+1126.405812729" observedRunningTime="2026-03-08 19:50:25.450713183 +0000 UTC m=+1126.846767236" watchObservedRunningTime="2026-03-08 19:50:25.452704836 +0000 UTC m=+1126.848758859" Mar 08 19:50:26 crc kubenswrapper[4885]: I0308 19:50:26.089088 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.270695 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.443969 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-p8r6f" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.451556 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-7vtx7" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.475996 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-k4r6w" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.520747 4885 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-wdrfh" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.525261 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-4gfw2" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.679162 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-xf4hm" Mar 08 19:50:30 crc kubenswrapper[4885]: I0308 19:50:30.781001 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" Mar 08 19:50:32 crc kubenswrapper[4885]: I0308 19:50:32.035340 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-vf24d" Mar 08 19:50:32 crc kubenswrapper[4885]: I0308 19:50:32.818443 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:50:32 crc kubenswrapper[4885]: I0308 19:50:32.818826 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:50:32 crc kubenswrapper[4885]: I0308 19:50:32.818918 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:50:32 crc kubenswrapper[4885]: I0308 19:50:32.819785 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e6dd4ce3180e7f84da70c69f276b3e39a0d5b0c2aeeabe5c8a51dafdbeafb374"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 19:50:32 crc kubenswrapper[4885]: I0308 19:50:32.819873 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://e6dd4ce3180e7f84da70c69f276b3e39a0d5b0c2aeeabe5c8a51dafdbeafb374" gracePeriod=600 Mar 08 19:50:33 crc kubenswrapper[4885]: I0308 19:50:33.530565 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="e6dd4ce3180e7f84da70c69f276b3e39a0d5b0c2aeeabe5c8a51dafdbeafb374" exitCode=0 Mar 08 19:50:33 crc kubenswrapper[4885]: I0308 19:50:33.530651 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"e6dd4ce3180e7f84da70c69f276b3e39a0d5b0c2aeeabe5c8a51dafdbeafb374"} Mar 08 19:50:33 crc kubenswrapper[4885]: I0308 19:50:33.531905 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"c24a30299a18630f198121b61248ad8d1e3d9e8acd806e23d5c1d953fe5cfa83"} Mar 08 19:50:33 crc kubenswrapper[4885]: I0308 19:50:33.532088 4885 scope.go:117] "RemoveContainer" containerID="f94b502e469fe218787b8101e45951a2dfe1f5fc0bc5b2cb2e8b55561aeaabb2" Mar 08 19:50:46 crc kubenswrapper[4885]: I0308 19:50:46.063504 4885 scope.go:117] "RemoveContainer" containerID="b7734316fc145363037328cab9f126d7c1da55c60bd2f7c56f716841be40429c" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.394734 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-w67fx"] Mar 08 19:50:49 crc kubenswrapper[4885]: E0308 19:50:49.395894 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8bff80c-e537-4de5-8a05-85ee81004c30" containerName="oc" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.395911 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8bff80c-e537-4de5-8a05-85ee81004c30" containerName="oc" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.396110 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8bff80c-e537-4de5-8a05-85ee81004c30" containerName="oc" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.397102 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.400813 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.400893 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.401450 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-cjg8w" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.419102 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.433435 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-w67fx"] Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.496035 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-442qz"] Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.498606 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.500241 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.508423 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-442qz"] Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.531566 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg25w\" (UniqueName: \"kubernetes.io/projected/19de3581-aa8c-48c0-aad6-4139d132ca70-kube-api-access-xg25w\") pod \"dnsmasq-dns-589db6c89c-w67fx\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.531648 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19de3581-aa8c-48c0-aad6-4139d132ca70-config\") pod \"dnsmasq-dns-589db6c89c-w67fx\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.632535 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg25w\" (UniqueName: \"kubernetes.io/projected/19de3581-aa8c-48c0-aad6-4139d132ca70-kube-api-access-xg25w\") pod \"dnsmasq-dns-589db6c89c-w67fx\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.632673 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19de3581-aa8c-48c0-aad6-4139d132ca70-config\") pod \"dnsmasq-dns-589db6c89c-w67fx\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.632747 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-config\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.632786 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfmpl\" (UniqueName: \"kubernetes.io/projected/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-kube-api-access-jfmpl\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.632830 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.633823 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19de3581-aa8c-48c0-aad6-4139d132ca70-config\") pod \"dnsmasq-dns-589db6c89c-w67fx\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " pod="openstack/dnsmasq-dns-589db6c89c-w67fx" 
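The entries above show the kubelet's volume reconciler walking each declared volume of the newly added dnsmasq-dns pods through VerifyControllerAttachedVolume and then MountVolume.SetUp before a pod sandbox is created. As an illustrative aid only, and not part of the captured log, the following minimal client-go sketch lists the declared volumes of one of the pods named above; it assumes a reachable cluster, a kubeconfig at ~/.kube/config, and that the pod still exists, with the pod and namespace names taken verbatim from the log.

// Hypothetical helper for inspecting the volumes the kubelet is mounting above.
// Assumes ~/.kube/config points at this cluster and that the pod has not been
// replaced by a newer ReplicaSet since these log entries were recorded.
package main

import (
	"context"
	"fmt"
	"path/filepath"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
	"k8s.io/client-go/util/homedir"
)

func main() {
	kubeconfig := filepath.Join(homedir.HomeDir(), ".kube", "config")
	cfg, err := clientcmd.BuildConfigFromFlags("", kubeconfig)
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	// The dnsmasq pod whose "config" and "kube-api-access-xg25w" volumes are
	// mounted in the surrounding log entries.
	pod, err := client.CoreV1().Pods("openstack").Get(context.TODO(),
		"dnsmasq-dns-589db6c89c-w67fx", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	for _, v := range pod.Spec.Volumes {
		switch {
		case v.ConfigMap != nil:
			fmt.Printf("configmap volume %q -> %s\n", v.Name, v.ConfigMap.Name)
		case v.Projected != nil:
			fmt.Printf("projected volume %q (service account token)\n", v.Name)
		case v.Secret != nil:
			fmt.Printf("secret volume %q -> %s\n", v.Name, v.Secret.SecretName)
		default:
			fmt.Printf("other volume %q\n", v.Name)
		}
	}
}

The same mount path accounts for the earlier infra-operator failure seen in this section: because the secret "infra-operator-webhook-server-cert" was not found, MountVolume.SetUp for the "cert" volume failed and nestedpendingoperations scheduled a retry with a 16s backoff, which is why that mount only succeeds at 19:50:21.910288 once the secret is available.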
Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.651637 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg25w\" (UniqueName: \"kubernetes.io/projected/19de3581-aa8c-48c0-aad6-4139d132ca70-kube-api-access-xg25w\") pod \"dnsmasq-dns-589db6c89c-w67fx\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.734196 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfmpl\" (UniqueName: \"kubernetes.io/projected/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-kube-api-access-jfmpl\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.734247 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.734347 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-config\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.735276 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-config\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.735289 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-dns-svc\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.735989 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.752615 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfmpl\" (UniqueName: \"kubernetes.io/projected/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-kube-api-access-jfmpl\") pod \"dnsmasq-dns-86bbd886cf-442qz\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:49 crc kubenswrapper[4885]: I0308 19:50:49.818468 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:50:50 crc kubenswrapper[4885]: W0308 19:50:50.170025 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod19de3581_aa8c_48c0_aad6_4139d132ca70.slice/crio-883cc4093e6e1d1ce406c4937cdb0137ce8cbfb3722b5404c71c307ac4d03766 WatchSource:0}: Error finding container 883cc4093e6e1d1ce406c4937cdb0137ce8cbfb3722b5404c71c307ac4d03766: Status 404 returned error can't find the container with id 883cc4093e6e1d1ce406c4937cdb0137ce8cbfb3722b5404c71c307ac4d03766 Mar 08 19:50:50 crc kubenswrapper[4885]: I0308 19:50:50.171235 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-w67fx"] Mar 08 19:50:50 crc kubenswrapper[4885]: I0308 19:50:50.237814 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-442qz"] Mar 08 19:50:50 crc kubenswrapper[4885]: I0308 19:50:50.692032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-w67fx" event={"ID":"19de3581-aa8c-48c0-aad6-4139d132ca70","Type":"ContainerStarted","Data":"883cc4093e6e1d1ce406c4937cdb0137ce8cbfb3722b5404c71c307ac4d03766"} Mar 08 19:50:50 crc kubenswrapper[4885]: I0308 19:50:50.693260 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-442qz" event={"ID":"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0","Type":"ContainerStarted","Data":"b81b2d09cf554bd40279185ed607a08f534a46ffc9af81dd3b055b475415182a"} Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.473067 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-w67fx"] Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.498029 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-vwdgd"] Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.499290 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.508858 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-vwdgd"] Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.661774 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-config\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.661858 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-dns-svc\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.662005 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mznsj\" (UniqueName: \"kubernetes.io/projected/604f26ec-2884-4eb6-97f9-e2961f8907b1-kube-api-access-mznsj\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.763194 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-config\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.763556 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-dns-svc\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.763638 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mznsj\" (UniqueName: \"kubernetes.io/projected/604f26ec-2884-4eb6-97f9-e2961f8907b1-kube-api-access-mznsj\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.764453 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-config\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.764990 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-dns-svc\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.793703 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mznsj\" (UniqueName: 
\"kubernetes.io/projected/604f26ec-2884-4eb6-97f9-e2961f8907b1-kube-api-access-mznsj\") pod \"dnsmasq-dns-79f9fc56ff-vwdgd\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:51 crc kubenswrapper[4885]: I0308 19:50:51.822354 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.089601 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-vwdgd"] Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.306988 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-442qz"] Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.334628 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-86zq7"] Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.335971 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.341904 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-86zq7"] Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.472975 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.473310 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nskr\" (UniqueName: \"kubernetes.io/projected/2196a973-cc42-4633-8c99-1422d07d475a-kube-api-access-5nskr\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.473411 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-config\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.574967 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-config\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.575054 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.575081 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nskr\" (UniqueName: \"kubernetes.io/projected/2196a973-cc42-4633-8c99-1422d07d475a-kube-api-access-5nskr\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " 
pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.575959 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-config\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.576027 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-dns-svc\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.612833 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nskr\" (UniqueName: \"kubernetes.io/projected/2196a973-cc42-4633-8c99-1422d07d475a-kube-api-access-5nskr\") pod \"dnsmasq-dns-7c47bcb9f9-86zq7\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.638800 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.639841 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.646445 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.646516 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.646650 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.646727 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-8v7l4" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.646746 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.646807 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.651476 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.665343 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.666164 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.752661 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" event={"ID":"604f26ec-2884-4eb6-97f9-e2961f8907b1","Type":"ContainerStarted","Data":"abe3994c827c15ee527cc92242a1479dbff3de65c1e91c72ef3fd14c10326728"} Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781104 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781149 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781172 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjj2h\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-kube-api-access-wjj2h\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781209 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781223 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781238 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781265 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781295 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781313 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781347 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.781372 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.882801 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883081 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883104 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjj2h\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-kube-api-access-wjj2h\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883135 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883152 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883169 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883206 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883232 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883246 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883275 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883298 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.883701 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.884335 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.884452 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.884804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.885023 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.885967 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-server-conf\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.889620 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.889807 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.898177 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-pod-info\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.898716 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.905355 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjj2h\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-kube-api-access-wjj2h\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.931067 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " pod="openstack/rabbitmq-server-0" Mar 08 19:50:52 crc kubenswrapper[4885]: I0308 19:50:52.979172 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-86zq7"] Mar 08 19:50:52 crc kubenswrapper[4885]: W0308 19:50:52.996726 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2196a973_cc42_4633_8c99_1422d07d475a.slice/crio-ea36b64eee9bd69dcd99d1003ff2856f54e1225bf3a3740ad6059b5726729f56 WatchSource:0}: Error finding container ea36b64eee9bd69dcd99d1003ff2856f54e1225bf3a3740ad6059b5726729f56: Status 404 returned error can't find the container with id ea36b64eee9bd69dcd99d1003ff2856f54e1225bf3a3740ad6059b5726729f56 Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.005300 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.471598 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.472963 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.475696 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.475741 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.475762 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.475701 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7bqmj" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.475737 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.475853 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.482126 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.494654 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.504756 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 19:50:53 crc kubenswrapper[4885]: W0308 19:50:53.513036 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01dc1fd5_4e2f_4129_9452_ed50fa1d182b.slice/crio-8b6317b734453e8868ed75e3450ff9740fcef0ac699a30e38e0adad2d77d26bc WatchSource:0}: Error finding container 8b6317b734453e8868ed75e3450ff9740fcef0ac699a30e38e0adad2d77d26bc: Status 404 returned error can't find the container with id 8b6317b734453e8868ed75e3450ff9740fcef0ac699a30e38e0adad2d77d26bc Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595164 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595236 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595264 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595280 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9bwn\" (UniqueName: 
\"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-kube-api-access-h9bwn\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595324 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595340 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595357 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595389 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96257eac-42ec-44cf-80be-9be68c0ebb1b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595492 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595612 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.595635 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96257eac-42ec-44cf-80be-9be68c0ebb1b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.697795 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96257eac-42ec-44cf-80be-9be68c0ebb1b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.697864 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.697944 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.697966 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96257eac-42ec-44cf-80be-9be68c0ebb1b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698035 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698060 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698084 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698099 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9bwn\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-kube-api-access-h9bwn\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698139 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698154 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698171 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.698592 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.702088 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.702213 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96257eac-42ec-44cf-80be-9be68c0ebb1b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.703162 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.703717 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.715952 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.717614 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.718563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.722833 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96257eac-42ec-44cf-80be-9be68c0ebb1b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.726826 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.731452 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9bwn\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-kube-api-access-h9bwn\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.751429 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.772804 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" event={"ID":"2196a973-cc42-4633-8c99-1422d07d475a","Type":"ContainerStarted","Data":"ea36b64eee9bd69dcd99d1003ff2856f54e1225bf3a3740ad6059b5726729f56"} Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.778793 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01dc1fd5-4e2f-4129-9452-ed50fa1d182b","Type":"ContainerStarted","Data":"8b6317b734453e8868ed75e3450ff9740fcef0ac699a30e38e0adad2d77d26bc"} Mar 08 19:50:53 crc kubenswrapper[4885]: I0308 19:50:53.804951 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.380934 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.791722 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"96257eac-42ec-44cf-80be-9be68c0ebb1b","Type":"ContainerStarted","Data":"654fe72412f8a73fefea3c7f4b820f3cc3985166e76d795cc9ca36d6cf741354"} Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.801874 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.803455 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.807079 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.807237 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.807460 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.807568 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-f5npv" Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.810170 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.811383 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916562 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-generated\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916607 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916667 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-default\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916692 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-operator-scripts\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916711 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916737 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-kolla-config\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916829 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:54 crc kubenswrapper[4885]: I0308 19:50:54.916850 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc6jf\" (UniqueName: \"kubernetes.io/projected/93f52f98-0e26-4fc1-a9af-f580531f8550-kube-api-access-jc6jf\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018288 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-default\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018345 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-operator-scripts\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018367 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018395 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-kolla-config\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018412 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018434 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc6jf\" (UniqueName: \"kubernetes.io/projected/93f52f98-0e26-4fc1-a9af-f580531f8550-kube-api-access-jc6jf\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018472 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-generated\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-galera-tls-certs\") pod 
\"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.018687 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.019132 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-kolla-config\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.019535 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-generated\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.019776 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-default\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.020450 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-operator-scripts\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.025171 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.025375 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.035926 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc6jf\" (UniqueName: \"kubernetes.io/projected/93f52f98-0e26-4fc1-a9af-f580531f8550-kube-api-access-jc6jf\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.038400 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.129428 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.688520 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 08 19:50:55 crc kubenswrapper[4885]: W0308 19:50:55.711125 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93f52f98_0e26_4fc1_a9af_f580531f8550.slice/crio-c562a73365a8a4cec4d84c9a8ed8cc0d8747c679cc7180e594fd7812393beab5 WatchSource:0}: Error finding container c562a73365a8a4cec4d84c9a8ed8cc0d8747c679cc7180e594fd7812393beab5: Status 404 returned error can't find the container with id c562a73365a8a4cec4d84c9a8ed8cc0d8747c679cc7180e594fd7812393beab5 Mar 08 19:50:55 crc kubenswrapper[4885]: I0308 19:50:55.800575 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93f52f98-0e26-4fc1-a9af-f580531f8550","Type":"ContainerStarted","Data":"c562a73365a8a4cec4d84c9a8ed8cc0d8747c679cc7180e594fd7812393beab5"} Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.158909 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.160762 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.164060 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.164668 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-5kqpk" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.164842 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.164975 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.168687 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.252811 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.252863 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jqgd\" (UniqueName: \"kubernetes.io/projected/925797ff-e1b0-4df7-83db-2091264a4bb8-kube-api-access-8jqgd\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.252902 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.252946 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.253000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.253022 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.253058 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.253078 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354139 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354187 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354224 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354248 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354285 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354309 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354348 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354368 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jqgd\" (UniqueName: \"kubernetes.io/projected/925797ff-e1b0-4df7-83db-2091264a4bb8-kube-api-access-8jqgd\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.354614 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.355233 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.355797 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.356008 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.356523 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.366474 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.366488 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.370934 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jqgd\" (UniqueName: \"kubernetes.io/projected/925797ff-e1b0-4df7-83db-2091264a4bb8-kube-api-access-8jqgd\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.398577 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.459762 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.460601 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.465220 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.465362 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.465587 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-cc744" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.474403 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.481184 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.558234 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-kolla-config\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.558272 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-combined-ca-bundle\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.558371 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-config-data\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.558418 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q554n\" (UniqueName: \"kubernetes.io/projected/da1d62ba-4033-4906-87c1-d673c1ab8637-kube-api-access-q554n\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.558441 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-memcached-tls-certs\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.659552 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-memcached-tls-certs\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.659639 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-kolla-config\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.659662 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-combined-ca-bundle\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.659706 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-config-data\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.659739 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q554n\" 
(UniqueName: \"kubernetes.io/projected/da1d62ba-4033-4906-87c1-d673c1ab8637-kube-api-access-q554n\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.662049 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-kolla-config\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.662447 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-config-data\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.667907 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-memcached-tls-certs\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.668331 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-combined-ca-bundle\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.674004 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q554n\" (UniqueName: \"kubernetes.io/projected/da1d62ba-4033-4906-87c1-d673c1ab8637-kube-api-access-q554n\") pod \"memcached-0\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " pod="openstack/memcached-0" Mar 08 19:50:56 crc kubenswrapper[4885]: I0308 19:50:56.832249 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.640872 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.641739 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.643320 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-dsb9r" Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.656597 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.791499 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxdn7\" (UniqueName: \"kubernetes.io/projected/50f2f07f-efc4-4778-944c-d4819f0b0e30-kube-api-access-cxdn7\") pod \"kube-state-metrics-0\" (UID: \"50f2f07f-efc4-4778-944c-d4819f0b0e30\") " pod="openstack/kube-state-metrics-0" Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.892745 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxdn7\" (UniqueName: \"kubernetes.io/projected/50f2f07f-efc4-4778-944c-d4819f0b0e30-kube-api-access-cxdn7\") pod \"kube-state-metrics-0\" (UID: \"50f2f07f-efc4-4778-944c-d4819f0b0e30\") " pod="openstack/kube-state-metrics-0" Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.917801 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxdn7\" (UniqueName: \"kubernetes.io/projected/50f2f07f-efc4-4778-944c-d4819f0b0e30-kube-api-access-cxdn7\") pod \"kube-state-metrics-0\" (UID: \"50f2f07f-efc4-4778-944c-d4819f0b0e30\") " pod="openstack/kube-state-metrics-0" Mar 08 19:50:58 crc kubenswrapper[4885]: I0308 19:50:58.991650 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.055700 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.057412 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.063255 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.063458 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.063555 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.063691 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.063800 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-swjjk" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.071423 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.144522 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.144646 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.144678 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.144773 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.144856 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chcvg\" (UniqueName: \"kubernetes.io/projected/d768ed9e-b089-4308-befc-e3bd6aa68683-kube-api-access-chcvg\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.144960 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.145022 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.145123 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-config\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246525 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246566 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chcvg\" (UniqueName: \"kubernetes.io/projected/d768ed9e-b089-4308-befc-e3bd6aa68683-kube-api-access-chcvg\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246599 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246619 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246658 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-config\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246678 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246708 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.246724 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc 
kubenswrapper[4885]: I0308 19:51:02.246897 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.247393 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.248052 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-config\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.250541 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.252976 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.253062 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.266504 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.269044 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chcvg\" (UniqueName: \"kubernetes.io/projected/d768ed9e-b089-4308-befc-e3bd6aa68683-kube-api-access-chcvg\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.269905 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.395104 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.972966 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mn4lz"] Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.974246 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.978231 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.982470 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.989260 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-47w7f" Mar 08 19:51:02 crc kubenswrapper[4885]: I0308 19:51:02.989782 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mn4lz"] Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.071220 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c223ffe-b12c-4c78-920a-66e6feb9178f-scripts\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.071283 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.071370 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-log-ovn\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.071485 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run-ovn\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.071553 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz9cs\" (UniqueName: \"kubernetes.io/projected/1c223ffe-b12c-4c78-920a-66e6feb9178f-kube-api-access-lz9cs\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.071605 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-ovn-controller-tls-certs\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.071634 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-combined-ca-bundle\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.096402 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-pp4rs"] Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.100837 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.105880 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-pp4rs"] Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.172875 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz9cs\" (UniqueName: \"kubernetes.io/projected/1c223ffe-b12c-4c78-920a-66e6feb9178f-kube-api-access-lz9cs\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.172954 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-run\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.172984 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-ovn-controller-tls-certs\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173008 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-combined-ca-bundle\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173045 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c223ffe-b12c-4c78-920a-66e6feb9178f-scripts\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173069 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173102 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-log-ovn\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173142 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-log\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173173 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-etc-ovs\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173197 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-lib\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173227 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66rvx\" (UniqueName: \"kubernetes.io/projected/88c2918a-548b-4b78-a34c-2aa2969ee2cd-kube-api-access-66rvx\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173271 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88c2918a-548b-4b78-a34c-2aa2969ee2cd-scripts\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173306 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run-ovn\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173906 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run-ovn\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.173993 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-log-ovn\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.174168 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.175994 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c223ffe-b12c-4c78-920a-66e6feb9178f-scripts\") pod 
\"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.182200 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-combined-ca-bundle\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.194651 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-ovn-controller-tls-certs\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.195833 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz9cs\" (UniqueName: \"kubernetes.io/projected/1c223ffe-b12c-4c78-920a-66e6feb9178f-kube-api-access-lz9cs\") pod \"ovn-controller-mn4lz\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.274623 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88c2918a-548b-4b78-a34c-2aa2969ee2cd-scripts\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.274690 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-run\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.274753 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-log\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.274775 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-etc-ovs\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.274798 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-lib\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.274821 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66rvx\" (UniqueName: \"kubernetes.io/projected/88c2918a-548b-4b78-a34c-2aa2969ee2cd-kube-api-access-66rvx\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.274877 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-run\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.275088 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-etc-ovs\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.275102 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-lib\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.275189 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-log\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.276521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88c2918a-548b-4b78-a34c-2aa2969ee2cd-scripts\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.289655 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66rvx\" (UniqueName: \"kubernetes.io/projected/88c2918a-548b-4b78-a34c-2aa2969ee2cd-kube-api-access-66rvx\") pod \"ovn-controller-ovs-pp4rs\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.362365 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:03 crc kubenswrapper[4885]: I0308 19:51:03.417534 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.545030 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.546554 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.570729 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.571168 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-5wc4v" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.572189 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.577572 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.583443 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.617393 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.617472 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.618853 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk872\" (UniqueName: \"kubernetes.io/projected/b8dd6448-dd16-4487-bc90-f835712effc1-kube-api-access-wk872\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.619010 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-config\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.619105 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.619173 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.619356 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " 
pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.619472 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.720930 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.720997 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.721034 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.721075 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.721095 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.721585 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.721764 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.721298 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk872\" (UniqueName: \"kubernetes.io/projected/b8dd6448-dd16-4487-bc90-f835712effc1-kube-api-access-wk872\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.721851 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-config\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.721872 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.726588 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.726869 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.729041 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-config\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.730469 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.739573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.744299 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.747570 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk872\" (UniqueName: \"kubernetes.io/projected/b8dd6448-dd16-4487-bc90-f835712effc1-kube-api-access-wk872\") pod \"ovsdbserver-sb-0\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:05 crc kubenswrapper[4885]: I0308 19:51:05.887493 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.519395 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.520248 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mznsj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-79f9fc56ff-vwdgd_openstack(604f26ec-2884-4eb6-97f9-e2961f8907b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.521615 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.523350 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 08 19:51:12 
crc kubenswrapper[4885]: E0308 19:51:12.523598 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5nskr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-7c47bcb9f9-86zq7_openstack(2196a973-cc42-4633-8c99-1422d07d475a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.525876 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" podUID="2196a973-cc42-4633-8c99-1422d07d475a" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.567383 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.567567 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground 
--log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jfmpl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86bbd886cf-442qz_openstack(2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.568946 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-86bbd886cf-442qz" podUID="2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.626788 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.627146 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xg25w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-589db6c89c-w67fx_openstack(19de3581-aa8c-48c0-aad6-4139d132ca70): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.628363 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-589db6c89c-w67fx" podUID="19de3581-aa8c-48c0-aad6-4139d132ca70" Mar 08 19:51:12 crc kubenswrapper[4885]: I0308 19:51:12.878640 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 08 19:51:12 crc kubenswrapper[4885]: W0308 19:51:12.888328 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda1d62ba_4033_4906_87c1_d673c1ab8637.slice/crio-d5c7811542e264146da5dbb28e9ad294c9b3d2c5ddb970c427caa3521f6cd065 WatchSource:0}: Error finding container d5c7811542e264146da5dbb28e9ad294c9b3d2c5ddb970c427caa3521f6cd065: Status 404 returned error can't find the container with id d5c7811542e264146da5dbb28e9ad294c9b3d2c5ddb970c427caa3521f6cd065 Mar 08 19:51:12 crc kubenswrapper[4885]: I0308 19:51:12.946836 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"da1d62ba-4033-4906-87c1-d673c1ab8637","Type":"ContainerStarted","Data":"d5c7811542e264146da5dbb28e9ad294c9b3d2c5ddb970c427caa3521f6cd065"} Mar 08 19:51:12 crc kubenswrapper[4885]: I0308 19:51:12.949199 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93f52f98-0e26-4fc1-a9af-f580531f8550","Type":"ContainerStarted","Data":"8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2"} Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.950957 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with 
ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" podUID="2196a973-cc42-4633-8c99-1422d07d475a" Mar 08 19:51:12 crc kubenswrapper[4885]: E0308 19:51:12.951339 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:944833b50342d462c10637342bc85197a8cf099a3650df12e23854dde99af514\\\"\"" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" Mar 08 19:51:13 crc kubenswrapper[4885]: W0308 19:51:13.041443 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50f2f07f_efc4_4778_944c_d4819f0b0e30.slice/crio-3c7c7d0cb67bdd716e27a921f55c82cceb775a12654fa8ab0fa3866727e30e29 WatchSource:0}: Error finding container 3c7c7d0cb67bdd716e27a921f55c82cceb775a12654fa8ab0fa3866727e30e29: Status 404 returned error can't find the container with id 3c7c7d0cb67bdd716e27a921f55c82cceb775a12654fa8ab0fa3866727e30e29 Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.044030 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.085557 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mn4lz"] Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.180536 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.215206 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.267694 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 19:51:13 crc kubenswrapper[4885]: W0308 19:51:13.268872 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd768ed9e_b089_4308_befc_e3bd6aa68683.slice/crio-d04c91ea8c65ff23403733626ccc4e0944e79d09733c79a70fe91cced28380f8 WatchSource:0}: Error finding container d04c91ea8c65ff23403733626ccc4e0944e79d09733c79a70fe91cced28380f8: Status 404 returned error can't find the container with id d04c91ea8c65ff23403733626ccc4e0944e79d09733c79a70fe91cced28380f8 Mar 08 19:51:13 crc kubenswrapper[4885]: W0308 19:51:13.365622 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod925797ff_e1b0_4df7_83db_2091264a4bb8.slice/crio-9491886528ee59a2997b30d600c2b1b7132f56a89a9aaa3a02c1e5325fbb4651 WatchSource:0}: Error finding container 9491886528ee59a2997b30d600c2b1b7132f56a89a9aaa3a02c1e5325fbb4651: Status 404 returned error can't find the container with id 9491886528ee59a2997b30d600c2b1b7132f56a89a9aaa3a02c1e5325fbb4651 Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.652758 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.662843 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.776142 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg25w\" (UniqueName: \"kubernetes.io/projected/19de3581-aa8c-48c0-aad6-4139d132ca70-kube-api-access-xg25w\") pod \"19de3581-aa8c-48c0-aad6-4139d132ca70\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.776539 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-config\") pod \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.776631 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfmpl\" (UniqueName: \"kubernetes.io/projected/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-kube-api-access-jfmpl\") pod \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.776657 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19de3581-aa8c-48c0-aad6-4139d132ca70-config\") pod \"19de3581-aa8c-48c0-aad6-4139d132ca70\" (UID: \"19de3581-aa8c-48c0-aad6-4139d132ca70\") " Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.776731 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-dns-svc\") pod \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\" (UID: \"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0\") " Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.777751 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19de3581-aa8c-48c0-aad6-4139d132ca70-config" (OuterVolumeSpecName: "config") pod "19de3581-aa8c-48c0-aad6-4139d132ca70" (UID: "19de3581-aa8c-48c0-aad6-4139d132ca70"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.778147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-config" (OuterVolumeSpecName: "config") pod "2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0" (UID: "2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.778283 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0" (UID: "2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.779909 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19de3581-aa8c-48c0-aad6-4139d132ca70-kube-api-access-xg25w" (OuterVolumeSpecName: "kube-api-access-xg25w") pod "19de3581-aa8c-48c0-aad6-4139d132ca70" (UID: "19de3581-aa8c-48c0-aad6-4139d132ca70"). InnerVolumeSpecName "kube-api-access-xg25w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.785205 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-kube-api-access-jfmpl" (OuterVolumeSpecName: "kube-api-access-jfmpl") pod "2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0" (UID: "2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0"). InnerVolumeSpecName "kube-api-access-jfmpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.878180 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg25w\" (UniqueName: \"kubernetes.io/projected/19de3581-aa8c-48c0-aad6-4139d132ca70-kube-api-access-xg25w\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.878209 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.878220 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfmpl\" (UniqueName: \"kubernetes.io/projected/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-kube-api-access-jfmpl\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.878229 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19de3581-aa8c-48c0-aad6-4139d132ca70-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.878239 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.954064 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-pp4rs"] Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.960272 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01dc1fd5-4e2f-4129-9452-ed50fa1d182b","Type":"ContainerStarted","Data":"f7d40d12aee399534fa9d02af86ea25978b99ea1398acccdac988f16615d42dd"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.962874 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589db6c89c-w67fx" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.962885 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589db6c89c-w67fx" event={"ID":"19de3581-aa8c-48c0-aad6-4139d132ca70","Type":"ContainerDied","Data":"883cc4093e6e1d1ce406c4937cdb0137ce8cbfb3722b5404c71c307ac4d03766"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.963854 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b8dd6448-dd16-4487-bc90-f835712effc1","Type":"ContainerStarted","Data":"27094a5eecfea3bd81d2314594b8cfdb03f329abe60f33c847c8c969d4747a0d"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.965137 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"96257eac-42ec-44cf-80be-9be68c0ebb1b","Type":"ContainerStarted","Data":"67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.966174 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"50f2f07f-efc4-4778-944c-d4819f0b0e30","Type":"ContainerStarted","Data":"3c7c7d0cb67bdd716e27a921f55c82cceb775a12654fa8ab0fa3866727e30e29"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.967669 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"925797ff-e1b0-4df7-83db-2091264a4bb8","Type":"ContainerStarted","Data":"81b8543a909b03c12110951d0f4dfaca241eaccbf11cf8dd8e3aa4e40b790556"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.967695 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"925797ff-e1b0-4df7-83db-2091264a4bb8","Type":"ContainerStarted","Data":"9491886528ee59a2997b30d600c2b1b7132f56a89a9aaa3a02c1e5325fbb4651"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.968524 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz" event={"ID":"1c223ffe-b12c-4c78-920a-66e6feb9178f","Type":"ContainerStarted","Data":"9c43acfe462c14a8f2d4b9a160251e927b7f46fa5baa4a259edc59c61d9dcadd"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.969709 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86bbd886cf-442qz" event={"ID":"2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0","Type":"ContainerDied","Data":"b81b2d09cf554bd40279185ed607a08f534a46ffc9af81dd3b055b475415182a"} Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.969733 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86bbd886cf-442qz" Mar 08 19:51:13 crc kubenswrapper[4885]: I0308 19:51:13.994734 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d768ed9e-b089-4308-befc-e3bd6aa68683","Type":"ContainerStarted","Data":"d04c91ea8c65ff23403733626ccc4e0944e79d09733c79a70fe91cced28380f8"} Mar 08 19:51:14 crc kubenswrapper[4885]: I0308 19:51:14.114976 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-w67fx"] Mar 08 19:51:14 crc kubenswrapper[4885]: I0308 19:51:14.125760 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589db6c89c-w67fx"] Mar 08 19:51:14 crc kubenswrapper[4885]: I0308 19:51:14.148047 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-442qz"] Mar 08 19:51:14 crc kubenswrapper[4885]: I0308 19:51:14.156385 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86bbd886cf-442qz"] Mar 08 19:51:15 crc kubenswrapper[4885]: I0308 19:51:15.004394 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pp4rs" event={"ID":"88c2918a-548b-4b78-a34c-2aa2969ee2cd","Type":"ContainerStarted","Data":"e8b9e6003711ba0073f7cced036d1550a5aac01aa3276b7f4a1f8ca2c14ba942"} Mar 08 19:51:15 crc kubenswrapper[4885]: I0308 19:51:15.379159 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19de3581-aa8c-48c0-aad6-4139d132ca70" path="/var/lib/kubelet/pods/19de3581-aa8c-48c0-aad6-4139d132ca70/volumes" Mar 08 19:51:15 crc kubenswrapper[4885]: I0308 19:51:15.379690 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0" path="/var/lib/kubelet/pods/2fa0c262-573b-4c5f-8bf8-04e5ba2bd8d0/volumes" Mar 08 19:51:18 crc kubenswrapper[4885]: I0308 19:51:18.035409 4885 generic.go:334] "Generic (PLEG): container finished" podID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerID="8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2" exitCode=0 Mar 08 19:51:18 crc kubenswrapper[4885]: I0308 19:51:18.035512 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93f52f98-0e26-4fc1-a9af-f580531f8550","Type":"ContainerDied","Data":"8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2"} Mar 08 19:51:21 crc kubenswrapper[4885]: I0308 19:51:21.538624 4885 generic.go:334] "Generic (PLEG): container finished" podID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerID="81b8543a909b03c12110951d0f4dfaca241eaccbf11cf8dd8e3aa4e40b790556" exitCode=0 Mar 08 19:51:21 crc kubenswrapper[4885]: I0308 19:51:21.538733 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"925797ff-e1b0-4df7-83db-2091264a4bb8","Type":"ContainerDied","Data":"81b8543a909b03c12110951d0f4dfaca241eaccbf11cf8dd8e3aa4e40b790556"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.565692 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93f52f98-0e26-4fc1-a9af-f580531f8550","Type":"ContainerStarted","Data":"88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.572969 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"50f2f07f-efc4-4778-944c-d4819f0b0e30","Type":"ContainerStarted","Data":"fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.573083 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.576639 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"925797ff-e1b0-4df7-83db-2091264a4bb8","Type":"ContainerStarted","Data":"db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.578316 4885 generic.go:334] "Generic (PLEG): container finished" podID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerID="4d70b99d630277ded10493eacfddd38fddedced2d880750db49b6b3f39017dba" exitCode=0 Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.578357 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pp4rs" event={"ID":"88c2918a-548b-4b78-a34c-2aa2969ee2cd","Type":"ContainerDied","Data":"4d70b99d630277ded10493eacfddd38fddedced2d880750db49b6b3f39017dba"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.580707 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz" event={"ID":"1c223ffe-b12c-4c78-920a-66e6feb9178f","Type":"ContainerStarted","Data":"40b3ba3ccd4cd0fabe1a8de0a1537908216c0198be6d0bf26dce86c9b6a32605"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.580831 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-mn4lz" Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.582672 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b8dd6448-dd16-4487-bc90-f835712effc1","Type":"ContainerStarted","Data":"1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.587743 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"da1d62ba-4033-4906-87c1-d673c1ab8637","Type":"ContainerStarted","Data":"f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.588648 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.593915 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=14.756493401 podStartE2EDuration="31.593894371s" podCreationTimestamp="2026-03-08 19:50:53 +0000 UTC" firstStartedPulling="2026-03-08 19:50:55.713599376 +0000 UTC m=+1157.109653399" lastFinishedPulling="2026-03-08 19:51:12.551000346 +0000 UTC m=+1173.947054369" observedRunningTime="2026-03-08 19:51:24.589193276 +0000 UTC m=+1185.985247309" watchObservedRunningTime="2026-03-08 19:51:24.593894371 +0000 UTC m=+1185.989948404" Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.608155 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d768ed9e-b089-4308-befc-e3bd6aa68683","Type":"ContainerStarted","Data":"4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c"} Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.609656 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" 
podStartSLOduration=29.60962935 podStartE2EDuration="29.60962935s" podCreationTimestamp="2026-03-08 19:50:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:51:24.606598889 +0000 UTC m=+1186.002652912" watchObservedRunningTime="2026-03-08 19:51:24.60962935 +0000 UTC m=+1186.005683373" Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.653304 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=16.357311555 podStartE2EDuration="26.653265682s" podCreationTimestamp="2026-03-08 19:50:58 +0000 UTC" firstStartedPulling="2026-03-08 19:51:13.048139508 +0000 UTC m=+1174.444193531" lastFinishedPulling="2026-03-08 19:51:23.344093605 +0000 UTC m=+1184.740147658" observedRunningTime="2026-03-08 19:51:24.646366049 +0000 UTC m=+1186.042420062" watchObservedRunningTime="2026-03-08 19:51:24.653265682 +0000 UTC m=+1186.049319705" Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.672819 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=18.881367494 podStartE2EDuration="28.672802021s" podCreationTimestamp="2026-03-08 19:50:56 +0000 UTC" firstStartedPulling="2026-03-08 19:51:12.890329558 +0000 UTC m=+1174.286383581" lastFinishedPulling="2026-03-08 19:51:22.681764065 +0000 UTC m=+1184.077818108" observedRunningTime="2026-03-08 19:51:24.666410562 +0000 UTC m=+1186.062464585" watchObservedRunningTime="2026-03-08 19:51:24.672802021 +0000 UTC m=+1186.068856044" Mar 08 19:51:24 crc kubenswrapper[4885]: I0308 19:51:24.692936 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-mn4lz" podStartSLOduration=12.517301673 podStartE2EDuration="22.692904647s" podCreationTimestamp="2026-03-08 19:51:02 +0000 UTC" firstStartedPulling="2026-03-08 19:51:13.084813884 +0000 UTC m=+1174.480867907" lastFinishedPulling="2026-03-08 19:51:23.260416858 +0000 UTC m=+1184.656470881" observedRunningTime="2026-03-08 19:51:24.682228883 +0000 UTC m=+1186.078282926" watchObservedRunningTime="2026-03-08 19:51:24.692904647 +0000 UTC m=+1186.088958680" Mar 08 19:51:25 crc kubenswrapper[4885]: I0308 19:51:25.131049 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 08 19:51:25 crc kubenswrapper[4885]: I0308 19:51:25.131092 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 08 19:51:25 crc kubenswrapper[4885]: I0308 19:51:25.629271 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pp4rs" event={"ID":"88c2918a-548b-4b78-a34c-2aa2969ee2cd","Type":"ContainerStarted","Data":"e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1"} Mar 08 19:51:25 crc kubenswrapper[4885]: I0308 19:51:25.630254 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pp4rs" event={"ID":"88c2918a-548b-4b78-a34c-2aa2969ee2cd","Type":"ContainerStarted","Data":"a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f"} Mar 08 19:51:25 crc kubenswrapper[4885]: I0308 19:51:25.658963 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-pp4rs" podStartSLOduration=13.42910377 podStartE2EDuration="22.65894111s" podCreationTimestamp="2026-03-08 19:51:03 +0000 UTC" firstStartedPulling="2026-03-08 19:51:14.029469448 
+0000 UTC m=+1175.425523471" lastFinishedPulling="2026-03-08 19:51:23.259306788 +0000 UTC m=+1184.655360811" observedRunningTime="2026-03-08 19:51:25.652027115 +0000 UTC m=+1187.048081148" watchObservedRunningTime="2026-03-08 19:51:25.65894111 +0000 UTC m=+1187.054995143" Mar 08 19:51:26 crc kubenswrapper[4885]: I0308 19:51:26.482711 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 08 19:51:26 crc kubenswrapper[4885]: I0308 19:51:26.483086 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 08 19:51:26 crc kubenswrapper[4885]: I0308 19:51:26.643144 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:26 crc kubenswrapper[4885]: I0308 19:51:26.643413 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:28 crc kubenswrapper[4885]: E0308 19:51:28.604418 4885 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.80:54714->38.102.83.80:33667: write tcp 38.102.83.80:54714->38.102.83.80:33667: write: broken pipe Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.659819 4885 generic.go:334] "Generic (PLEG): container finished" podID="2196a973-cc42-4633-8c99-1422d07d475a" containerID="3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08" exitCode=0 Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.659975 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" event={"ID":"2196a973-cc42-4633-8c99-1422d07d475a","Type":"ContainerDied","Data":"3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08"} Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.662726 4885 generic.go:334] "Generic (PLEG): container finished" podID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerID="fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030" exitCode=0 Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.662808 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" event={"ID":"604f26ec-2884-4eb6-97f9-e2961f8907b1","Type":"ContainerDied","Data":"fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030"} Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.666092 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b8dd6448-dd16-4487-bc90-f835712effc1","Type":"ContainerStarted","Data":"f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8"} Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.670742 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d768ed9e-b089-4308-befc-e3bd6aa68683","Type":"ContainerStarted","Data":"e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5"} Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.720171 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.158897603 podStartE2EDuration="27.720153411s" podCreationTimestamp="2026-03-08 19:51:01 +0000 UTC" firstStartedPulling="2026-03-08 19:51:13.273859756 +0000 UTC m=+1174.669913779" lastFinishedPulling="2026-03-08 19:51:27.835115564 +0000 UTC m=+1189.231169587" observedRunningTime="2026-03-08 19:51:28.713037222 +0000 UTC m=+1190.109091255" watchObservedRunningTime="2026-03-08 
19:51:28.720153411 +0000 UTC m=+1190.116207434" Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.754127 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=10.3646741 podStartE2EDuration="24.754109905s" podCreationTimestamp="2026-03-08 19:51:04 +0000 UTC" firstStartedPulling="2026-03-08 19:51:13.465951659 +0000 UTC m=+1174.862005692" lastFinishedPulling="2026-03-08 19:51:27.855387474 +0000 UTC m=+1189.251441497" observedRunningTime="2026-03-08 19:51:28.743751799 +0000 UTC m=+1190.139805822" watchObservedRunningTime="2026-03-08 19:51:28.754109905 +0000 UTC m=+1190.150163928" Mar 08 19:51:28 crc kubenswrapper[4885]: I0308 19:51:28.997587 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.163979 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.270604 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.401330 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.455516 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.699720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" event={"ID":"2196a973-cc42-4633-8c99-1422d07d475a","Type":"ContainerStarted","Data":"bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b"} Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.699970 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.704270 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" event={"ID":"604f26ec-2884-4eb6-97f9-e2961f8907b1","Type":"ContainerStarted","Data":"d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf"} Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.705183 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.731212 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" podStartSLOduration=2.922839131 podStartE2EDuration="37.731183042s" podCreationTimestamp="2026-03-08 19:50:52 +0000 UTC" firstStartedPulling="2026-03-08 19:50:53.002169837 +0000 UTC m=+1154.398223860" lastFinishedPulling="2026-03-08 19:51:27.810513758 +0000 UTC m=+1189.206567771" observedRunningTime="2026-03-08 19:51:29.722636464 +0000 UTC m=+1191.118690557" watchObservedRunningTime="2026-03-08 19:51:29.731183042 +0000 UTC m=+1191.127237105" Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.757446 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" podStartSLOduration=3.101946319 podStartE2EDuration="38.75741898s" podCreationTimestamp="2026-03-08 19:50:51 +0000 UTC" firstStartedPulling="2026-03-08 19:50:52.104046171 +0000 UTC m=+1153.500100194" lastFinishedPulling="2026-03-08 
19:51:27.759518802 +0000 UTC m=+1189.155572855" observedRunningTime="2026-03-08 19:51:29.747478506 +0000 UTC m=+1191.143532559" watchObservedRunningTime="2026-03-08 19:51:29.75741898 +0000 UTC m=+1191.153473043" Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.790118 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.888226 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:29 crc kubenswrapper[4885]: I0308 19:51:29.970657 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.061626 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-vwdgd"] Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.081899 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-rk2jx"] Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.087858 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.091478 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.095367 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-5qsh8"] Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.096352 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.106468 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.121715 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-rk2jx"] Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.130217 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5qsh8"] Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194676 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovn-rundir\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194748 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194767 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovs-rundir\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194798 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474d55a2-f4f0-4e46-809c-367a3110c33d-config\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194828 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-combined-ca-bundle\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194849 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsn2v\" (UniqueName: \"kubernetes.io/projected/f432b919-7772-41bf-9113-94eefe45e347-kube-api-access-fsn2v\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194877 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-dns-svc\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194907 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-config\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194942 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4tlt\" (UniqueName: \"kubernetes.io/projected/474d55a2-f4f0-4e46-809c-367a3110c33d-kube-api-access-l4tlt\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.194971 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296398 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-combined-ca-bundle\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296448 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsn2v\" (UniqueName: \"kubernetes.io/projected/f432b919-7772-41bf-9113-94eefe45e347-kube-api-access-fsn2v\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " 
pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296481 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-dns-svc\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296515 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-config\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296535 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4tlt\" (UniqueName: \"kubernetes.io/projected/474d55a2-f4f0-4e46-809c-367a3110c33d-kube-api-access-l4tlt\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296563 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296583 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovn-rundir\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296614 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296632 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovs-rundir\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.296659 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474d55a2-f4f0-4e46-809c-367a3110c33d-config\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.297381 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474d55a2-f4f0-4e46-809c-367a3110c33d-config\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.298203 
4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovn-rundir\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.298228 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovs-rundir\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.298895 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-dns-svc\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.299043 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-config\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.299115 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-ovsdbserver-nb\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.302570 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-combined-ca-bundle\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.315456 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4tlt\" (UniqueName: \"kubernetes.io/projected/474d55a2-f4f0-4e46-809c-367a3110c33d-kube-api-access-l4tlt\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.315501 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-5qsh8\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.320894 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsn2v\" (UniqueName: \"kubernetes.io/projected/f432b919-7772-41bf-9113-94eefe45e347-kube-api-access-fsn2v\") pod \"dnsmasq-dns-6444958b7f-rk2jx\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.412276 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.431026 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.474310 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-86zq7"] Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.513448 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-fdwqp"] Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.515203 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.518426 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.524325 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-fdwqp"] Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.602431 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.602477 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.602496 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-config\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.602577 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.602619 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jm7q\" (UniqueName: \"kubernetes.io/projected/201fa134-20f7-4902-8fd4-ba352e7f4e95-kube-api-access-6jm7q\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.719156 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 
19:51:30.719217 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jm7q\" (UniqueName: \"kubernetes.io/projected/201fa134-20f7-4902-8fd4-ba352e7f4e95-kube-api-access-6jm7q\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.719245 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.719270 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.719291 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-config\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.720380 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-config\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.720466 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-nb\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.720896 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-dns-svc\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.721071 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-sb\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.739516 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jm7q\" (UniqueName: \"kubernetes.io/projected/201fa134-20f7-4902-8fd4-ba352e7f4e95-kube-api-access-6jm7q\") pod \"dnsmasq-dns-7b57d9888c-fdwqp\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.748618 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.748882 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.748629 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerName="dnsmasq-dns" containerID="cri-o://d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf" gracePeriod=10 Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.792655 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.835068 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-5qsh8"] Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.845124 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:30 crc kubenswrapper[4885]: I0308 19:51:30.908063 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-rk2jx"] Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.015400 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.016937 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.019750 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-cmhs5" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.019887 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.019996 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.020170 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.041325 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.133123 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-scripts\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.133253 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.133299 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc 
kubenswrapper[4885]: I0308 19:51:31.133342 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbl7q\" (UniqueName: \"kubernetes.io/projected/f1f46cb2-c95d-40f5-9acc-720e094b91bc-kube-api-access-pbl7q\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.133398 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.133422 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-config\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.133440 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.156357 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.234765 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-config\") pod \"604f26ec-2884-4eb6-97f9-e2961f8907b1\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.234880 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mznsj\" (UniqueName: \"kubernetes.io/projected/604f26ec-2884-4eb6-97f9-e2961f8907b1-kube-api-access-mznsj\") pod \"604f26ec-2884-4eb6-97f9-e2961f8907b1\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235005 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-dns-svc\") pod \"604f26ec-2884-4eb6-97f9-e2961f8907b1\" (UID: \"604f26ec-2884-4eb6-97f9-e2961f8907b1\") " Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235230 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-scripts\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235304 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235345 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235365 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbl7q\" (UniqueName: \"kubernetes.io/projected/f1f46cb2-c95d-40f5-9acc-720e094b91bc-kube-api-access-pbl7q\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235388 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235422 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-config\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.235455 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.236423 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.237157 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-config\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.237833 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-scripts\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.239553 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.239667 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.240794 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.241009 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/604f26ec-2884-4eb6-97f9-e2961f8907b1-kube-api-access-mznsj" (OuterVolumeSpecName: "kube-api-access-mznsj") pod "604f26ec-2884-4eb6-97f9-e2961f8907b1" (UID: "604f26ec-2884-4eb6-97f9-e2961f8907b1"). InnerVolumeSpecName "kube-api-access-mznsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.249991 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbl7q\" (UniqueName: \"kubernetes.io/projected/f1f46cb2-c95d-40f5-9acc-720e094b91bc-kube-api-access-pbl7q\") pod \"ovn-northd-0\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.273483 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-config" (OuterVolumeSpecName: "config") pod "604f26ec-2884-4eb6-97f9-e2961f8907b1" (UID: "604f26ec-2884-4eb6-97f9-e2961f8907b1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.275086 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "604f26ec-2884-4eb6-97f9-e2961f8907b1" (UID: "604f26ec-2884-4eb6-97f9-e2961f8907b1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.279011 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.337985 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.338231 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/604f26ec-2884-4eb6-97f9-e2961f8907b1-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.338241 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mznsj\" (UniqueName: \"kubernetes.io/projected/604f26ec-2884-4eb6-97f9-e2961f8907b1-kube-api-access-mznsj\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.342775 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-fdwqp"] Mar 08 19:51:31 crc kubenswrapper[4885]: W0308 19:51:31.343652 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod201fa134_20f7_4902_8fd4_ba352e7f4e95.slice/crio-4817fdc2e93c45edc661de711d2e74d24fa8c501db17ccf9809b6e4e92461549 WatchSource:0}: Error finding container 4817fdc2e93c45edc661de711d2e74d24fa8c501db17ccf9809b6e4e92461549: Status 404 returned error can't find the container with id 4817fdc2e93c45edc661de711d2e74d24fa8c501db17ccf9809b6e4e92461549 Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.369161 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.385881 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.667206 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.758563 4885 generic.go:334] "Generic (PLEG): container finished" podID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerID="d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf" exitCode=0 Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.758647 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" event={"ID":"604f26ec-2884-4eb6-97f9-e2961f8907b1","Type":"ContainerDied","Data":"d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.758647 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.758679 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9fc56ff-vwdgd" event={"ID":"604f26ec-2884-4eb6-97f9-e2961f8907b1","Type":"ContainerDied","Data":"abe3994c827c15ee527cc92242a1479dbff3de65c1e91c72ef3fd14c10326728"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.758716 4885 scope.go:117] "RemoveContainer" containerID="d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.763036 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5qsh8" event={"ID":"474d55a2-f4f0-4e46-809c-367a3110c33d","Type":"ContainerStarted","Data":"e8623499ee05972629def49d746cd17bbc01095c41d8c7e431c723d1dc4187ee"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.763065 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5qsh8" event={"ID":"474d55a2-f4f0-4e46-809c-367a3110c33d","Type":"ContainerStarted","Data":"35bbd23060f341a3429a5bab6384434330b103927103bf0acea28883cf67dc65"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.765421 4885 generic.go:334] "Generic (PLEG): container finished" podID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerID="e6cf715e1922fcc27c66130c6d2113c75ad20d6e5d12122260396ed02a84d181" exitCode=0 Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.765476 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" event={"ID":"201fa134-20f7-4902-8fd4-ba352e7f4e95","Type":"ContainerDied","Data":"e6cf715e1922fcc27c66130c6d2113c75ad20d6e5d12122260396ed02a84d181"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.765497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" event={"ID":"201fa134-20f7-4902-8fd4-ba352e7f4e95","Type":"ContainerStarted","Data":"4817fdc2e93c45edc661de711d2e74d24fa8c501db17ccf9809b6e4e92461549"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.767971 4885 generic.go:334] "Generic (PLEG): container finished" podID="f432b919-7772-41bf-9113-94eefe45e347" containerID="fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0" exitCode=0 Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.768022 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" event={"ID":"f432b919-7772-41bf-9113-94eefe45e347","Type":"ContainerDied","Data":"fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.768185 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" event={"ID":"f432b919-7772-41bf-9113-94eefe45e347","Type":"ContainerStarted","Data":"472650bb697781b76975122a9d79fd5f4e880cc3eb0edd5811698f77d98f72e5"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.769468 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f1f46cb2-c95d-40f5-9acc-720e094b91bc","Type":"ContainerStarted","Data":"cd1df5e26dfde01021643639b3d30a9000c123fa83c48692684173ba1b046531"} Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.769612 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" podUID="2196a973-cc42-4633-8c99-1422d07d475a" containerName="dnsmasq-dns" 
containerID="cri-o://bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b" gracePeriod=10 Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.785686 4885 scope.go:117] "RemoveContainer" containerID="fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.785609 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-vwdgd"] Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.791789 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79f9fc56ff-vwdgd"] Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.815822 4885 scope.go:117] "RemoveContainer" containerID="d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf" Mar 08 19:51:31 crc kubenswrapper[4885]: E0308 19:51:31.816175 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf\": container with ID starting with d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf not found: ID does not exist" containerID="d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.816214 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf"} err="failed to get container status \"d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf\": rpc error: code = NotFound desc = could not find container \"d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf\": container with ID starting with d5610cf14e1f604b30312afac26e2cbf21c9876a069ff1aceb9c82436553bfbf not found: ID does not exist" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.816236 4885 scope.go:117] "RemoveContainer" containerID="fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030" Mar 08 19:51:31 crc kubenswrapper[4885]: E0308 19:51:31.816479 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030\": container with ID starting with fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030 not found: ID does not exist" containerID="fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.816733 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030"} err="failed to get container status \"fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030\": rpc error: code = NotFound desc = could not find container \"fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030\": container with ID starting with fd7a612aca70e00ac6c5c1da52a61ed0c56ef93b9d83f94ceef847839445c030 not found: ID does not exist" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.833232 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 08 19:51:31 crc kubenswrapper[4885]: I0308 19:51:31.844789 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-5qsh8" podStartSLOduration=1.8447738089999999 podStartE2EDuration="1.844773809s" podCreationTimestamp="2026-03-08 19:51:30 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:51:31.836476798 +0000 UTC m=+1193.232530831" watchObservedRunningTime="2026-03-08 19:51:31.844773809 +0000 UTC m=+1193.240827832" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.124260 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a047-account-create-update-64r5q"] Mar 08 19:51:32 crc kubenswrapper[4885]: E0308 19:51:32.124826 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerName="init" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.124836 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerName="init" Mar 08 19:51:32 crc kubenswrapper[4885]: E0308 19:51:32.124874 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerName="dnsmasq-dns" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.124881 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerName="dnsmasq-dns" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.125028 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" containerName="dnsmasq-dns" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.125565 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.127226 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.130479 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-9jddz"] Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.131193 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.137134 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9jddz"] Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.159410 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7cjr\" (UniqueName: \"kubernetes.io/projected/0f704685-800d-4386-a47d-8c60b0885aca-kube-api-access-t7cjr\") pod \"glance-a047-account-create-update-64r5q\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.159468 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrjtg\" (UniqueName: \"kubernetes.io/projected/5f0edc25-2cc1-4111-96e3-3807e6463d57-kube-api-access-vrjtg\") pod \"glance-db-create-9jddz\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.159546 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f0edc25-2cc1-4111-96e3-3807e6463d57-operator-scripts\") pod \"glance-db-create-9jddz\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.159576 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f704685-800d-4386-a47d-8c60b0885aca-operator-scripts\") pod \"glance-a047-account-create-update-64r5q\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.172775 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a047-account-create-update-64r5q"] Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.197798 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.261052 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nskr\" (UniqueName: \"kubernetes.io/projected/2196a973-cc42-4633-8c99-1422d07d475a-kube-api-access-5nskr\") pod \"2196a973-cc42-4633-8c99-1422d07d475a\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.261445 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-config\") pod \"2196a973-cc42-4633-8c99-1422d07d475a\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.261546 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-dns-svc\") pod \"2196a973-cc42-4633-8c99-1422d07d475a\" (UID: \"2196a973-cc42-4633-8c99-1422d07d475a\") " Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.261848 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7cjr\" (UniqueName: \"kubernetes.io/projected/0f704685-800d-4386-a47d-8c60b0885aca-kube-api-access-t7cjr\") pod \"glance-a047-account-create-update-64r5q\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.262212 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrjtg\" (UniqueName: \"kubernetes.io/projected/5f0edc25-2cc1-4111-96e3-3807e6463d57-kube-api-access-vrjtg\") pod \"glance-db-create-9jddz\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.262494 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f0edc25-2cc1-4111-96e3-3807e6463d57-operator-scripts\") pod \"glance-db-create-9jddz\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.263947 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f704685-800d-4386-a47d-8c60b0885aca-operator-scripts\") pod \"glance-a047-account-create-update-64r5q\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.263888 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f0edc25-2cc1-4111-96e3-3807e6463d57-operator-scripts\") pod \"glance-db-create-9jddz\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.264560 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f704685-800d-4386-a47d-8c60b0885aca-operator-scripts\") pod \"glance-a047-account-create-update-64r5q\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 
19:51:32.266270 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2196a973-cc42-4633-8c99-1422d07d475a-kube-api-access-5nskr" (OuterVolumeSpecName: "kube-api-access-5nskr") pod "2196a973-cc42-4633-8c99-1422d07d475a" (UID: "2196a973-cc42-4633-8c99-1422d07d475a"). InnerVolumeSpecName "kube-api-access-5nskr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.276622 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrjtg\" (UniqueName: \"kubernetes.io/projected/5f0edc25-2cc1-4111-96e3-3807e6463d57-kube-api-access-vrjtg\") pod \"glance-db-create-9jddz\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.276689 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7cjr\" (UniqueName: \"kubernetes.io/projected/0f704685-800d-4386-a47d-8c60b0885aca-kube-api-access-t7cjr\") pod \"glance-a047-account-create-update-64r5q\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.301166 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-config" (OuterVolumeSpecName: "config") pod "2196a973-cc42-4633-8c99-1422d07d475a" (UID: "2196a973-cc42-4633-8c99-1422d07d475a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.327366 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2196a973-cc42-4633-8c99-1422d07d475a" (UID: "2196a973-cc42-4633-8c99-1422d07d475a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.365148 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.365170 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2196a973-cc42-4633-8c99-1422d07d475a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.365180 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nskr\" (UniqueName: \"kubernetes.io/projected/2196a973-cc42-4633-8c99-1422d07d475a-kube-api-access-5nskr\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.456271 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.485835 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9jddz" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.810750 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" event={"ID":"201fa134-20f7-4902-8fd4-ba352e7f4e95","Type":"ContainerStarted","Data":"b0f30ec931ed12658474aee25cc28d8b7b4c49ada3c642c748084351a40ddd97"} Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.811175 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.818761 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" event={"ID":"f432b919-7772-41bf-9113-94eefe45e347","Type":"ContainerStarted","Data":"e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2"} Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.819130 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.822376 4885 generic.go:334] "Generic (PLEG): container finished" podID="2196a973-cc42-4633-8c99-1422d07d475a" containerID="bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b" exitCode=0 Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.822762 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.822801 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" event={"ID":"2196a973-cc42-4633-8c99-1422d07d475a","Type":"ContainerDied","Data":"bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b"} Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.822893 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c47bcb9f9-86zq7" event={"ID":"2196a973-cc42-4633-8c99-1422d07d475a","Type":"ContainerDied","Data":"ea36b64eee9bd69dcd99d1003ff2856f54e1225bf3a3740ad6059b5726729f56"} Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.822957 4885 scope.go:117] "RemoveContainer" containerID="bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.840673 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" podStartSLOduration=2.840655456 podStartE2EDuration="2.840655456s" podCreationTimestamp="2026-03-08 19:51:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:51:32.830474876 +0000 UTC m=+1194.226528899" watchObservedRunningTime="2026-03-08 19:51:32.840655456 +0000 UTC m=+1194.236709479" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.856175 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" podStartSLOduration=2.856157919 podStartE2EDuration="2.856157919s" podCreationTimestamp="2026-03-08 19:51:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:51:32.852278595 +0000 UTC m=+1194.248332618" watchObservedRunningTime="2026-03-08 19:51:32.856157919 +0000 UTC m=+1194.252211942" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.876667 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7c47bcb9f9-86zq7"] Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.882577 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c47bcb9f9-86zq7"] Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.889913 4885 scope.go:117] "RemoveContainer" containerID="3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.911751 4885 scope.go:117] "RemoveContainer" containerID="bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b" Mar 08 19:51:32 crc kubenswrapper[4885]: E0308 19:51:32.912112 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b\": container with ID starting with bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b not found: ID does not exist" containerID="bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.912134 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b"} err="failed to get container status \"bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b\": rpc error: code = NotFound desc = could not find container \"bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b\": container with ID starting with bde155fbe653e006a60cff9519314d789d07d2fa2879c45af23dc12239876a1b not found: ID does not exist" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.912152 4885 scope.go:117] "RemoveContainer" containerID="3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08" Mar 08 19:51:32 crc kubenswrapper[4885]: E0308 19:51:32.912467 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08\": container with ID starting with 3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08 not found: ID does not exist" containerID="3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08" Mar 08 19:51:32 crc kubenswrapper[4885]: I0308 19:51:32.912501 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08"} err="failed to get container status \"3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08\": rpc error: code = NotFound desc = could not find container \"3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08\": container with ID starting with 3190668ddea68977189d5f30a1dd4b09ed9ee783dae1e3957464a39ad9ff7a08 not found: ID does not exist" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.173404 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-9jddz"] Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.183235 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a047-account-create-update-64r5q"] Mar 08 19:51:33 crc kubenswrapper[4885]: W0308 19:51:33.191757 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f0edc25_2cc1_4111_96e3_3807e6463d57.slice/crio-656b3cfce09ee41fa0191c5f939961281dbdfc743c98f1c8a45eaa3a9a0b60a7 WatchSource:0}: Error finding container 
656b3cfce09ee41fa0191c5f939961281dbdfc743c98f1c8a45eaa3a9a0b60a7: Status 404 returned error can't find the container with id 656b3cfce09ee41fa0191c5f939961281dbdfc743c98f1c8a45eaa3a9a0b60a7 Mar 08 19:51:33 crc kubenswrapper[4885]: W0308 19:51:33.196628 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f704685_800d_4386_a47d_8c60b0885aca.slice/crio-8fba01d30b11a02729ca422508e1391564970bb51e5260ee86f36c1c12c7cf17 WatchSource:0}: Error finding container 8fba01d30b11a02729ca422508e1391564970bb51e5260ee86f36c1c12c7cf17: Status 404 returned error can't find the container with id 8fba01d30b11a02729ca422508e1391564970bb51e5260ee86f36c1c12c7cf17 Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.379870 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2196a973-cc42-4633-8c99-1422d07d475a" path="/var/lib/kubelet/pods/2196a973-cc42-4633-8c99-1422d07d475a/volumes" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.381661 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="604f26ec-2884-4eb6-97f9-e2961f8907b1" path="/var/lib/kubelet/pods/604f26ec-2884-4eb6-97f9-e2961f8907b1/volumes" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.760758 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-rrt62"] Mar 08 19:51:33 crc kubenswrapper[4885]: E0308 19:51:33.761067 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2196a973-cc42-4633-8c99-1422d07d475a" containerName="dnsmasq-dns" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.761083 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2196a973-cc42-4633-8c99-1422d07d475a" containerName="dnsmasq-dns" Mar 08 19:51:33 crc kubenswrapper[4885]: E0308 19:51:33.761103 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2196a973-cc42-4633-8c99-1422d07d475a" containerName="init" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.761109 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2196a973-cc42-4633-8c99-1422d07d475a" containerName="init" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.761276 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2196a973-cc42-4633-8c99-1422d07d475a" containerName="dnsmasq-dns" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.761761 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.764306 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.811285 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rrt62"] Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.834009 4885 generic.go:334] "Generic (PLEG): container finished" podID="5f0edc25-2cc1-4111-96e3-3807e6463d57" containerID="e5feabe92d49eb8fd4bb48801094df276f9bf1fc07181b4b0ee0908d604394fb" exitCode=0 Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.834072 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9jddz" event={"ID":"5f0edc25-2cc1-4111-96e3-3807e6463d57","Type":"ContainerDied","Data":"e5feabe92d49eb8fd4bb48801094df276f9bf1fc07181b4b0ee0908d604394fb"} Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.834097 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9jddz" event={"ID":"5f0edc25-2cc1-4111-96e3-3807e6463d57","Type":"ContainerStarted","Data":"656b3cfce09ee41fa0191c5f939961281dbdfc743c98f1c8a45eaa3a9a0b60a7"} Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.835949 4885 generic.go:334] "Generic (PLEG): container finished" podID="0f704685-800d-4386-a47d-8c60b0885aca" containerID="3cb04d8216824e70d6b5ea33718713bb6914ece1b0e3362b1186f648f1502b81" exitCode=0 Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.836020 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a047-account-create-update-64r5q" event={"ID":"0f704685-800d-4386-a47d-8c60b0885aca","Type":"ContainerDied","Data":"3cb04d8216824e70d6b5ea33718713bb6914ece1b0e3362b1186f648f1502b81"} Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.836088 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a047-account-create-update-64r5q" event={"ID":"0f704685-800d-4386-a47d-8c60b0885aca","Type":"ContainerStarted","Data":"8fba01d30b11a02729ca422508e1391564970bb51e5260ee86f36c1c12c7cf17"} Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.839250 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f1f46cb2-c95d-40f5-9acc-720e094b91bc","Type":"ContainerStarted","Data":"619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0"} Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.839298 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f1f46cb2-c95d-40f5-9acc-720e094b91bc","Type":"ContainerStarted","Data":"e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f"} Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.839380 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.894358 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-operator-scripts\") pod \"root-account-create-update-rrt62\" (UID: \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.894449 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-vppxj\" (UniqueName: \"kubernetes.io/projected/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-kube-api-access-vppxj\") pod \"root-account-create-update-rrt62\" (UID: \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.907739 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.902105392 podStartE2EDuration="3.907714758s" podCreationTimestamp="2026-03-08 19:51:30 +0000 UTC" firstStartedPulling="2026-03-08 19:51:31.686657761 +0000 UTC m=+1193.082711784" lastFinishedPulling="2026-03-08 19:51:32.692267137 +0000 UTC m=+1194.088321150" observedRunningTime="2026-03-08 19:51:33.882651461 +0000 UTC m=+1195.278705494" watchObservedRunningTime="2026-03-08 19:51:33.907714758 +0000 UTC m=+1195.303768821" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.996556 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-operator-scripts\") pod \"root-account-create-update-rrt62\" (UID: \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.997471 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-operator-scripts\") pod \"root-account-create-update-rrt62\" (UID: \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:33 crc kubenswrapper[4885]: I0308 19:51:33.998078 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vppxj\" (UniqueName: \"kubernetes.io/projected/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-kube-api-access-vppxj\") pod \"root-account-create-update-rrt62\" (UID: \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:34 crc kubenswrapper[4885]: I0308 19:51:34.019305 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vppxj\" (UniqueName: \"kubernetes.io/projected/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-kube-api-access-vppxj\") pod \"root-account-create-update-rrt62\" (UID: \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:34 crc kubenswrapper[4885]: I0308 19:51:34.078577 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:34 crc kubenswrapper[4885]: I0308 19:51:34.337214 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-rrt62"] Mar 08 19:51:34 crc kubenswrapper[4885]: I0308 19:51:34.858329 4885 generic.go:334] "Generic (PLEG): container finished" podID="83db30de-c4aa-4a2f-9e5f-e4545e4ff475" containerID="d990977988383de183ee74b10460a2aef417ed74ff41f049c648f4b0922ddb17" exitCode=0 Mar 08 19:51:34 crc kubenswrapper[4885]: I0308 19:51:34.858403 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rrt62" event={"ID":"83db30de-c4aa-4a2f-9e5f-e4545e4ff475","Type":"ContainerDied","Data":"d990977988383de183ee74b10460a2aef417ed74ff41f049c648f4b0922ddb17"} Mar 08 19:51:34 crc kubenswrapper[4885]: I0308 19:51:34.858823 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rrt62" event={"ID":"83db30de-c4aa-4a2f-9e5f-e4545e4ff475","Type":"ContainerStarted","Data":"4748f37f3293a79f70bd6bd68218faa3a5c8bfcbcfa932e6390b681055f8d44c"} Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.305439 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-9jddz" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.323767 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.424733 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7cjr\" (UniqueName: \"kubernetes.io/projected/0f704685-800d-4386-a47d-8c60b0885aca-kube-api-access-t7cjr\") pod \"0f704685-800d-4386-a47d-8c60b0885aca\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.425016 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f0edc25-2cc1-4111-96e3-3807e6463d57-operator-scripts\") pod \"5f0edc25-2cc1-4111-96e3-3807e6463d57\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.425176 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f704685-800d-4386-a47d-8c60b0885aca-operator-scripts\") pod \"0f704685-800d-4386-a47d-8c60b0885aca\" (UID: \"0f704685-800d-4386-a47d-8c60b0885aca\") " Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.425371 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrjtg\" (UniqueName: \"kubernetes.io/projected/5f0edc25-2cc1-4111-96e3-3807e6463d57-kube-api-access-vrjtg\") pod \"5f0edc25-2cc1-4111-96e3-3807e6463d57\" (UID: \"5f0edc25-2cc1-4111-96e3-3807e6463d57\") " Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.425560 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f704685-800d-4386-a47d-8c60b0885aca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0f704685-800d-4386-a47d-8c60b0885aca" (UID: "0f704685-800d-4386-a47d-8c60b0885aca"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.425702 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f0edc25-2cc1-4111-96e3-3807e6463d57-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f0edc25-2cc1-4111-96e3-3807e6463d57" (UID: "5f0edc25-2cc1-4111-96e3-3807e6463d57"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.426406 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0f704685-800d-4386-a47d-8c60b0885aca-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.426491 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f0edc25-2cc1-4111-96e3-3807e6463d57-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.431596 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f0edc25-2cc1-4111-96e3-3807e6463d57-kube-api-access-vrjtg" (OuterVolumeSpecName: "kube-api-access-vrjtg") pod "5f0edc25-2cc1-4111-96e3-3807e6463d57" (UID: "5f0edc25-2cc1-4111-96e3-3807e6463d57"). InnerVolumeSpecName "kube-api-access-vrjtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.436879 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f704685-800d-4386-a47d-8c60b0885aca-kube-api-access-t7cjr" (OuterVolumeSpecName: "kube-api-access-t7cjr") pod "0f704685-800d-4386-a47d-8c60b0885aca" (UID: "0f704685-800d-4386-a47d-8c60b0885aca"). InnerVolumeSpecName "kube-api-access-t7cjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.529296 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrjtg\" (UniqueName: \"kubernetes.io/projected/5f0edc25-2cc1-4111-96e3-3807e6463d57-kube-api-access-vrjtg\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.529616 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7cjr\" (UniqueName: \"kubernetes.io/projected/0f704685-800d-4386-a47d-8c60b0885aca-kube-api-access-t7cjr\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.870967 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-9jddz" event={"ID":"5f0edc25-2cc1-4111-96e3-3807e6463d57","Type":"ContainerDied","Data":"656b3cfce09ee41fa0191c5f939961281dbdfc743c98f1c8a45eaa3a9a0b60a7"} Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.871023 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="656b3cfce09ee41fa0191c5f939961281dbdfc743c98f1c8a45eaa3a9a0b60a7" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.871052 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-9jddz" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.875415 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a047-account-create-update-64r5q" event={"ID":"0f704685-800d-4386-a47d-8c60b0885aca","Type":"ContainerDied","Data":"8fba01d30b11a02729ca422508e1391564970bb51e5260ee86f36c1c12c7cf17"} Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.875459 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fba01d30b11a02729ca422508e1391564970bb51e5260ee86f36c1c12c7cf17" Mar 08 19:51:35 crc kubenswrapper[4885]: I0308 19:51:35.875587 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a047-account-create-update-64r5q" Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.310549 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.447580 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-operator-scripts\") pod \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\" (UID: \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.447667 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vppxj\" (UniqueName: \"kubernetes.io/projected/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-kube-api-access-vppxj\") pod \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\" (UID: \"83db30de-c4aa-4a2f-9e5f-e4545e4ff475\") " Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.450261 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "83db30de-c4aa-4a2f-9e5f-e4545e4ff475" (UID: "83db30de-c4aa-4a2f-9e5f-e4545e4ff475"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.459213 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-kube-api-access-vppxj" (OuterVolumeSpecName: "kube-api-access-vppxj") pod "83db30de-c4aa-4a2f-9e5f-e4545e4ff475" (UID: "83db30de-c4aa-4a2f-9e5f-e4545e4ff475"). InnerVolumeSpecName "kube-api-access-vppxj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.551316 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.551372 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vppxj\" (UniqueName: \"kubernetes.io/projected/83db30de-c4aa-4a2f-9e5f-e4545e4ff475-kube-api-access-vppxj\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.884485 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-rrt62" event={"ID":"83db30de-c4aa-4a2f-9e5f-e4545e4ff475","Type":"ContainerDied","Data":"4748f37f3293a79f70bd6bd68218faa3a5c8bfcbcfa932e6390b681055f8d44c"} Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.884970 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4748f37f3293a79f70bd6bd68218faa3a5c8bfcbcfa932e6390b681055f8d44c" Mar 08 19:51:36 crc kubenswrapper[4885]: I0308 19:51:36.884576 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-rrt62" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.310395 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-pq8mq"] Mar 08 19:51:37 crc kubenswrapper[4885]: E0308 19:51:37.310802 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83db30de-c4aa-4a2f-9e5f-e4545e4ff475" containerName="mariadb-account-create-update" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.310829 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="83db30de-c4aa-4a2f-9e5f-e4545e4ff475" containerName="mariadb-account-create-update" Mar 08 19:51:37 crc kubenswrapper[4885]: E0308 19:51:37.310851 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0edc25-2cc1-4111-96e3-3807e6463d57" containerName="mariadb-database-create" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.310864 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0edc25-2cc1-4111-96e3-3807e6463d57" containerName="mariadb-database-create" Mar 08 19:51:37 crc kubenswrapper[4885]: E0308 19:51:37.310902 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f704685-800d-4386-a47d-8c60b0885aca" containerName="mariadb-account-create-update" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.310913 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f704685-800d-4386-a47d-8c60b0885aca" containerName="mariadb-account-create-update" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.311133 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f704685-800d-4386-a47d-8c60b0885aca" containerName="mariadb-account-create-update" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.311150 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0edc25-2cc1-4111-96e3-3807e6463d57" containerName="mariadb-database-create" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.311163 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="83db30de-c4aa-4a2f-9e5f-e4545e4ff475" containerName="mariadb-account-create-update" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.311917 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.317823 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.319776 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zffrj" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.320401 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pq8mq"] Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.475080 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-db-sync-config-data\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.475274 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-combined-ca-bundle\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.475565 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfl6p\" (UniqueName: \"kubernetes.io/projected/618b5189-8b29-473f-b59c-e911fca71041-kube-api-access-sfl6p\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.475615 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-config-data\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.577957 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-config-data\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.578167 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-db-sync-config-data\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.578262 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-combined-ca-bundle\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.578462 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfl6p\" (UniqueName: \"kubernetes.io/projected/618b5189-8b29-473f-b59c-e911fca71041-kube-api-access-sfl6p\") pod 
\"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.583295 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-combined-ca-bundle\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.593385 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-db-sync-config-data\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.593713 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-config-data\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.602506 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfl6p\" (UniqueName: \"kubernetes.io/projected/618b5189-8b29-473f-b59c-e911fca71041-kube-api-access-sfl6p\") pod \"glance-db-sync-pq8mq\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.649856 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pq8mq" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.963425 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-ll64z"] Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.965002 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:37 crc kubenswrapper[4885]: I0308 19:51:37.985712 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ll64z"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.001103 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7884923-e1d5-4b4d-a285-680bfbe38277-operator-scripts\") pod \"keystone-db-create-ll64z\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") " pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.001189 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxtkz\" (UniqueName: \"kubernetes.io/projected/f7884923-e1d5-4b4d-a285-680bfbe38277-kube-api-access-nxtkz\") pod \"keystone-db-create-ll64z\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") " pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.074443 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3705-account-create-update-2brz9"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.075657 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.081908 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.082893 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3705-account-create-update-2brz9"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.103174 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7884923-e1d5-4b4d-a285-680bfbe38277-operator-scripts\") pod \"keystone-db-create-ll64z\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") " pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.103234 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxtkz\" (UniqueName: \"kubernetes.io/projected/f7884923-e1d5-4b4d-a285-680bfbe38277-kube-api-access-nxtkz\") pod \"keystone-db-create-ll64z\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") " pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.103323 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp66p\" (UniqueName: \"kubernetes.io/projected/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-kube-api-access-fp66p\") pod \"keystone-3705-account-create-update-2brz9\" (UID: \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") " pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.103377 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-operator-scripts\") pod \"keystone-3705-account-create-update-2brz9\" (UID: \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") " pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.103875 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7884923-e1d5-4b4d-a285-680bfbe38277-operator-scripts\") pod \"keystone-db-create-ll64z\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") " pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.128499 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxtkz\" (UniqueName: \"kubernetes.io/projected/f7884923-e1d5-4b4d-a285-680bfbe38277-kube-api-access-nxtkz\") pod \"keystone-db-create-ll64z\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") " pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.185320 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-25qrp"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.186432 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.197718 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-25qrp"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.204875 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g56sf\" (UniqueName: \"kubernetes.io/projected/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-kube-api-access-g56sf\") pod \"placement-db-create-25qrp\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") " pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.204946 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-operator-scripts\") pod \"placement-db-create-25qrp\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") " pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.205124 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp66p\" (UniqueName: \"kubernetes.io/projected/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-kube-api-access-fp66p\") pod \"keystone-3705-account-create-update-2brz9\" (UID: \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") " pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.205181 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-operator-scripts\") pod \"keystone-3705-account-create-update-2brz9\" (UID: \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") " pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.206270 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-operator-scripts\") pod \"keystone-3705-account-create-update-2brz9\" (UID: \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") " pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.226772 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp66p\" (UniqueName: \"kubernetes.io/projected/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-kube-api-access-fp66p\") pod \"keystone-3705-account-create-update-2brz9\" (UID: \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") " pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.241363 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pq8mq"] Mar 08 19:51:38 crc kubenswrapper[4885]: W0308 19:51:38.247619 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod618b5189_8b29_473f_b59c_e911fca71041.slice/crio-6a7111739f9460209a507bc904ec5f5cda31cf0c98c09b8cae21f39ac42b3d39 WatchSource:0}: Error finding container 6a7111739f9460209a507bc904ec5f5cda31cf0c98c09b8cae21f39ac42b3d39: Status 404 returned error can't find the container with id 6a7111739f9460209a507bc904ec5f5cda31cf0c98c09b8cae21f39ac42b3d39 Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.276229 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/placement-3284-account-create-update-qht6h"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.277139 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.280737 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.295968 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3284-account-create-update-qht6h"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.299423 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.305881 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-operator-scripts\") pod \"placement-3284-account-create-update-qht6h\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") " pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.305948 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g56sf\" (UniqueName: \"kubernetes.io/projected/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-kube-api-access-g56sf\") pod \"placement-db-create-25qrp\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") " pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.306051 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-operator-scripts\") pod \"placement-db-create-25qrp\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") " pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.306235 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqnf9\" (UniqueName: \"kubernetes.io/projected/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-kube-api-access-hqnf9\") pod \"placement-3284-account-create-update-qht6h\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") " pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.307882 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-operator-scripts\") pod \"placement-db-create-25qrp\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") " pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.326484 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g56sf\" (UniqueName: \"kubernetes.io/projected/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-kube-api-access-g56sf\") pod \"placement-db-create-25qrp\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") " pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.391047 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.407916 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-operator-scripts\") pod \"placement-3284-account-create-update-qht6h\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") " pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.408105 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqnf9\" (UniqueName: \"kubernetes.io/projected/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-kube-api-access-hqnf9\") pod \"placement-3284-account-create-update-qht6h\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") " pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.409661 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-operator-scripts\") pod \"placement-3284-account-create-update-qht6h\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") " pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.424841 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqnf9\" (UniqueName: \"kubernetes.io/projected/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-kube-api-access-hqnf9\") pod \"placement-3284-account-create-update-qht6h\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") " pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.507705 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-25qrp" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.594910 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.729412 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-ll64z"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.853626 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3705-account-create-update-2brz9"] Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.921839 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3705-account-create-update-2brz9" event={"ID":"8b3418f5-a92a-4fe6-b0ea-929b54ecb052","Type":"ContainerStarted","Data":"9b1459e7cf206ea3c6375a47a07ece269bc28d4b4c10bc223fc70b9125df1823"} Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.934848 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ll64z" event={"ID":"f7884923-e1d5-4b4d-a285-680bfbe38277","Type":"ContainerStarted","Data":"386f6aaa7f8daad0074b28ca5968bd9be45e6528e49c396b92ec2011c6026b34"} Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.936239 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pq8mq" event={"ID":"618b5189-8b29-473f-b59c-e911fca71041","Type":"ContainerStarted","Data":"6a7111739f9460209a507bc904ec5f5cda31cf0c98c09b8cae21f39ac42b3d39"} Mar 08 19:51:38 crc kubenswrapper[4885]: I0308 19:51:38.956659 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-25qrp"] Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.021337 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-rk2jx"] Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.021608 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" podUID="f432b919-7772-41bf-9113-94eefe45e347" containerName="dnsmasq-dns" containerID="cri-o://e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2" gracePeriod=10 Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.030059 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.045272 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-2cszd"] Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.046742 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.060913 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-2cszd"] Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.117346 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3284-account-create-update-qht6h"] Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.118504 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.118571 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.118601 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq487\" (UniqueName: \"kubernetes.io/projected/67aa348d-fe05-4e05-af01-a0b22d170a9b-kube-api-access-jq487\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.118654 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-config\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.118682 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-dns-svc\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.234786 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.234855 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.234900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq487\" (UniqueName: \"kubernetes.io/projected/67aa348d-fe05-4e05-af01-a0b22d170a9b-kube-api-access-jq487\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: 
\"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.234958 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-config\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.234986 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-dns-svc\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.236028 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-dns-svc\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.236739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-sb\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.238046 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.239205 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-config\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.445009 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq487\" (UniqueName: \"kubernetes.io/projected/67aa348d-fe05-4e05-af01-a0b22d170a9b-kube-api-access-jq487\") pod \"dnsmasq-dns-675f7dd995-2cszd\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.534555 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.725793 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.847989 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-ovsdbserver-nb\") pod \"f432b919-7772-41bf-9113-94eefe45e347\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.848089 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-dns-svc\") pod \"f432b919-7772-41bf-9113-94eefe45e347\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.848115 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsn2v\" (UniqueName: \"kubernetes.io/projected/f432b919-7772-41bf-9113-94eefe45e347-kube-api-access-fsn2v\") pod \"f432b919-7772-41bf-9113-94eefe45e347\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.848213 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-config\") pod \"f432b919-7772-41bf-9113-94eefe45e347\" (UID: \"f432b919-7772-41bf-9113-94eefe45e347\") " Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.855102 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f432b919-7772-41bf-9113-94eefe45e347-kube-api-access-fsn2v" (OuterVolumeSpecName: "kube-api-access-fsn2v") pod "f432b919-7772-41bf-9113-94eefe45e347" (UID: "f432b919-7772-41bf-9113-94eefe45e347"). InnerVolumeSpecName "kube-api-access-fsn2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.888942 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-config" (OuterVolumeSpecName: "config") pod "f432b919-7772-41bf-9113-94eefe45e347" (UID: "f432b919-7772-41bf-9113-94eefe45e347"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.889087 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f432b919-7772-41bf-9113-94eefe45e347" (UID: "f432b919-7772-41bf-9113-94eefe45e347"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.905121 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f432b919-7772-41bf-9113-94eefe45e347" (UID: "f432b919-7772-41bf-9113-94eefe45e347"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.950002 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.950045 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsn2v\" (UniqueName: \"kubernetes.io/projected/f432b919-7772-41bf-9113-94eefe45e347-kube-api-access-fsn2v\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.950062 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.950076 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f432b919-7772-41bf-9113-94eefe45e347-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.958707 4885 generic.go:334] "Generic (PLEG): container finished" podID="f432b919-7772-41bf-9113-94eefe45e347" containerID="e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2" exitCode=0 Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.958767 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" event={"ID":"f432b919-7772-41bf-9113-94eefe45e347","Type":"ContainerDied","Data":"e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2"} Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.958793 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" event={"ID":"f432b919-7772-41bf-9113-94eefe45e347","Type":"ContainerDied","Data":"472650bb697781b76975122a9d79fd5f4e880cc3eb0edd5811698f77d98f72e5"} Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.958811 4885 scope.go:117] "RemoveContainer" containerID="e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.958942 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6444958b7f-rk2jx" Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.966665 4885 generic.go:334] "Generic (PLEG): container finished" podID="761f5c93-2ed3-43f0-acaf-ee92d0719ec3" containerID="ca2add6996115e29bd86a097fbce1cceadad7160db189d6c7e405a523a1ccb6e" exitCode=0 Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.966747 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-25qrp" event={"ID":"761f5c93-2ed3-43f0-acaf-ee92d0719ec3","Type":"ContainerDied","Data":"ca2add6996115e29bd86a097fbce1cceadad7160db189d6c7e405a523a1ccb6e"} Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.966770 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-25qrp" event={"ID":"761f5c93-2ed3-43f0-acaf-ee92d0719ec3","Type":"ContainerStarted","Data":"0de5eb20fec0068fad7220dbd8c11cda9bad6ac30e75bf5f0e0e2846ae820c1a"} Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.968790 4885 generic.go:334] "Generic (PLEG): container finished" podID="f7884923-e1d5-4b4d-a285-680bfbe38277" containerID="b4398eab96435c81b8a2366ba9291b7b0c13edf908fc801823865f8458709b7a" exitCode=0 Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.968826 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ll64z" event={"ID":"f7884923-e1d5-4b4d-a285-680bfbe38277","Type":"ContainerDied","Data":"b4398eab96435c81b8a2366ba9291b7b0c13edf908fc801823865f8458709b7a"} Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.974660 4885 generic.go:334] "Generic (PLEG): container finished" podID="6fff4a7a-1b14-4e29-8c84-d7fc55de879c" containerID="57d8097d34b17ff81e694e75a211c6042455808aeca7d092f8501d703a78d088" exitCode=0 Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.974734 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3284-account-create-update-qht6h" event={"ID":"6fff4a7a-1b14-4e29-8c84-d7fc55de879c","Type":"ContainerDied","Data":"57d8097d34b17ff81e694e75a211c6042455808aeca7d092f8501d703a78d088"} Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.974764 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3284-account-create-update-qht6h" event={"ID":"6fff4a7a-1b14-4e29-8c84-d7fc55de879c","Type":"ContainerStarted","Data":"bd759665cb6fe07543d620b6ac74746d2ebba785ee89af9724b1dbf768d76262"} Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.980188 4885 generic.go:334] "Generic (PLEG): container finished" podID="8b3418f5-a92a-4fe6-b0ea-929b54ecb052" containerID="2ff4df6777cb04e247eca00bf1613dce65653cf286ef17867253f3e89e727d13" exitCode=0 Mar 08 19:51:39 crc kubenswrapper[4885]: I0308 19:51:39.980263 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3705-account-create-update-2brz9" event={"ID":"8b3418f5-a92a-4fe6-b0ea-929b54ecb052","Type":"ContainerDied","Data":"2ff4df6777cb04e247eca00bf1613dce65653cf286ef17867253f3e89e727d13"} Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.034776 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-2cszd"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.074564 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-rk2jx"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.080650 4885 scope.go:117] "RemoveContainer" containerID="fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0" Mar 08 19:51:40 crc kubenswrapper[4885]: 
I0308 19:51:40.082071 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6444958b7f-rk2jx"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.102448 4885 scope.go:117] "RemoveContainer" containerID="e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2" Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.104440 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2\": container with ID starting with e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2 not found: ID does not exist" containerID="e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.104488 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2"} err="failed to get container status \"e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2\": rpc error: code = NotFound desc = could not find container \"e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2\": container with ID starting with e8adb86a90e44c391965f743d310d9a68bfa148c83ca2fdd1676b9c7c2346ae2 not found: ID does not exist" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.104516 4885 scope.go:117] "RemoveContainer" containerID="fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0" Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.104860 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0\": container with ID starting with fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0 not found: ID does not exist" containerID="fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.104902 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0"} err="failed to get container status \"fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0\": rpc error: code = NotFound desc = could not find container \"fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0\": container with ID starting with fcde18adbf5a949104e5492d08fe8e8e423167f0fe6350ebae57670e6602c7b0 not found: ID does not exist" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.200710 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.201055 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f432b919-7772-41bf-9113-94eefe45e347" containerName="dnsmasq-dns" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.201067 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f432b919-7772-41bf-9113-94eefe45e347" containerName="dnsmasq-dns" Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.201089 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f432b919-7772-41bf-9113-94eefe45e347" containerName="init" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.201115 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f432b919-7772-41bf-9113-94eefe45e347" containerName="init" Mar 08 19:51:40 crc kubenswrapper[4885]: 
I0308 19:51:40.201268 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f432b919-7772-41bf-9113-94eefe45e347" containerName="dnsmasq-dns" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.209510 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.213899 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.214131 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.214155 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.214373 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-nbh5b" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.226080 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.229736 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-rrt62"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.235526 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-rrt62"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.355510 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.355576 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-lock\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.355608 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.355665 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-cache\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.355701 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22mr6\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-kube-api-access-22mr6\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.355739 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.458428 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.458526 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-cache\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.458571 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22mr6\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-kube-api-access-22mr6\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.458623 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.458671 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.458730 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-lock\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.458897 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.459041 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.459067 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.459115 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift podName:aa276a05-ab6a-4aa1-9a9f-a990dc1513bd nodeName:}" failed. No retries permitted until 2026-03-08 19:51:40.959098016 +0000 UTC m=+1202.355152039 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift") pod "swift-storage-0" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd") : configmap "swift-ring-files" not found Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.459200 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-cache\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.459234 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-lock\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.467304 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.476943 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22mr6\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-kube-api-access-22mr6\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.516549 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.806718 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-rqttf"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.807852 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.811316 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.811509 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.811704 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.825310 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-rqttf"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.834458 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-rqttf"] Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.837627 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-fffzj ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-fffzj ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-rqttf" podUID="f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.844485 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-mn5x8"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.845907 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.849108 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.863327 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mn5x8"] Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967580 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlg4w\" (UniqueName: \"kubernetes.io/projected/e4353f36-d8f9-41ff-8062-f874bd53ef12-kube-api-access-rlg4w\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967628 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-ring-data-devices\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967695 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-combined-ca-bundle\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967731 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-ring-data-devices\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967765 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-dispersionconf\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967785 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-combined-ca-bundle\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967821 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fffzj\" (UniqueName: \"kubernetes.io/projected/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-kube-api-access-fffzj\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967853 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-scripts\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967894 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-scripts\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.967971 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.968006 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4353f36-d8f9-41ff-8062-f874bd53ef12-etc-swift\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.968036 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-swiftconf\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.968056 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-swiftconf\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.968078 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-dispersionconf\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.968116 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-etc-swift\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.970219 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.970261 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 19:51:40 crc kubenswrapper[4885]: E0308 19:51:40.970316 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift podName:aa276a05-ab6a-4aa1-9a9f-a990dc1513bd nodeName:}" failed. No retries permitted until 2026-03-08 19:51:41.970297302 +0000 UTC m=+1203.366351325 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift") pod "swift-storage-0" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd") : configmap "swift-ring-files" not found Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.989688 4885 generic.go:334] "Generic (PLEG): container finished" podID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerID="b8826682ae559379d101397fb94513059f3cfbd38258fd8e20b0bbd2e14276d1" exitCode=0 Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.990177 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.989888 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" event={"ID":"67aa348d-fe05-4e05-af01-a0b22d170a9b","Type":"ContainerDied","Data":"b8826682ae559379d101397fb94513059f3cfbd38258fd8e20b0bbd2e14276d1"} Mar 08 19:51:40 crc kubenswrapper[4885]: I0308 19:51:40.990766 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" event={"ID":"67aa348d-fe05-4e05-af01-a0b22d170a9b","Type":"ContainerStarted","Data":"e012096eca7d05b70c67ac3ab0c256b21f51521655fd61aa44ededf7f87ad72c"} Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.002964 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.069850 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-scripts\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070201 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4353f36-d8f9-41ff-8062-f874bd53ef12-etc-swift\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070229 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-swiftconf\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070245 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-swiftconf\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070264 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-dispersionconf\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070288 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-etc-swift\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070328 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlg4w\" (UniqueName: \"kubernetes.io/projected/e4353f36-d8f9-41ff-8062-f874bd53ef12-kube-api-access-rlg4w\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070364 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-ring-data-devices\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070382 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-combined-ca-bundle\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 
19:51:41.070402 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-ring-data-devices\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070423 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-dispersionconf\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070441 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-combined-ca-bundle\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070462 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fffzj\" (UniqueName: \"kubernetes.io/projected/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-kube-api-access-fffzj\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070478 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-scripts\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.070707 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-scripts\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.071158 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-scripts\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.071219 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4353f36-d8f9-41ff-8062-f874bd53ef12-etc-swift\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.071634 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-ring-data-devices\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.073820 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-etc-swift\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.074860 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-combined-ca-bundle\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.075532 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-ring-data-devices\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.076441 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-swiftconf\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.079065 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-swiftconf\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.083099 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-dispersionconf\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.088642 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-dispersionconf\") pod \"swift-ring-rebalance-mn5x8\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.090158 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-combined-ca-bundle\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.091510 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fffzj\" (UniqueName: \"kubernetes.io/projected/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-kube-api-access-fffzj\") pod \"swift-ring-rebalance-rqttf\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.103356 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlg4w\" (UniqueName: \"kubernetes.io/projected/e4353f36-d8f9-41ff-8062-f874bd53ef12-kube-api-access-rlg4w\") pod \"swift-ring-rebalance-mn5x8\" (UID: 
\"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.166554 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.171370 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-ring-data-devices\") pod \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.171556 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-scripts\") pod \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.173133 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" (UID: "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.174484 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-scripts" (OuterVolumeSpecName: "scripts") pod "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" (UID: "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.272748 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fffzj\" (UniqueName: \"kubernetes.io/projected/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-kube-api-access-fffzj\") pod \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.273071 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-dispersionconf\") pod \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.273130 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-swiftconf\") pod \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.273236 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-etc-swift\") pod \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.273308 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-combined-ca-bundle\") pod \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\" (UID: \"f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.273703 4885 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.273714 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.274846 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" (UID: "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.280233 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" (UID: "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.280420 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-kube-api-access-fffzj" (OuterVolumeSpecName: "kube-api-access-fffzj") pod "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" (UID: "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d"). InnerVolumeSpecName "kube-api-access-fffzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.281033 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" (UID: "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.289073 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" (UID: "f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.313542 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.375247 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.375279 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fffzj\" (UniqueName: \"kubernetes.io/projected/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-kube-api-access-fffzj\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.375289 4885 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.375299 4885 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.375307 4885 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.426668 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83db30de-c4aa-4a2f-9e5f-e4545e4ff475" path="/var/lib/kubelet/pods/83db30de-c4aa-4a2f-9e5f-e4545e4ff475/volumes" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.427318 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f432b919-7772-41bf-9113-94eefe45e347" path="/var/lib/kubelet/pods/f432b919-7772-41bf-9113-94eefe45e347/volumes" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.482435 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-nxtkz\" (UniqueName: \"kubernetes.io/projected/f7884923-e1d5-4b4d-a285-680bfbe38277-kube-api-access-nxtkz\") pod \"f7884923-e1d5-4b4d-a285-680bfbe38277\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.482499 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7884923-e1d5-4b4d-a285-680bfbe38277-operator-scripts\") pod \"f7884923-e1d5-4b4d-a285-680bfbe38277\" (UID: \"f7884923-e1d5-4b4d-a285-680bfbe38277\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.496180 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7884923-e1d5-4b4d-a285-680bfbe38277-kube-api-access-nxtkz" (OuterVolumeSpecName: "kube-api-access-nxtkz") pod "f7884923-e1d5-4b4d-a285-680bfbe38277" (UID: "f7884923-e1d5-4b4d-a285-680bfbe38277"). InnerVolumeSpecName "kube-api-access-nxtkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.496314 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7884923-e1d5-4b4d-a285-680bfbe38277-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f7884923-e1d5-4b4d-a285-680bfbe38277" (UID: "f7884923-e1d5-4b4d-a285-680bfbe38277"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.522217 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.530650 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-25qrp" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.546553 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.596148 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxtkz\" (UniqueName: \"kubernetes.io/projected/f7884923-e1d5-4b4d-a285-680bfbe38277-kube-api-access-nxtkz\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.596191 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f7884923-e1d5-4b4d-a285-680bfbe38277-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.698266 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp66p\" (UniqueName: \"kubernetes.io/projected/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-kube-api-access-fp66p\") pod \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\" (UID: \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.698410 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-operator-scripts\") pod \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.698455 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-operator-scripts\") pod \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.698499 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqnf9\" (UniqueName: \"kubernetes.io/projected/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-kube-api-access-hqnf9\") pod \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\" (UID: \"6fff4a7a-1b14-4e29-8c84-d7fc55de879c\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.698519 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g56sf\" (UniqueName: \"kubernetes.io/projected/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-kube-api-access-g56sf\") pod \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\" (UID: \"761f5c93-2ed3-43f0-acaf-ee92d0719ec3\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.698578 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-operator-scripts\") pod \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\" (UID: \"8b3418f5-a92a-4fe6-b0ea-929b54ecb052\") " Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.699190 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6fff4a7a-1b14-4e29-8c84-d7fc55de879c" (UID: "6fff4a7a-1b14-4e29-8c84-d7fc55de879c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.699456 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8b3418f5-a92a-4fe6-b0ea-929b54ecb052" (UID: "8b3418f5-a92a-4fe6-b0ea-929b54ecb052"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.699972 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "761f5c93-2ed3-43f0-acaf-ee92d0719ec3" (UID: "761f5c93-2ed3-43f0-acaf-ee92d0719ec3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.702007 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-kube-api-access-hqnf9" (OuterVolumeSpecName: "kube-api-access-hqnf9") pod "6fff4a7a-1b14-4e29-8c84-d7fc55de879c" (UID: "6fff4a7a-1b14-4e29-8c84-d7fc55de879c"). InnerVolumeSpecName "kube-api-access-hqnf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.703085 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-kube-api-access-g56sf" (OuterVolumeSpecName: "kube-api-access-g56sf") pod "761f5c93-2ed3-43f0-acaf-ee92d0719ec3" (UID: "761f5c93-2ed3-43f0-acaf-ee92d0719ec3"). InnerVolumeSpecName "kube-api-access-g56sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.703587 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-kube-api-access-fp66p" (OuterVolumeSpecName: "kube-api-access-fp66p") pod "8b3418f5-a92a-4fe6-b0ea-929b54ecb052" (UID: "8b3418f5-a92a-4fe6-b0ea-929b54ecb052"). InnerVolumeSpecName "kube-api-access-fp66p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.707821 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-mn5x8"] Mar 08 19:51:41 crc kubenswrapper[4885]: W0308 19:51:41.711333 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4353f36_d8f9_41ff_8062_f874bd53ef12.slice/crio-d73244ef927ebe753f9e2ed770210ef41d35892b5f2baaac3443b9d3c958c66e WatchSource:0}: Error finding container d73244ef927ebe753f9e2ed770210ef41d35892b5f2baaac3443b9d3c958c66e: Status 404 returned error can't find the container with id d73244ef927ebe753f9e2ed770210ef41d35892b5f2baaac3443b9d3c958c66e Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.800190 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.800210 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp66p\" (UniqueName: \"kubernetes.io/projected/8b3418f5-a92a-4fe6-b0ea-929b54ecb052-kube-api-access-fp66p\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.800221 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.800231 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.800239 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqnf9\" (UniqueName: \"kubernetes.io/projected/6fff4a7a-1b14-4e29-8c84-d7fc55de879c-kube-api-access-hqnf9\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:41 crc kubenswrapper[4885]: I0308 19:51:41.800246 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g56sf\" (UniqueName: \"kubernetes.io/projected/761f5c93-2ed3-43f0-acaf-ee92d0719ec3-kube-api-access-g56sf\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:41.999213 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-25qrp" event={"ID":"761f5c93-2ed3-43f0-acaf-ee92d0719ec3","Type":"ContainerDied","Data":"0de5eb20fec0068fad7220dbd8c11cda9bad6ac30e75bf5f0e0e2846ae820c1a"} Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:41.999280 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0de5eb20fec0068fad7220dbd8c11cda9bad6ac30e75bf5f0e0e2846ae820c1a" Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:41.999318 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-25qrp" Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.000821 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-ll64z" event={"ID":"f7884923-e1d5-4b4d-a285-680bfbe38277","Type":"ContainerDied","Data":"386f6aaa7f8daad0074b28ca5968bd9be45e6528e49c396b92ec2011c6026b34"} Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.000856 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="386f6aaa7f8daad0074b28ca5968bd9be45e6528e49c396b92ec2011c6026b34" Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.000861 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-ll64z" Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.003687 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:42 crc kubenswrapper[4885]: E0308 19:51:42.003862 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 19:51:42 crc kubenswrapper[4885]: E0308 19:51:42.003885 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 19:51:42 crc kubenswrapper[4885]: E0308 19:51:42.003955 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift podName:aa276a05-ab6a-4aa1-9a9f-a990dc1513bd nodeName:}" failed. No retries permitted until 2026-03-08 19:51:44.003934245 +0000 UTC m=+1205.399988268 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift") pod "swift-storage-0" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd") : configmap "swift-ring-files" not found Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.016056 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" event={"ID":"67aa348d-fe05-4e05-af01-a0b22d170a9b","Type":"ContainerStarted","Data":"701f88d27bc0253f9783e551e39a9c634999051422220378d318c698c63ae0f5"} Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.016320 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.019256 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3284-account-create-update-qht6h" event={"ID":"6fff4a7a-1b14-4e29-8c84-d7fc55de879c","Type":"ContainerDied","Data":"bd759665cb6fe07543d620b6ac74746d2ebba785ee89af9724b1dbf768d76262"} Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.019316 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd759665cb6fe07543d620b6ac74746d2ebba785ee89af9724b1dbf768d76262" Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.019371 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-3284-account-create-update-qht6h" Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.028165 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3705-account-create-update-2brz9" event={"ID":"8b3418f5-a92a-4fe6-b0ea-929b54ecb052","Type":"ContainerDied","Data":"9b1459e7cf206ea3c6375a47a07ece269bc28d4b4c10bc223fc70b9125df1823"} Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.028501 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b1459e7cf206ea3c6375a47a07ece269bc28d4b4c10bc223fc70b9125df1823" Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.028564 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3705-account-create-update-2brz9" Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.031499 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-rqttf" Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.031592 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mn5x8" event={"ID":"e4353f36-d8f9-41ff-8062-f874bd53ef12","Type":"ContainerStarted","Data":"d73244ef927ebe753f9e2ed770210ef41d35892b5f2baaac3443b9d3c958c66e"} Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.038748 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" podStartSLOduration=3.03873029 podStartE2EDuration="3.03873029s" podCreationTimestamp="2026-03-08 19:51:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:51:42.035488554 +0000 UTC m=+1203.431542587" watchObservedRunningTime="2026-03-08 19:51:42.03873029 +0000 UTC m=+1203.434784313" Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.086014 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-rqttf"] Mar 08 19:51:42 crc kubenswrapper[4885]: I0308 19:51:42.092346 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-rqttf"] Mar 08 19:51:43 crc kubenswrapper[4885]: I0308 19:51:43.376553 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d" path="/var/lib/kubelet/pods/f6b9e7ed-0e10-4002-a9b2-57c55a1dcf6d/volumes" Mar 08 19:51:44 crc kubenswrapper[4885]: I0308 19:51:44.044632 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:44 crc kubenswrapper[4885]: E0308 19:51:44.044778 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 19:51:44 crc kubenswrapper[4885]: E0308 19:51:44.045457 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 19:51:44 crc kubenswrapper[4885]: E0308 19:51:44.045510 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift podName:aa276a05-ab6a-4aa1-9a9f-a990dc1513bd nodeName:}" failed. 
No retries permitted until 2026-03-08 19:51:48.045493935 +0000 UTC m=+1209.441547958 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift") pod "swift-storage-0" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd") : configmap "swift-ring-files" not found Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.212941 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2b5kk"] Mar 08 19:51:45 crc kubenswrapper[4885]: E0308 19:51:45.213249 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b3418f5-a92a-4fe6-b0ea-929b54ecb052" containerName="mariadb-account-create-update" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213261 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3418f5-a92a-4fe6-b0ea-929b54ecb052" containerName="mariadb-account-create-update" Mar 08 19:51:45 crc kubenswrapper[4885]: E0308 19:51:45.213287 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fff4a7a-1b14-4e29-8c84-d7fc55de879c" containerName="mariadb-account-create-update" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213293 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fff4a7a-1b14-4e29-8c84-d7fc55de879c" containerName="mariadb-account-create-update" Mar 08 19:51:45 crc kubenswrapper[4885]: E0308 19:51:45.213305 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7884923-e1d5-4b4d-a285-680bfbe38277" containerName="mariadb-database-create" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213312 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7884923-e1d5-4b4d-a285-680bfbe38277" containerName="mariadb-database-create" Mar 08 19:51:45 crc kubenswrapper[4885]: E0308 19:51:45.213324 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="761f5c93-2ed3-43f0-acaf-ee92d0719ec3" containerName="mariadb-database-create" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213330 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="761f5c93-2ed3-43f0-acaf-ee92d0719ec3" containerName="mariadb-database-create" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213463 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7884923-e1d5-4b4d-a285-680bfbe38277" containerName="mariadb-database-create" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213480 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="761f5c93-2ed3-43f0-acaf-ee92d0719ec3" containerName="mariadb-database-create" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213490 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fff4a7a-1b14-4e29-8c84-d7fc55de879c" containerName="mariadb-account-create-update" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.213503 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b3418f5-a92a-4fe6-b0ea-929b54ecb052" containerName="mariadb-account-create-update" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.214532 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2b5kk" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.216512 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.221257 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2b5kk"] Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.272285 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b6d8115-d92a-4305-a2d2-8d9874a81390-operator-scripts\") pod \"root-account-create-update-2b5kk\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " pod="openstack/root-account-create-update-2b5kk" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.272398 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nr5k\" (UniqueName: \"kubernetes.io/projected/3b6d8115-d92a-4305-a2d2-8d9874a81390-kube-api-access-9nr5k\") pod \"root-account-create-update-2b5kk\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " pod="openstack/root-account-create-update-2b5kk" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.374462 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b6d8115-d92a-4305-a2d2-8d9874a81390-operator-scripts\") pod \"root-account-create-update-2b5kk\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " pod="openstack/root-account-create-update-2b5kk" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.374539 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nr5k\" (UniqueName: \"kubernetes.io/projected/3b6d8115-d92a-4305-a2d2-8d9874a81390-kube-api-access-9nr5k\") pod \"root-account-create-update-2b5kk\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " pod="openstack/root-account-create-update-2b5kk" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.375265 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b6d8115-d92a-4305-a2d2-8d9874a81390-operator-scripts\") pod \"root-account-create-update-2b5kk\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " pod="openstack/root-account-create-update-2b5kk" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.392418 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nr5k\" (UniqueName: \"kubernetes.io/projected/3b6d8115-d92a-4305-a2d2-8d9874a81390-kube-api-access-9nr5k\") pod \"root-account-create-update-2b5kk\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " pod="openstack/root-account-create-update-2b5kk" Mar 08 19:51:45 crc kubenswrapper[4885]: I0308 19:51:45.580207 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2b5kk" Mar 08 19:51:46 crc kubenswrapper[4885]: I0308 19:51:46.070140 4885 generic.go:334] "Generic (PLEG): container finished" podID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerID="67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0" exitCode=0 Mar 08 19:51:46 crc kubenswrapper[4885]: I0308 19:51:46.070209 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"96257eac-42ec-44cf-80be-9be68c0ebb1b","Type":"ContainerDied","Data":"67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0"} Mar 08 19:51:46 crc kubenswrapper[4885]: I0308 19:51:46.072734 4885 generic.go:334] "Generic (PLEG): container finished" podID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerID="f7d40d12aee399534fa9d02af86ea25978b99ea1398acccdac988f16615d42dd" exitCode=0 Mar 08 19:51:46 crc kubenswrapper[4885]: I0308 19:51:46.072803 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01dc1fd5-4e2f-4129-9452-ed50fa1d182b","Type":"ContainerDied","Data":"f7d40d12aee399534fa9d02af86ea25978b99ea1398acccdac988f16615d42dd"} Mar 08 19:51:48 crc kubenswrapper[4885]: I0308 19:51:48.125790 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:48 crc kubenswrapper[4885]: E0308 19:51:48.126003 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 19:51:48 crc kubenswrapper[4885]: E0308 19:51:48.126223 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 19:51:48 crc kubenswrapper[4885]: E0308 19:51:48.126284 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift podName:aa276a05-ab6a-4aa1-9a9f-a990dc1513bd nodeName:}" failed. No retries permitted until 2026-03-08 19:51:56.126266942 +0000 UTC m=+1217.522320955 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift") pod "swift-storage-0" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd") : configmap "swift-ring-files" not found Mar 08 19:51:49 crc kubenswrapper[4885]: I0308 19:51:49.537259 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:51:49 crc kubenswrapper[4885]: I0308 19:51:49.602527 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-fdwqp"] Mar 08 19:51:49 crc kubenswrapper[4885]: I0308 19:51:49.602742 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerName="dnsmasq-dns" containerID="cri-o://b0f30ec931ed12658474aee25cc28d8b7b4c49ada3c642c748084351a40ddd97" gracePeriod=10 Mar 08 19:51:50 crc kubenswrapper[4885]: E0308 19:51:50.575664 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod201fa134_20f7_4902_8fd4_ba352e7f4e95.slice/crio-conmon-b0f30ec931ed12658474aee25cc28d8b7b4c49ada3c642c748084351a40ddd97.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:51:50 crc kubenswrapper[4885]: I0308 19:51:50.846172 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Mar 08 19:51:51 crc kubenswrapper[4885]: I0308 19:51:51.133882 4885 generic.go:334] "Generic (PLEG): container finished" podID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerID="b0f30ec931ed12658474aee25cc28d8b7b4c49ada3c642c748084351a40ddd97" exitCode=0 Mar 08 19:51:51 crc kubenswrapper[4885]: I0308 19:51:51.133994 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" event={"ID":"201fa134-20f7-4902-8fd4-ba352e7f4e95","Type":"ContainerDied","Data":"b0f30ec931ed12658474aee25cc28d8b7b4c49ada3c642c748084351a40ddd97"} Mar 08 19:51:51 crc kubenswrapper[4885]: I0308 19:51:51.477500 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 08 19:51:53 crc kubenswrapper[4885]: I0308 19:51:53.428989 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mn4lz" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerName="ovn-controller" probeResult="failure" output=< Mar 08 19:51:53 crc kubenswrapper[4885]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 08 19:51:53 crc kubenswrapper[4885]: > Mar 08 19:51:55 crc kubenswrapper[4885]: E0308 19:51:55.282190 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api@sha256:ed912eee9adeda5c44804688cc7661695a42ab1a40fa46b28bdc819cefa98f07" Mar 08 19:51:55 crc kubenswrapper[4885]: E0308 19:51:55.282789 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api@sha256:ed912eee9adeda5c44804688cc7661695a42ab1a40fa46b28bdc819cefa98f07,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sfl6p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-pq8mq_openstack(618b5189-8b29-473f-b59c-e911fca71041): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:51:55 crc kubenswrapper[4885]: E0308 19:51:55.284055 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-pq8mq" podUID="618b5189-8b29-473f-b59c-e911fca71041" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.584286 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.667010 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-nb\") pod \"201fa134-20f7-4902-8fd4-ba352e7f4e95\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.667228 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-dns-svc\") pod \"201fa134-20f7-4902-8fd4-ba352e7f4e95\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.667259 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-config\") pod \"201fa134-20f7-4902-8fd4-ba352e7f4e95\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.667393 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-sb\") pod \"201fa134-20f7-4902-8fd4-ba352e7f4e95\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.667515 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jm7q\" (UniqueName: \"kubernetes.io/projected/201fa134-20f7-4902-8fd4-ba352e7f4e95-kube-api-access-6jm7q\") pod \"201fa134-20f7-4902-8fd4-ba352e7f4e95\" (UID: \"201fa134-20f7-4902-8fd4-ba352e7f4e95\") " Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.671682 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/201fa134-20f7-4902-8fd4-ba352e7f4e95-kube-api-access-6jm7q" (OuterVolumeSpecName: "kube-api-access-6jm7q") pod "201fa134-20f7-4902-8fd4-ba352e7f4e95" (UID: "201fa134-20f7-4902-8fd4-ba352e7f4e95"). InnerVolumeSpecName "kube-api-access-6jm7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.700657 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "201fa134-20f7-4902-8fd4-ba352e7f4e95" (UID: "201fa134-20f7-4902-8fd4-ba352e7f4e95"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.709004 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "201fa134-20f7-4902-8fd4-ba352e7f4e95" (UID: "201fa134-20f7-4902-8fd4-ba352e7f4e95"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.710450 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "201fa134-20f7-4902-8fd4-ba352e7f4e95" (UID: "201fa134-20f7-4902-8fd4-ba352e7f4e95"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.712778 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-config" (OuterVolumeSpecName: "config") pod "201fa134-20f7-4902-8fd4-ba352e7f4e95" (UID: "201fa134-20f7-4902-8fd4-ba352e7f4e95"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.769409 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.769447 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jm7q\" (UniqueName: \"kubernetes.io/projected/201fa134-20f7-4902-8fd4-ba352e7f4e95-kube-api-access-6jm7q\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.769465 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.769477 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.769489 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/201fa134-20f7-4902-8fd4-ba352e7f4e95-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:55 crc kubenswrapper[4885]: I0308 19:51:55.825284 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2b5kk"] Mar 08 19:51:55 crc kubenswrapper[4885]: W0308 19:51:55.828368 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b6d8115_d92a_4305_a2d2_8d9874a81390.slice/crio-bfde1ff561455fe922891fc2eec8ca08153ecdae99eec4ff5ab4d833b2c5cacb WatchSource:0}: Error finding container bfde1ff561455fe922891fc2eec8ca08153ecdae99eec4ff5ab4d833b2c5cacb: Status 404 returned error can't find the container with id bfde1ff561455fe922891fc2eec8ca08153ecdae99eec4ff5ab4d833b2c5cacb Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.176301 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:51:56 crc kubenswrapper[4885]: E0308 19:51:56.176544 4885 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 08 19:51:56 crc kubenswrapper[4885]: E0308 19:51:56.176581 4885 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 08 19:51:56 crc kubenswrapper[4885]: E0308 19:51:56.176642 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift podName:aa276a05-ab6a-4aa1-9a9f-a990dc1513bd nodeName:}" failed. 
No retries permitted until 2026-03-08 19:52:12.176620897 +0000 UTC m=+1233.572674940 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift") pod "swift-storage-0" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd") : configmap "swift-ring-files" not found Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.191761 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"96257eac-42ec-44cf-80be-9be68c0ebb1b","Type":"ContainerStarted","Data":"c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e"} Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.192073 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.195522 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01dc1fd5-4e2f-4129-9452-ed50fa1d182b","Type":"ContainerStarted","Data":"ab00747eae0e5726409cc3faafb18065815833a98680950d3e1962529cb0f73d"} Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.195757 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.197700 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mn5x8" event={"ID":"e4353f36-d8f9-41ff-8062-f874bd53ef12","Type":"ContainerStarted","Data":"2f31946378ed0ae4efcfd55a18f638cc84b0a18f97193739711ef28dac2174f9"} Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.200108 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" event={"ID":"201fa134-20f7-4902-8fd4-ba352e7f4e95","Type":"ContainerDied","Data":"4817fdc2e93c45edc661de711d2e74d24fa8c501db17ccf9809b6e4e92461549"} Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.200149 4885 scope.go:117] "RemoveContainer" containerID="b0f30ec931ed12658474aee25cc28d8b7b4c49ada3c642c748084351a40ddd97" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.200412 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b57d9888c-fdwqp" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.201887 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2b5kk" event={"ID":"3b6d8115-d92a-4305-a2d2-8d9874a81390","Type":"ContainerStarted","Data":"58b318e6af3a5db8b09b96a9de226a379d7375fee61bd37b949548ceef13806c"} Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.201974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2b5kk" event={"ID":"3b6d8115-d92a-4305-a2d2-8d9874a81390","Type":"ContainerStarted","Data":"bfde1ff561455fe922891fc2eec8ca08153ecdae99eec4ff5ab4d833b2c5cacb"} Mar 08 19:51:56 crc kubenswrapper[4885]: E0308 19:51:56.204370 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api@sha256:ed912eee9adeda5c44804688cc7661695a42ab1a40fa46b28bdc819cefa98f07\\\"\"" pod="openstack/glance-db-sync-pq8mq" podUID="618b5189-8b29-473f-b59c-e911fca71041" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.228562 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=46.851361303 podStartE2EDuration="1m4.228535369s" podCreationTimestamp="2026-03-08 19:50:52 +0000 UTC" firstStartedPulling="2026-03-08 19:50:54.394218989 +0000 UTC m=+1155.790273012" lastFinishedPulling="2026-03-08 19:51:11.771393025 +0000 UTC m=+1173.167447078" observedRunningTime="2026-03-08 19:51:56.227787999 +0000 UTC m=+1217.623842032" watchObservedRunningTime="2026-03-08 19:51:56.228535369 +0000 UTC m=+1217.624589422" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.236862 4885 scope.go:117] "RemoveContainer" containerID="e6cf715e1922fcc27c66130c6d2113c75ad20d6e5d12122260396ed02a84d181" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.263954 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-mn5x8" podStartSLOduration=2.611882495 podStartE2EDuration="16.263934311s" podCreationTimestamp="2026-03-08 19:51:40 +0000 UTC" firstStartedPulling="2026-03-08 19:51:41.713397261 +0000 UTC m=+1203.109451284" lastFinishedPulling="2026-03-08 19:51:55.365449047 +0000 UTC m=+1216.761503100" observedRunningTime="2026-03-08 19:51:56.260422218 +0000 UTC m=+1217.656476251" watchObservedRunningTime="2026-03-08 19:51:56.263934311 +0000 UTC m=+1217.659988344" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.283931 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-2b5kk" podStartSLOduration=11.283899973 podStartE2EDuration="11.283899973s" podCreationTimestamp="2026-03-08 19:51:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:51:56.277696907 +0000 UTC m=+1217.673750940" watchObservedRunningTime="2026-03-08 19:51:56.283899973 +0000 UTC m=+1217.679953996" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.314432 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=46.289334076 podStartE2EDuration="1m5.314413314s" podCreationTimestamp="2026-03-08 19:50:51 +0000 UTC" firstStartedPulling="2026-03-08 19:50:53.527935881 +0000 UTC m=+1154.923989904" 
lastFinishedPulling="2026-03-08 19:51:12.553015119 +0000 UTC m=+1173.949069142" observedRunningTime="2026-03-08 19:51:56.308350373 +0000 UTC m=+1217.704404396" watchObservedRunningTime="2026-03-08 19:51:56.314413314 +0000 UTC m=+1217.710467337" Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.355900 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-fdwqp"] Mar 08 19:51:56 crc kubenswrapper[4885]: I0308 19:51:56.364767 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b57d9888c-fdwqp"] Mar 08 19:51:57 crc kubenswrapper[4885]: I0308 19:51:57.213059 4885 generic.go:334] "Generic (PLEG): container finished" podID="3b6d8115-d92a-4305-a2d2-8d9874a81390" containerID="58b318e6af3a5db8b09b96a9de226a379d7375fee61bd37b949548ceef13806c" exitCode=0 Mar 08 19:51:57 crc kubenswrapper[4885]: I0308 19:51:57.213106 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2b5kk" event={"ID":"3b6d8115-d92a-4305-a2d2-8d9874a81390","Type":"ContainerDied","Data":"58b318e6af3a5db8b09b96a9de226a379d7375fee61bd37b949548ceef13806c"} Mar 08 19:51:57 crc kubenswrapper[4885]: I0308 19:51:57.385768 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" path="/var/lib/kubelet/pods/201fa134-20f7-4902-8fd4-ba352e7f4e95/volumes" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.397484 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-mn4lz" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerName="ovn-controller" probeResult="failure" output=< Mar 08 19:51:58 crc kubenswrapper[4885]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 08 19:51:58 crc kubenswrapper[4885]: > Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.464185 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.471803 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.602644 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-2b5kk" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.690102 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mn4lz-config-hxzwd"] Mar 08 19:51:58 crc kubenswrapper[4885]: E0308 19:51:58.690387 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b6d8115-d92a-4305-a2d2-8d9874a81390" containerName="mariadb-account-create-update" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.690404 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b6d8115-d92a-4305-a2d2-8d9874a81390" containerName="mariadb-account-create-update" Mar 08 19:51:58 crc kubenswrapper[4885]: E0308 19:51:58.690413 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerName="dnsmasq-dns" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.690421 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerName="dnsmasq-dns" Mar 08 19:51:58 crc kubenswrapper[4885]: E0308 19:51:58.690434 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerName="init" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.690442 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerName="init" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.690595 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b6d8115-d92a-4305-a2d2-8d9874a81390" containerName="mariadb-account-create-update" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.690604 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="201fa134-20f7-4902-8fd4-ba352e7f4e95" containerName="dnsmasq-dns" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.691123 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.693135 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.705915 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mn4lz-config-hxzwd"] Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.726978 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b6d8115-d92a-4305-a2d2-8d9874a81390-operator-scripts\") pod \"3b6d8115-d92a-4305-a2d2-8d9874a81390\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.727151 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nr5k\" (UniqueName: \"kubernetes.io/projected/3b6d8115-d92a-4305-a2d2-8d9874a81390-kube-api-access-9nr5k\") pod \"3b6d8115-d92a-4305-a2d2-8d9874a81390\" (UID: \"3b6d8115-d92a-4305-a2d2-8d9874a81390\") " Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.728011 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b6d8115-d92a-4305-a2d2-8d9874a81390-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b6d8115-d92a-4305-a2d2-8d9874a81390" (UID: "3b6d8115-d92a-4305-a2d2-8d9874a81390"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.733192 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b6d8115-d92a-4305-a2d2-8d9874a81390-kube-api-access-9nr5k" (OuterVolumeSpecName: "kube-api-access-9nr5k") pod "3b6d8115-d92a-4305-a2d2-8d9874a81390" (UID: "3b6d8115-d92a-4305-a2d2-8d9874a81390"). InnerVolumeSpecName "kube-api-access-9nr5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.828931 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-log-ovn\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.829000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run-ovn\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.829029 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-additional-scripts\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.829077 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-scripts\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.829119 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j46k\" (UniqueName: \"kubernetes.io/projected/b08aa6bb-932f-4790-a637-f3667471149c-kube-api-access-4j46k\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.829141 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.829197 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nr5k\" (UniqueName: \"kubernetes.io/projected/3b6d8115-d92a-4305-a2d2-8d9874a81390-kube-api-access-9nr5k\") on node \"crc\" DevicePath \"\"" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.829212 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b6d8115-d92a-4305-a2d2-8d9874a81390-operator-scripts\") on node \"crc\" 
DevicePath \"\"" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.930785 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-additional-scripts\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.930977 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-scripts\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931098 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j46k\" (UniqueName: \"kubernetes.io/projected/b08aa6bb-932f-4790-a637-f3667471149c-kube-api-access-4j46k\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931151 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931259 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-log-ovn\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931324 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run-ovn\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931614 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run-ovn\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931629 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-log-ovn\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931611 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 
19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.931985 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-additional-scripts\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.932733 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-scripts\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:58 crc kubenswrapper[4885]: I0308 19:51:58.956304 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j46k\" (UniqueName: \"kubernetes.io/projected/b08aa6bb-932f-4790-a637-f3667471149c-kube-api-access-4j46k\") pod \"ovn-controller-mn4lz-config-hxzwd\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:59 crc kubenswrapper[4885]: I0308 19:51:59.006728 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:51:59 crc kubenswrapper[4885]: I0308 19:51:59.245719 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2b5kk" event={"ID":"3b6d8115-d92a-4305-a2d2-8d9874a81390","Type":"ContainerDied","Data":"bfde1ff561455fe922891fc2eec8ca08153ecdae99eec4ff5ab4d833b2c5cacb"} Mar 08 19:51:59 crc kubenswrapper[4885]: I0308 19:51:59.246261 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfde1ff561455fe922891fc2eec8ca08153ecdae99eec4ff5ab4d833b2c5cacb" Mar 08 19:51:59 crc kubenswrapper[4885]: I0308 19:51:59.245754 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2b5kk" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.262043 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549992-pzbpd"] Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.263888 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549992-pzbpd" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.267600 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.267952 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.269658 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.290472 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549992-pzbpd"] Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.317996 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mn4lz-config-hxzwd"] Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.357958 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6w8s\" (UniqueName: \"kubernetes.io/projected/deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec-kube-api-access-j6w8s\") pod \"auto-csr-approver-29549992-pzbpd\" (UID: \"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec\") " pod="openshift-infra/auto-csr-approver-29549992-pzbpd" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.459324 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6w8s\" (UniqueName: \"kubernetes.io/projected/deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec-kube-api-access-j6w8s\") pod \"auto-csr-approver-29549992-pzbpd\" (UID: \"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec\") " pod="openshift-infra/auto-csr-approver-29549992-pzbpd" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.481899 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6w8s\" (UniqueName: \"kubernetes.io/projected/deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec-kube-api-access-j6w8s\") pod \"auto-csr-approver-29549992-pzbpd\" (UID: \"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec\") " pod="openshift-infra/auto-csr-approver-29549992-pzbpd" Mar 08 19:52:00 crc kubenswrapper[4885]: I0308 19:52:00.581632 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549992-pzbpd" Mar 08 19:52:01 crc kubenswrapper[4885]: I0308 19:52:01.038787 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549992-pzbpd"] Mar 08 19:52:01 crc kubenswrapper[4885]: W0308 19:52:01.046835 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeb9e9e0_35d3_4473_ad7b_7b44fd44e8ec.slice/crio-79ffd60319e1bf7dc32ff7241cf4b2beb236af55abf8facaa64ffdcf1e51c9f8 WatchSource:0}: Error finding container 79ffd60319e1bf7dc32ff7241cf4b2beb236af55abf8facaa64ffdcf1e51c9f8: Status 404 returned error can't find the container with id 79ffd60319e1bf7dc32ff7241cf4b2beb236af55abf8facaa64ffdcf1e51c9f8 Mar 08 19:52:01 crc kubenswrapper[4885]: I0308 19:52:01.261371 4885 generic.go:334] "Generic (PLEG): container finished" podID="b08aa6bb-932f-4790-a637-f3667471149c" containerID="d9ea1c70756e397df6785ca6ac5c032d1dcba35d8ce3a74fd9e9a044ec85b1ad" exitCode=0 Mar 08 19:52:01 crc kubenswrapper[4885]: I0308 19:52:01.261456 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz-config-hxzwd" event={"ID":"b08aa6bb-932f-4790-a637-f3667471149c","Type":"ContainerDied","Data":"d9ea1c70756e397df6785ca6ac5c032d1dcba35d8ce3a74fd9e9a044ec85b1ad"} Mar 08 19:52:01 crc kubenswrapper[4885]: I0308 19:52:01.261520 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz-config-hxzwd" event={"ID":"b08aa6bb-932f-4790-a637-f3667471149c","Type":"ContainerStarted","Data":"d88ceadc1f3902a82df865ef4e141e1ea2dcb8aae828eddebf810d0e50045854"} Mar 08 19:52:01 crc kubenswrapper[4885]: I0308 19:52:01.263285 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549992-pzbpd" event={"ID":"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec","Type":"ContainerStarted","Data":"79ffd60319e1bf7dc32ff7241cf4b2beb236af55abf8facaa64ffdcf1e51c9f8"} Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.277557 4885 generic.go:334] "Generic (PLEG): container finished" podID="e4353f36-d8f9-41ff-8062-f874bd53ef12" containerID="2f31946378ed0ae4efcfd55a18f638cc84b0a18f97193739711ef28dac2174f9" exitCode=0 Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.278179 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mn5x8" event={"ID":"e4353f36-d8f9-41ff-8062-f874bd53ef12","Type":"ContainerDied","Data":"2f31946378ed0ae4efcfd55a18f638cc84b0a18f97193739711ef28dac2174f9"} Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.667178 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711194 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-log-ovn\") pod \"b08aa6bb-932f-4790-a637-f3667471149c\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711278 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-additional-scripts\") pod \"b08aa6bb-932f-4790-a637-f3667471149c\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711334 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b08aa6bb-932f-4790-a637-f3667471149c" (UID: "b08aa6bb-932f-4790-a637-f3667471149c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711375 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run-ovn\") pod \"b08aa6bb-932f-4790-a637-f3667471149c\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711437 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j46k\" (UniqueName: \"kubernetes.io/projected/b08aa6bb-932f-4790-a637-f3667471149c-kube-api-access-4j46k\") pod \"b08aa6bb-932f-4790-a637-f3667471149c\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711451 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b08aa6bb-932f-4790-a637-f3667471149c" (UID: "b08aa6bb-932f-4790-a637-f3667471149c"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711544 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-scripts\") pod \"b08aa6bb-932f-4790-a637-f3667471149c\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711600 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run\") pod \"b08aa6bb-932f-4790-a637-f3667471149c\" (UID: \"b08aa6bb-932f-4790-a637-f3667471149c\") " Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.712044 4885 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.712063 4885 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.711994 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run" (OuterVolumeSpecName: "var-run") pod "b08aa6bb-932f-4790-a637-f3667471149c" (UID: "b08aa6bb-932f-4790-a637-f3667471149c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.712262 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b08aa6bb-932f-4790-a637-f3667471149c" (UID: "b08aa6bb-932f-4790-a637-f3667471149c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.712995 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-scripts" (OuterVolumeSpecName: "scripts") pod "b08aa6bb-932f-4790-a637-f3667471149c" (UID: "b08aa6bb-932f-4790-a637-f3667471149c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.717215 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b08aa6bb-932f-4790-a637-f3667471149c-kube-api-access-4j46k" (OuterVolumeSpecName: "kube-api-access-4j46k") pod "b08aa6bb-932f-4790-a637-f3667471149c" (UID: "b08aa6bb-932f-4790-a637-f3667471149c"). InnerVolumeSpecName "kube-api-access-4j46k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.814667 4885 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.815098 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j46k\" (UniqueName: \"kubernetes.io/projected/b08aa6bb-932f-4790-a637-f3667471149c-kube-api-access-4j46k\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.815114 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b08aa6bb-932f-4790-a637-f3667471149c-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:02 crc kubenswrapper[4885]: I0308 19:52:02.815125 4885 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b08aa6bb-932f-4790-a637-f3667471149c-var-run\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.287752 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz-config-hxzwd" event={"ID":"b08aa6bb-932f-4790-a637-f3667471149c","Type":"ContainerDied","Data":"d88ceadc1f3902a82df865ef4e141e1ea2dcb8aae828eddebf810d0e50045854"} Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.287803 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d88ceadc1f3902a82df865ef4e141e1ea2dcb8aae828eddebf810d0e50045854" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.288869 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-hxzwd" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.289611 4885 generic.go:334] "Generic (PLEG): container finished" podID="deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec" containerID="e06b6952cf7f49bba090c40e1251201f80874fce311561d18f2cd3c7169feb77" exitCode=0 Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.289711 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549992-pzbpd" event={"ID":"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec","Type":"ContainerDied","Data":"e06b6952cf7f49bba090c40e1251201f80874fce311561d18f2cd3c7169feb77"} Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.412144 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-mn4lz" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.616018 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.727966 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4353f36-d8f9-41ff-8062-f874bd53ef12-etc-swift\") pod \"e4353f36-d8f9-41ff-8062-f874bd53ef12\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.728103 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-swiftconf\") pod \"e4353f36-d8f9-41ff-8062-f874bd53ef12\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.728127 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-dispersionconf\") pod \"e4353f36-d8f9-41ff-8062-f874bd53ef12\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.728191 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-scripts\") pod \"e4353f36-d8f9-41ff-8062-f874bd53ef12\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.728276 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-combined-ca-bundle\") pod \"e4353f36-d8f9-41ff-8062-f874bd53ef12\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.728297 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-ring-data-devices\") pod \"e4353f36-d8f9-41ff-8062-f874bd53ef12\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.728370 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlg4w\" (UniqueName: \"kubernetes.io/projected/e4353f36-d8f9-41ff-8062-f874bd53ef12-kube-api-access-rlg4w\") pod \"e4353f36-d8f9-41ff-8062-f874bd53ef12\" (UID: \"e4353f36-d8f9-41ff-8062-f874bd53ef12\") " Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.728992 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4353f36-d8f9-41ff-8062-f874bd53ef12-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e4353f36-d8f9-41ff-8062-f874bd53ef12" (UID: "e4353f36-d8f9-41ff-8062-f874bd53ef12"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.729071 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e4353f36-d8f9-41ff-8062-f874bd53ef12" (UID: "e4353f36-d8f9-41ff-8062-f874bd53ef12"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.754495 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4353f36-d8f9-41ff-8062-f874bd53ef12-kube-api-access-rlg4w" (OuterVolumeSpecName: "kube-api-access-rlg4w") pod "e4353f36-d8f9-41ff-8062-f874bd53ef12" (UID: "e4353f36-d8f9-41ff-8062-f874bd53ef12"). InnerVolumeSpecName "kube-api-access-rlg4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.757747 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4353f36-d8f9-41ff-8062-f874bd53ef12" (UID: "e4353f36-d8f9-41ff-8062-f874bd53ef12"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.759375 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e4353f36-d8f9-41ff-8062-f874bd53ef12" (UID: "e4353f36-d8f9-41ff-8062-f874bd53ef12"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.759505 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e4353f36-d8f9-41ff-8062-f874bd53ef12" (UID: "e4353f36-d8f9-41ff-8062-f874bd53ef12"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.768628 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-scripts" (OuterVolumeSpecName: "scripts") pod "e4353f36-d8f9-41ff-8062-f874bd53ef12" (UID: "e4353f36-d8f9-41ff-8062-f874bd53ef12"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.799230 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mn4lz-config-hxzwd"] Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.813638 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mn4lz-config-hxzwd"] Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.830103 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlg4w\" (UniqueName: \"kubernetes.io/projected/e4353f36-d8f9-41ff-8062-f874bd53ef12-kube-api-access-rlg4w\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.830136 4885 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e4353f36-d8f9-41ff-8062-f874bd53ef12-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.830151 4885 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.830162 4885 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.830174 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.830185 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4353f36-d8f9-41ff-8062-f874bd53ef12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.830200 4885 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e4353f36-d8f9-41ff-8062-f874bd53ef12-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.841629 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-mn4lz-config-n4lpc"] Mar 08 19:52:03 crc kubenswrapper[4885]: E0308 19:52:03.842009 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b08aa6bb-932f-4790-a637-f3667471149c" containerName="ovn-config" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.842026 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b08aa6bb-932f-4790-a637-f3667471149c" containerName="ovn-config" Mar 08 19:52:03 crc kubenswrapper[4885]: E0308 19:52:03.842046 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4353f36-d8f9-41ff-8062-f874bd53ef12" containerName="swift-ring-rebalance" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.842055 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4353f36-d8f9-41ff-8062-f874bd53ef12" containerName="swift-ring-rebalance" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.842232 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b08aa6bb-932f-4790-a637-f3667471149c" containerName="ovn-config" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.842251 4885 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e4353f36-d8f9-41ff-8062-f874bd53ef12" containerName="swift-ring-rebalance" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.842738 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.844493 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.856369 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mn4lz-config-n4lpc"] Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.931657 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c76m\" (UniqueName: \"kubernetes.io/projected/795a0dcb-c61f-4d3b-8924-4f04474af216-kube-api-access-8c76m\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.931697 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-scripts\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.931724 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run-ovn\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.931745 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-additional-scripts\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.931843 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-log-ovn\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:03 crc kubenswrapper[4885]: I0308 19:52:03.931879 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033222 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033339 
4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c76m\" (UniqueName: \"kubernetes.io/projected/795a0dcb-c61f-4d3b-8924-4f04474af216-kube-api-access-8c76m\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033366 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-scripts\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033389 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run-ovn\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033410 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-additional-scripts\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033496 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-log-ovn\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033695 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-log-ovn\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033693 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run-ovn\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.033694 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.034801 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-additional-scripts\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.035893 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-scripts\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.059850 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c76m\" (UniqueName: \"kubernetes.io/projected/795a0dcb-c61f-4d3b-8924-4f04474af216-kube-api-access-8c76m\") pod \"ovn-controller-mn4lz-config-n4lpc\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.168732 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.303022 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-mn5x8" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.303187 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-mn5x8" event={"ID":"e4353f36-d8f9-41ff-8062-f874bd53ef12","Type":"ContainerDied","Data":"d73244ef927ebe753f9e2ed770210ef41d35892b5f2baaac3443b9d3c958c66e"} Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.303698 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d73244ef927ebe753f9e2ed770210ef41d35892b5f2baaac3443b9d3c958c66e" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.653490 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-mn4lz-config-n4lpc"] Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.661130 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549992-pzbpd" Mar 08 19:52:04 crc kubenswrapper[4885]: W0308 19:52:04.665261 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod795a0dcb_c61f_4d3b_8924_4f04474af216.slice/crio-40f9295e86b3fd356e79dca056155644c70dee3ed8a1f9b3a11470a4b5e6b2ab WatchSource:0}: Error finding container 40f9295e86b3fd356e79dca056155644c70dee3ed8a1f9b3a11470a4b5e6b2ab: Status 404 returned error can't find the container with id 40f9295e86b3fd356e79dca056155644c70dee3ed8a1f9b3a11470a4b5e6b2ab Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.844997 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6w8s\" (UniqueName: \"kubernetes.io/projected/deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec-kube-api-access-j6w8s\") pod \"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec\" (UID: \"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec\") " Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.855330 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec-kube-api-access-j6w8s" (OuterVolumeSpecName: "kube-api-access-j6w8s") pod "deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec" (UID: "deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec"). InnerVolumeSpecName "kube-api-access-j6w8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:04 crc kubenswrapper[4885]: I0308 19:52:04.947538 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j6w8s\" (UniqueName: \"kubernetes.io/projected/deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec-kube-api-access-j6w8s\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.312756 4885 generic.go:334] "Generic (PLEG): container finished" podID="795a0dcb-c61f-4d3b-8924-4f04474af216" containerID="d9a0dae6743044b0ee2ed3030e29d6fe34bb42caf427155033310333a42d0a5a" exitCode=0 Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.312840 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz-config-n4lpc" event={"ID":"795a0dcb-c61f-4d3b-8924-4f04474af216","Type":"ContainerDied","Data":"d9a0dae6743044b0ee2ed3030e29d6fe34bb42caf427155033310333a42d0a5a"} Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.312872 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz-config-n4lpc" event={"ID":"795a0dcb-c61f-4d3b-8924-4f04474af216","Type":"ContainerStarted","Data":"40f9295e86b3fd356e79dca056155644c70dee3ed8a1f9b3a11470a4b5e6b2ab"} Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.315373 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549992-pzbpd" event={"ID":"deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec","Type":"ContainerDied","Data":"79ffd60319e1bf7dc32ff7241cf4b2beb236af55abf8facaa64ffdcf1e51c9f8"} Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.315406 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79ffd60319e1bf7dc32ff7241cf4b2beb236af55abf8facaa64ffdcf1e51c9f8" Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.315469 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549992-pzbpd" Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.388798 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b08aa6bb-932f-4790-a637-f3667471149c" path="/var/lib/kubelet/pods/b08aa6bb-932f-4790-a637-f3667471149c/volumes" Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.742474 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549986-mpsgk"] Mar 08 19:52:05 crc kubenswrapper[4885]: I0308 19:52:05.757585 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549986-mpsgk"] Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.755146 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.888695 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-scripts\") pod \"795a0dcb-c61f-4d3b-8924-4f04474af216\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.889189 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run-ovn\") pod \"795a0dcb-c61f-4d3b-8924-4f04474af216\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.889342 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "795a0dcb-c61f-4d3b-8924-4f04474af216" (UID: "795a0dcb-c61f-4d3b-8924-4f04474af216"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.889529 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-additional-scripts\") pod \"795a0dcb-c61f-4d3b-8924-4f04474af216\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.889734 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-log-ovn\") pod \"795a0dcb-c61f-4d3b-8924-4f04474af216\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.889875 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c76m\" (UniqueName: \"kubernetes.io/projected/795a0dcb-c61f-4d3b-8924-4f04474af216-kube-api-access-8c76m\") pod \"795a0dcb-c61f-4d3b-8924-4f04474af216\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.889886 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "795a0dcb-c61f-4d3b-8924-4f04474af216" (UID: "795a0dcb-c61f-4d3b-8924-4f04474af216"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.890060 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run\") pod \"795a0dcb-c61f-4d3b-8924-4f04474af216\" (UID: \"795a0dcb-c61f-4d3b-8924-4f04474af216\") " Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.890268 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "795a0dcb-c61f-4d3b-8924-4f04474af216" (UID: "795a0dcb-c61f-4d3b-8924-4f04474af216"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.890397 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run" (OuterVolumeSpecName: "var-run") pod "795a0dcb-c61f-4d3b-8924-4f04474af216" (UID: "795a0dcb-c61f-4d3b-8924-4f04474af216"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.890636 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-scripts" (OuterVolumeSpecName: "scripts") pod "795a0dcb-c61f-4d3b-8924-4f04474af216" (UID: "795a0dcb-c61f-4d3b-8924-4f04474af216"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.891325 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.891380 4885 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.891410 4885 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/795a0dcb-c61f-4d3b-8924-4f04474af216-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.891440 4885 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.891464 4885 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/795a0dcb-c61f-4d3b-8924-4f04474af216-var-run\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.896515 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/795a0dcb-c61f-4d3b-8924-4f04474af216-kube-api-access-8c76m" (OuterVolumeSpecName: "kube-api-access-8c76m") pod "795a0dcb-c61f-4d3b-8924-4f04474af216" (UID: "795a0dcb-c61f-4d3b-8924-4f04474af216"). InnerVolumeSpecName "kube-api-access-8c76m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:06 crc kubenswrapper[4885]: I0308 19:52:06.993193 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c76m\" (UniqueName: \"kubernetes.io/projected/795a0dcb-c61f-4d3b-8924-4f04474af216-kube-api-access-8c76m\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:07 crc kubenswrapper[4885]: I0308 19:52:07.339216 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz-config-n4lpc" event={"ID":"795a0dcb-c61f-4d3b-8924-4f04474af216","Type":"ContainerDied","Data":"40f9295e86b3fd356e79dca056155644c70dee3ed8a1f9b3a11470a4b5e6b2ab"} Mar 08 19:52:07 crc kubenswrapper[4885]: I0308 19:52:07.339272 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40f9295e86b3fd356e79dca056155644c70dee3ed8a1f9b3a11470a4b5e6b2ab" Mar 08 19:52:07 crc kubenswrapper[4885]: I0308 19:52:07.339345 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz-config-n4lpc" Mar 08 19:52:07 crc kubenswrapper[4885]: I0308 19:52:07.382960 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa96ded3-40b5-4e54-9f54-72f64edfb672" path="/var/lib/kubelet/pods/fa96ded3-40b5-4e54-9f54-72f64edfb672/volumes" Mar 08 19:52:07 crc kubenswrapper[4885]: I0308 19:52:07.881196 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mn4lz-config-n4lpc"] Mar 08 19:52:07 crc kubenswrapper[4885]: I0308 19:52:07.889830 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mn4lz-config-n4lpc"] Mar 08 19:52:09 crc kubenswrapper[4885]: I0308 19:52:09.392393 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="795a0dcb-c61f-4d3b-8924-4f04474af216" path="/var/lib/kubelet/pods/795a0dcb-c61f-4d3b-8924-4f04474af216/volumes" Mar 08 19:52:11 crc kubenswrapper[4885]: I0308 19:52:11.383337 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pq8mq" event={"ID":"618b5189-8b29-473f-b59c-e911fca71041","Type":"ContainerStarted","Data":"88f0fd52df3aa60bc754c49bef747bbf48ae9a2eeb839f1af08e43921bc83090"} Mar 08 19:52:11 crc kubenswrapper[4885]: I0308 19:52:11.410464 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-pq8mq" podStartSLOduration=2.828529803 podStartE2EDuration="34.410351643s" podCreationTimestamp="2026-03-08 19:51:37 +0000 UTC" firstStartedPulling="2026-03-08 19:51:38.249720449 +0000 UTC m=+1199.645774482" lastFinishedPulling="2026-03-08 19:52:09.831542269 +0000 UTC m=+1231.227596322" observedRunningTime="2026-03-08 19:52:11.402229735 +0000 UTC m=+1232.798283768" watchObservedRunningTime="2026-03-08 19:52:11.410351643 +0000 UTC m=+1232.806405666" Mar 08 19:52:12 crc kubenswrapper[4885]: I0308 19:52:12.181774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:52:12 crc kubenswrapper[4885]: I0308 19:52:12.192595 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"swift-storage-0\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " pod="openstack/swift-storage-0" Mar 08 19:52:12 crc 
kubenswrapper[4885]: I0308 19:52:12.338351 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 08 19:52:12 crc kubenswrapper[4885]: I0308 19:52:12.782254 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 08 19:52:12 crc kubenswrapper[4885]: W0308 19:52:12.785172 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa276a05_ab6a_4aa1_9a9f_a990dc1513bd.slice/crio-b1e0ba5e2e1e5dbd0a9f1e5b591c441cf6aacd9c13c59e7ce49d3304923b9158 WatchSource:0}: Error finding container b1e0ba5e2e1e5dbd0a9f1e5b591c441cf6aacd9c13c59e7ce49d3304923b9158: Status 404 returned error can't find the container with id b1e0ba5e2e1e5dbd0a9f1e5b591c441cf6aacd9c13c59e7ce49d3304923b9158 Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.009304 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.317489 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-8wdm8"] Mar 08 19:52:13 crc kubenswrapper[4885]: E0308 19:52:13.317826 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec" containerName="oc" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.317844 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec" containerName="oc" Mar 08 19:52:13 crc kubenswrapper[4885]: E0308 19:52:13.317861 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="795a0dcb-c61f-4d3b-8924-4f04474af216" containerName="ovn-config" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.317868 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="795a0dcb-c61f-4d3b-8924-4f04474af216" containerName="ovn-config" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.318050 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="795a0dcb-c61f-4d3b-8924-4f04474af216" containerName="ovn-config" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.318070 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec" containerName="oc" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.318547 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8wdm8" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.327396 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8wdm8"] Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.432730 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f33b-account-create-update-hcg7j"] Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.438455 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f33b-account-create-update-hcg7j" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.440011 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"b1e0ba5e2e1e5dbd0a9f1e5b591c441cf6aacd9c13c59e7ce49d3304923b9158"} Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.440845 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.456884 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f33b-account-create-update-hcg7j"] Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.509852 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-operator-scripts\") pod \"cinder-db-create-8wdm8\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " pod="openstack/cinder-db-create-8wdm8" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.509910 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nzxm\" (UniqueName: \"kubernetes.io/projected/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-kube-api-access-9nzxm\") pod \"cinder-db-create-8wdm8\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " pod="openstack/cinder-db-create-8wdm8" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.611297 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4swl\" (UniqueName: \"kubernetes.io/projected/57032abe-6c4f-4711-9f48-5733d6a29ec3-kube-api-access-k4swl\") pod \"cinder-f33b-account-create-update-hcg7j\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " pod="openstack/cinder-f33b-account-create-update-hcg7j" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.611988 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57032abe-6c4f-4711-9f48-5733d6a29ec3-operator-scripts\") pod \"cinder-f33b-account-create-update-hcg7j\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " pod="openstack/cinder-f33b-account-create-update-hcg7j" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.612164 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-operator-scripts\") pod \"cinder-db-create-8wdm8\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " pod="openstack/cinder-db-create-8wdm8" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.612314 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nzxm\" (UniqueName: \"kubernetes.io/projected/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-kube-api-access-9nzxm\") pod \"cinder-db-create-8wdm8\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " pod="openstack/cinder-db-create-8wdm8" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.613384 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-operator-scripts\") pod \"cinder-db-create-8wdm8\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " pod="openstack/cinder-db-create-8wdm8" 
Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.618979 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-td7dc"] Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.620096 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-td7dc" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.641376 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-86ea-account-create-update-tdtcq"] Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.642430 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-86ea-account-create-update-tdtcq" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.647213 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.657394 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-td7dc"] Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.668550 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nzxm\" (UniqueName: \"kubernetes.io/projected/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-kube-api-access-9nzxm\") pod \"cinder-db-create-8wdm8\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " pod="openstack/cinder-db-create-8wdm8" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.668572 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-86ea-account-create-update-tdtcq"] Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.673653 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8wdm8" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.713979 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4swl\" (UniqueName: \"kubernetes.io/projected/57032abe-6c4f-4711-9f48-5733d6a29ec3-kube-api-access-k4swl\") pod \"cinder-f33b-account-create-update-hcg7j\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " pod="openstack/cinder-f33b-account-create-update-hcg7j" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.714227 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57032abe-6c4f-4711-9f48-5733d6a29ec3-operator-scripts\") pod \"cinder-f33b-account-create-update-hcg7j\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " pod="openstack/cinder-f33b-account-create-update-hcg7j" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.715039 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57032abe-6c4f-4711-9f48-5733d6a29ec3-operator-scripts\") pod \"cinder-f33b-account-create-update-hcg7j\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " pod="openstack/cinder-f33b-account-create-update-hcg7j" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.715191 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-zfp8t"] Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.716399 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-zfp8t" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.725882 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.726274 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.726470 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.726697 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nj2kw" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.726948 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-zfp8t"] Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.740131 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-wphzx"] Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.741571 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wphzx" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.742489 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4swl\" (UniqueName: \"kubernetes.io/projected/57032abe-6c4f-4711-9f48-5733d6a29ec3-kube-api-access-k4swl\") pod \"cinder-f33b-account-create-update-hcg7j\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " pod="openstack/cinder-f33b-account-create-update-hcg7j" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.749485 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wphzx"] Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.771198 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f33b-account-create-update-hcg7j" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.815802 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-operator-scripts\") pod \"barbican-86ea-account-create-update-tdtcq\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " pod="openstack/barbican-86ea-account-create-update-tdtcq" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.815862 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-combined-ca-bundle\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.815887 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxqms\" (UniqueName: \"kubernetes.io/projected/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-kube-api-access-xxqms\") pod \"barbican-86ea-account-create-update-tdtcq\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " pod="openstack/barbican-86ea-account-create-update-tdtcq" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.815952 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-operator-scripts\") pod \"barbican-db-create-td7dc\" (UID: \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " pod="openstack/barbican-db-create-td7dc" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.816147 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-config-data\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.816203 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp454\" (UniqueName: \"kubernetes.io/projected/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-kube-api-access-hp454\") pod \"barbican-db-create-td7dc\" (UID: \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " pod="openstack/barbican-db-create-td7dc" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.816230 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvwxr\" (UniqueName: \"kubernetes.io/projected/321f89cf-ed1f-4f10-a198-e55c23171363-kube-api-access-xvwxr\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.821090 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.850948 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-031a-account-create-update-qvdpr"] Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.852063 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-031a-account-create-update-qvdpr" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.859129 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.869264 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-031a-account-create-update-qvdpr"] Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.917590 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-operator-scripts\") pod \"neutron-db-create-wphzx\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " pod="openstack/neutron-db-create-wphzx" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.917658 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-operator-scripts\") pod \"barbican-86ea-account-create-update-tdtcq\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " pod="openstack/barbican-86ea-account-create-update-tdtcq" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.917696 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-combined-ca-bundle\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.917718 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxqms\" (UniqueName: \"kubernetes.io/projected/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-kube-api-access-xxqms\") pod \"barbican-86ea-account-create-update-tdtcq\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " pod="openstack/barbican-86ea-account-create-update-tdtcq" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.917767 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-operator-scripts\") pod \"barbican-db-create-td7dc\" (UID: \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " pod="openstack/barbican-db-create-td7dc" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.917820 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78wf8\" (UniqueName: \"kubernetes.io/projected/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-kube-api-access-78wf8\") pod \"neutron-db-create-wphzx\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " pod="openstack/neutron-db-create-wphzx" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.917844 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-config-data\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.918038 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp454\" (UniqueName: \"kubernetes.io/projected/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-kube-api-access-hp454\") pod \"barbican-db-create-td7dc\" (UID: 
\"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " pod="openstack/barbican-db-create-td7dc" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.918069 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvwxr\" (UniqueName: \"kubernetes.io/projected/321f89cf-ed1f-4f10-a198-e55c23171363-kube-api-access-xvwxr\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.919543 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-operator-scripts\") pod \"barbican-86ea-account-create-update-tdtcq\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " pod="openstack/barbican-86ea-account-create-update-tdtcq" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.922944 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-operator-scripts\") pod \"barbican-db-create-td7dc\" (UID: \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " pod="openstack/barbican-db-create-td7dc" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.934631 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-config-data\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.935191 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvwxr\" (UniqueName: \"kubernetes.io/projected/321f89cf-ed1f-4f10-a198-e55c23171363-kube-api-access-xvwxr\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.941195 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-combined-ca-bundle\") pod \"keystone-db-sync-zfp8t\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " pod="openstack/keystone-db-sync-zfp8t" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.949277 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxqms\" (UniqueName: \"kubernetes.io/projected/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-kube-api-access-xxqms\") pod \"barbican-86ea-account-create-update-tdtcq\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " pod="openstack/barbican-86ea-account-create-update-tdtcq" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.956792 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp454\" (UniqueName: \"kubernetes.io/projected/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-kube-api-access-hp454\") pod \"barbican-db-create-td7dc\" (UID: \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " pod="openstack/barbican-db-create-td7dc" Mar 08 19:52:13 crc kubenswrapper[4885]: I0308 19:52:13.993077 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-86ea-account-create-update-tdtcq" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.019766 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsfwr\" (UniqueName: \"kubernetes.io/projected/85d25daf-f279-4be1-be4a-75e05e47923c-kube-api-access-nsfwr\") pod \"neutron-031a-account-create-update-qvdpr\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " pod="openstack/neutron-031a-account-create-update-qvdpr" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.019900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-operator-scripts\") pod \"neutron-db-create-wphzx\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " pod="openstack/neutron-db-create-wphzx" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.020011 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78wf8\" (UniqueName: \"kubernetes.io/projected/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-kube-api-access-78wf8\") pod \"neutron-db-create-wphzx\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " pod="openstack/neutron-db-create-wphzx" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.020064 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85d25daf-f279-4be1-be4a-75e05e47923c-operator-scripts\") pod \"neutron-031a-account-create-update-qvdpr\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " pod="openstack/neutron-031a-account-create-update-qvdpr" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.021649 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-operator-scripts\") pod \"neutron-db-create-wphzx\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " pod="openstack/neutron-db-create-wphzx" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.039405 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-zfp8t" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.042602 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78wf8\" (UniqueName: \"kubernetes.io/projected/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-kube-api-access-78wf8\") pod \"neutron-db-create-wphzx\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " pod="openstack/neutron-db-create-wphzx" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.121215 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85d25daf-f279-4be1-be4a-75e05e47923c-operator-scripts\") pod \"neutron-031a-account-create-update-qvdpr\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " pod="openstack/neutron-031a-account-create-update-qvdpr" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.121260 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsfwr\" (UniqueName: \"kubernetes.io/projected/85d25daf-f279-4be1-be4a-75e05e47923c-kube-api-access-nsfwr\") pod \"neutron-031a-account-create-update-qvdpr\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " pod="openstack/neutron-031a-account-create-update-qvdpr" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.122219 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85d25daf-f279-4be1-be4a-75e05e47923c-operator-scripts\") pod \"neutron-031a-account-create-update-qvdpr\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " pod="openstack/neutron-031a-account-create-update-qvdpr" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.144056 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsfwr\" (UniqueName: \"kubernetes.io/projected/85d25daf-f279-4be1-be4a-75e05e47923c-kube-api-access-nsfwr\") pod \"neutron-031a-account-create-update-qvdpr\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " pod="openstack/neutron-031a-account-create-update-qvdpr" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.145817 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wphzx" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.175551 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-031a-account-create-update-qvdpr" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.234263 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-td7dc" Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.314164 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-8wdm8"] Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.439879 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-zfp8t"] Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.448644 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f33b-account-create-update-hcg7j"] Mar 08 19:52:14 crc kubenswrapper[4885]: I0308 19:52:14.520668 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-86ea-account-create-update-tdtcq"] Mar 08 19:52:14 crc kubenswrapper[4885]: W0308 19:52:14.683831 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57032abe_6c4f_4711_9f48_5733d6a29ec3.slice/crio-90c76edc2afdff8bb379a82502dcb919d7db08a99680ff9ba206f023093b4911 WatchSource:0}: Error finding container 90c76edc2afdff8bb379a82502dcb919d7db08a99680ff9ba206f023093b4911: Status 404 returned error can't find the container with id 90c76edc2afdff8bb379a82502dcb919d7db08a99680ff9ba206f023093b4911 Mar 08 19:52:14 crc kubenswrapper[4885]: W0308 19:52:14.686075 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9cdd234_0e3f_4bd4_9382_1f4ca59aeb44.slice/crio-5215dfd2a231215dccf020d68fc0146208c51a8ff3bacc9c383ce121222eb15e WatchSource:0}: Error finding container 5215dfd2a231215dccf020d68fc0146208c51a8ff3bacc9c383ce121222eb15e: Status 404 returned error can't find the container with id 5215dfd2a231215dccf020d68fc0146208c51a8ff3bacc9c383ce121222eb15e Mar 08 19:52:14 crc kubenswrapper[4885]: W0308 19:52:14.691282 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod915cd482_d3dc_42c1_96cc_0fcc18bbaff2.slice/crio-dbf93a3d4a78525d3b49a3717393773317c846bb5ad656c71478a9b4b356dead WatchSource:0}: Error finding container dbf93a3d4a78525d3b49a3717393773317c846bb5ad656c71478a9b4b356dead: Status 404 returned error can't find the container with id dbf93a3d4a78525d3b49a3717393773317c846bb5ad656c71478a9b4b356dead Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.041550 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-031a-account-create-update-qvdpr"] Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.358306 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-td7dc"] Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.364278 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-wphzx"] Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.470794 4885 generic.go:334] "Generic (PLEG): container finished" podID="a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44" containerID="a1a66f9e3c39e6448e08179a06c354d7f53b5cb971ddf727953fa9e3689c988d" exitCode=0 Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.470846 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8wdm8" event={"ID":"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44","Type":"ContainerDied","Data":"a1a66f9e3c39e6448e08179a06c354d7f53b5cb971ddf727953fa9e3689c988d"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.470869 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-create-8wdm8" event={"ID":"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44","Type":"ContainerStarted","Data":"5215dfd2a231215dccf020d68fc0146208c51a8ff3bacc9c383ce121222eb15e"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.474465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zfp8t" event={"ID":"321f89cf-ed1f-4f10-a198-e55c23171363","Type":"ContainerStarted","Data":"ba35ec27137480c28a7924c7547829e67cb3c39063decc95b516bf7fa2fd263c"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.475861 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86ea-account-create-update-tdtcq" event={"ID":"915cd482-d3dc-42c1-96cc-0fcc18bbaff2","Type":"ContainerStarted","Data":"41a09112d08bd7901521db3ad7a70721bb9ad48344056086b5a05f6b55d65d91"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.475885 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86ea-account-create-update-tdtcq" event={"ID":"915cd482-d3dc-42c1-96cc-0fcc18bbaff2","Type":"ContainerStarted","Data":"dbf93a3d4a78525d3b49a3717393773317c846bb5ad656c71478a9b4b356dead"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.482604 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-031a-account-create-update-qvdpr" event={"ID":"85d25daf-f279-4be1-be4a-75e05e47923c","Type":"ContainerStarted","Data":"33bf72d09b758f81e7db370aead1484095f9a13481bdaed0653c1631df7c254b"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.482645 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-031a-account-create-update-qvdpr" event={"ID":"85d25daf-f279-4be1-be4a-75e05e47923c","Type":"ContainerStarted","Data":"7f5dea3799ddcfc32ea6b6de68fa84ae194141eb0c42a4c52d68c935699abcf2"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.492100 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wphzx" event={"ID":"4a7dd20b-387a-4061-ab5a-a53ee6a240ef","Type":"ContainerStarted","Data":"b7796e7bfe61bbaa210a6875095f7c4c89b048c5376a3868a20df0f89e8e23e7"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.499800 4885 generic.go:334] "Generic (PLEG): container finished" podID="57032abe-6c4f-4711-9f48-5733d6a29ec3" containerID="17d561daa3a3a15f18cf22c1e06443b53b3323129a45f06fd40855d4bb9fbf6a" exitCode=0 Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.500014 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f33b-account-create-update-hcg7j" event={"ID":"57032abe-6c4f-4711-9f48-5733d6a29ec3","Type":"ContainerDied","Data":"17d561daa3a3a15f18cf22c1e06443b53b3323129a45f06fd40855d4bb9fbf6a"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.500136 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f33b-account-create-update-hcg7j" event={"ID":"57032abe-6c4f-4711-9f48-5733d6a29ec3","Type":"ContainerStarted","Data":"90c76edc2afdff8bb379a82502dcb919d7db08a99680ff9ba206f023093b4911"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.506579 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-td7dc" event={"ID":"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb","Type":"ContainerStarted","Data":"712feccfa7b692c5b0e68df6a7b4fe113fe7bc69a17406d91c0444bea2b37a87"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.524394 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.524435 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542"} Mar 08 19:52:15 crc kubenswrapper[4885]: I0308 19:52:15.534028 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-031a-account-create-update-qvdpr" podStartSLOduration=2.534007511 podStartE2EDuration="2.534007511s" podCreationTimestamp="2026-03-08 19:52:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:15.519297608 +0000 UTC m=+1236.915351631" watchObservedRunningTime="2026-03-08 19:52:15.534007511 +0000 UTC m=+1236.930061534" Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.534521 4885 generic.go:334] "Generic (PLEG): container finished" podID="915cd482-d3dc-42c1-96cc-0fcc18bbaff2" containerID="41a09112d08bd7901521db3ad7a70721bb9ad48344056086b5a05f6b55d65d91" exitCode=0 Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.535018 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86ea-account-create-update-tdtcq" event={"ID":"915cd482-d3dc-42c1-96cc-0fcc18bbaff2","Type":"ContainerDied","Data":"41a09112d08bd7901521db3ad7a70721bb9ad48344056086b5a05f6b55d65d91"} Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.536460 4885 generic.go:334] "Generic (PLEG): container finished" podID="85d25daf-f279-4be1-be4a-75e05e47923c" containerID="33bf72d09b758f81e7db370aead1484095f9a13481bdaed0653c1631df7c254b" exitCode=0 Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.536523 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-031a-account-create-update-qvdpr" event={"ID":"85d25daf-f279-4be1-be4a-75e05e47923c","Type":"ContainerDied","Data":"33bf72d09b758f81e7db370aead1484095f9a13481bdaed0653c1631df7c254b"} Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.545440 4885 generic.go:334] "Generic (PLEG): container finished" podID="4a7dd20b-387a-4061-ab5a-a53ee6a240ef" containerID="fd8d322616b5f6a1a40bb29dccbdca346cc9726c81d97364f599af639e9e8eb7" exitCode=0 Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.545612 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wphzx" event={"ID":"4a7dd20b-387a-4061-ab5a-a53ee6a240ef","Type":"ContainerDied","Data":"fd8d322616b5f6a1a40bb29dccbdca346cc9726c81d97364f599af639e9e8eb7"} Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.553794 4885 generic.go:334] "Generic (PLEG): container finished" podID="dbf04089-b7ff-4c4d-acad-f41d45ac6bfb" containerID="6c5fd0c87fdc37dc689d9957740eb80226bab6f4a5010aca5f3d0a66ab0c82c3" exitCode=0 Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.553858 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-td7dc" event={"ID":"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb","Type":"ContainerDied","Data":"6c5fd0c87fdc37dc689d9957740eb80226bab6f4a5010aca5f3d0a66ab0c82c3"} Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.558425 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653"} Mar 08 19:52:16 crc kubenswrapper[4885]: I0308 19:52:16.558453 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6"} Mar 08 19:52:17 crc kubenswrapper[4885]: I0308 19:52:17.576491 4885 generic.go:334] "Generic (PLEG): container finished" podID="618b5189-8b29-473f-b59c-e911fca71041" containerID="88f0fd52df3aa60bc754c49bef747bbf48ae9a2eeb839f1af08e43921bc83090" exitCode=0 Mar 08 19:52:17 crc kubenswrapper[4885]: I0308 19:52:17.576526 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pq8mq" event={"ID":"618b5189-8b29-473f-b59c-e911fca71041","Type":"ContainerDied","Data":"88f0fd52df3aa60bc754c49bef747bbf48ae9a2eeb839f1af08e43921bc83090"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.549035 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-031a-account-create-update-qvdpr" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.556988 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-86ea-account-create-update-tdtcq" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.565426 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f33b-account-create-update-hcg7j" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.579373 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-8wdm8" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.600097 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wphzx" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.605385 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pq8mq" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.618120 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-031a-account-create-update-qvdpr" event={"ID":"85d25daf-f279-4be1-be4a-75e05e47923c","Type":"ContainerDied","Data":"7f5dea3799ddcfc32ea6b6de68fa84ae194141eb0c42a4c52d68c935699abcf2"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.618164 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f5dea3799ddcfc32ea6b6de68fa84ae194141eb0c42a4c52d68c935699abcf2" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.618237 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-031a-account-create-update-qvdpr" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.620379 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-td7dc" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.620547 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-wphzx" event={"ID":"4a7dd20b-387a-4061-ab5a-a53ee6a240ef","Type":"ContainerDied","Data":"b7796e7bfe61bbaa210a6875095f7c4c89b048c5376a3868a20df0f89e8e23e7"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.620576 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7796e7bfe61bbaa210a6875095f7c4c89b048c5376a3868a20df0f89e8e23e7" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.620612 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-wphzx" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.622458 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f33b-account-create-update-hcg7j" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.622458 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f33b-account-create-update-hcg7j" event={"ID":"57032abe-6c4f-4711-9f48-5733d6a29ec3","Type":"ContainerDied","Data":"90c76edc2afdff8bb379a82502dcb919d7db08a99680ff9ba206f023093b4911"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.622561 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90c76edc2afdff8bb379a82502dcb919d7db08a99680ff9ba206f023093b4911" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.624183 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-td7dc" event={"ID":"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb","Type":"ContainerDied","Data":"712feccfa7b692c5b0e68df6a7b4fe113fe7bc69a17406d91c0444bea2b37a87"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.624209 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="712feccfa7b692c5b0e68df6a7b4fe113fe7bc69a17406d91c0444bea2b37a87" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.624241 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-td7dc" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.628388 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsfwr\" (UniqueName: \"kubernetes.io/projected/85d25daf-f279-4be1-be4a-75e05e47923c-kube-api-access-nsfwr\") pod \"85d25daf-f279-4be1-be4a-75e05e47923c\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.628547 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85d25daf-f279-4be1-be4a-75e05e47923c-operator-scripts\") pod \"85d25daf-f279-4be1-be4a-75e05e47923c\" (UID: \"85d25daf-f279-4be1-be4a-75e05e47923c\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.629193 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85d25daf-f279-4be1-be4a-75e05e47923c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "85d25daf-f279-4be1-be4a-75e05e47923c" (UID: "85d25daf-f279-4be1-be4a-75e05e47923c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.630183 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-8wdm8" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.630387 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-8wdm8" event={"ID":"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44","Type":"ContainerDied","Data":"5215dfd2a231215dccf020d68fc0146208c51a8ff3bacc9c383ce121222eb15e"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.630519 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5215dfd2a231215dccf020d68fc0146208c51a8ff3bacc9c383ce121222eb15e" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.637032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pq8mq" event={"ID":"618b5189-8b29-473f-b59c-e911fca71041","Type":"ContainerDied","Data":"6a7111739f9460209a507bc904ec5f5cda31cf0c98c09b8cae21f39ac42b3d39"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.637075 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a7111739f9460209a507bc904ec5f5cda31cf0c98c09b8cae21f39ac42b3d39" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.637141 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pq8mq" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.643333 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85d25daf-f279-4be1-be4a-75e05e47923c-kube-api-access-nsfwr" (OuterVolumeSpecName: "kube-api-access-nsfwr") pod "85d25daf-f279-4be1-be4a-75e05e47923c" (UID: "85d25daf-f279-4be1-be4a-75e05e47923c"). InnerVolumeSpecName "kube-api-access-nsfwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.647302 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86ea-account-create-update-tdtcq" event={"ID":"915cd482-d3dc-42c1-96cc-0fcc18bbaff2","Type":"ContainerDied","Data":"dbf93a3d4a78525d3b49a3717393773317c846bb5ad656c71478a9b4b356dead"} Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.647345 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbf93a3d4a78525d3b49a3717393773317c846bb5ad656c71478a9b4b356dead" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.647459 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-86ea-account-create-update-tdtcq" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.729680 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nzxm\" (UniqueName: \"kubernetes.io/projected/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-kube-api-access-9nzxm\") pod \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.729724 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-db-sync-config-data\") pod \"618b5189-8b29-473f-b59c-e911fca71041\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.729742 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp454\" (UniqueName: \"kubernetes.io/projected/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-kube-api-access-hp454\") pod \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\" (UID: \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.729798 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-config-data\") pod \"618b5189-8b29-473f-b59c-e911fca71041\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.729837 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-operator-scripts\") pod \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\" (UID: \"dbf04089-b7ff-4c4d-acad-f41d45ac6bfb\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.729867 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-operator-scripts\") pod \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.730468 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dbf04089-b7ff-4c4d-acad-f41d45ac6bfb" (UID: "dbf04089-b7ff-4c4d-acad-f41d45ac6bfb"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.730541 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-operator-scripts\") pod \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\" (UID: \"a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.730573 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxqms\" (UniqueName: \"kubernetes.io/projected/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-kube-api-access-xxqms\") pod \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\" (UID: \"915cd482-d3dc-42c1-96cc-0fcc18bbaff2\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.730853 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "915cd482-d3dc-42c1-96cc-0fcc18bbaff2" (UID: "915cd482-d3dc-42c1-96cc-0fcc18bbaff2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.730872 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44" (UID: "a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.730970 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78wf8\" (UniqueName: \"kubernetes.io/projected/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-kube-api-access-78wf8\") pod \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.730993 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4swl\" (UniqueName: \"kubernetes.io/projected/57032abe-6c4f-4711-9f48-5733d6a29ec3-kube-api-access-k4swl\") pod \"57032abe-6c4f-4711-9f48-5733d6a29ec3\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.731018 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57032abe-6c4f-4711-9f48-5733d6a29ec3-operator-scripts\") pod \"57032abe-6c4f-4711-9f48-5733d6a29ec3\" (UID: \"57032abe-6c4f-4711-9f48-5733d6a29ec3\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.731546 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57032abe-6c4f-4711-9f48-5733d6a29ec3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "57032abe-6c4f-4711-9f48-5733d6a29ec3" (UID: "57032abe-6c4f-4711-9f48-5733d6a29ec3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.731566 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-operator-scripts\") pod \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\" (UID: \"4a7dd20b-387a-4061-ab5a-a53ee6a240ef\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.731634 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-combined-ca-bundle\") pod \"618b5189-8b29-473f-b59c-e911fca71041\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.731658 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfl6p\" (UniqueName: \"kubernetes.io/projected/618b5189-8b29-473f-b59c-e911fca71041-kube-api-access-sfl6p\") pod \"618b5189-8b29-473f-b59c-e911fca71041\" (UID: \"618b5189-8b29-473f-b59c-e911fca71041\") " Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.731834 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a7dd20b-387a-4061-ab5a-a53ee6a240ef" (UID: "4a7dd20b-387a-4061-ab5a-a53ee6a240ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.735335 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-kube-api-access-9nzxm" (OuterVolumeSpecName: "kube-api-access-9nzxm") pod "a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44" (UID: "a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44"). InnerVolumeSpecName "kube-api-access-9nzxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.735532 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-kube-api-access-xxqms" (OuterVolumeSpecName: "kube-api-access-xxqms") pod "915cd482-d3dc-42c1-96cc-0fcc18bbaff2" (UID: "915cd482-d3dc-42c1-96cc-0fcc18bbaff2"). InnerVolumeSpecName "kube-api-access-xxqms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.735741 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/618b5189-8b29-473f-b59c-e911fca71041-kube-api-access-sfl6p" (OuterVolumeSpecName: "kube-api-access-sfl6p") pod "618b5189-8b29-473f-b59c-e911fca71041" (UID: "618b5189-8b29-473f-b59c-e911fca71041"). InnerVolumeSpecName "kube-api-access-sfl6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.736199 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-kube-api-access-78wf8" (OuterVolumeSpecName: "kube-api-access-78wf8") pod "4a7dd20b-387a-4061-ab5a-a53ee6a240ef" (UID: "4a7dd20b-387a-4061-ab5a-a53ee6a240ef"). InnerVolumeSpecName "kube-api-access-78wf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737090 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxqms\" (UniqueName: \"kubernetes.io/projected/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-kube-api-access-xxqms\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737112 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78wf8\" (UniqueName: \"kubernetes.io/projected/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-kube-api-access-78wf8\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737122 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/57032abe-6c4f-4711-9f48-5733d6a29ec3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737130 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsfwr\" (UniqueName: \"kubernetes.io/projected/85d25daf-f279-4be1-be4a-75e05e47923c-kube-api-access-nsfwr\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737139 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a7dd20b-387a-4061-ab5a-a53ee6a240ef-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737148 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfl6p\" (UniqueName: \"kubernetes.io/projected/618b5189-8b29-473f-b59c-e911fca71041-kube-api-access-sfl6p\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737157 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85d25daf-f279-4be1-be4a-75e05e47923c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737167 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nzxm\" (UniqueName: \"kubernetes.io/projected/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-kube-api-access-9nzxm\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737176 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737184 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/915cd482-d3dc-42c1-96cc-0fcc18bbaff2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737193 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.737340 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "618b5189-8b29-473f-b59c-e911fca71041" (UID: "618b5189-8b29-473f-b59c-e911fca71041"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.741717 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-kube-api-access-hp454" (OuterVolumeSpecName: "kube-api-access-hp454") pod "dbf04089-b7ff-4c4d-acad-f41d45ac6bfb" (UID: "dbf04089-b7ff-4c4d-acad-f41d45ac6bfb"). InnerVolumeSpecName "kube-api-access-hp454". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.756576 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57032abe-6c4f-4711-9f48-5733d6a29ec3-kube-api-access-k4swl" (OuterVolumeSpecName: "kube-api-access-k4swl") pod "57032abe-6c4f-4711-9f48-5733d6a29ec3" (UID: "57032abe-6c4f-4711-9f48-5733d6a29ec3"). InnerVolumeSpecName "kube-api-access-k4swl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.771008 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "618b5189-8b29-473f-b59c-e911fca71041" (UID: "618b5189-8b29-473f-b59c-e911fca71041"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.785629 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-config-data" (OuterVolumeSpecName: "config-data") pod "618b5189-8b29-473f-b59c-e911fca71041" (UID: "618b5189-8b29-473f-b59c-e911fca71041"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.838438 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4swl\" (UniqueName: \"kubernetes.io/projected/57032abe-6c4f-4711-9f48-5733d6a29ec3-kube-api-access-k4swl\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.838484 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.838494 4885 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.838503 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp454\" (UniqueName: \"kubernetes.io/projected/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb-kube-api-access-hp454\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:19 crc kubenswrapper[4885]: I0308 19:52:19.838512 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/618b5189-8b29-473f-b59c-e911fca71041-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.074437 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-nxtzx"] Mar 08 19:52:21 crc kubenswrapper[4885]: E0308 19:52:21.074968 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf04089-b7ff-4c4d-acad-f41d45ac6bfb" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.074979 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf04089-b7ff-4c4d-acad-f41d45ac6bfb" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: E0308 19:52:21.074991 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57032abe-6c4f-4711-9f48-5733d6a29ec3" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.074998 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="57032abe-6c4f-4711-9f48-5733d6a29ec3" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: E0308 19:52:21.075011 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a7dd20b-387a-4061-ab5a-a53ee6a240ef" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075018 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a7dd20b-387a-4061-ab5a-a53ee6a240ef" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: E0308 19:52:21.075026 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="915cd482-d3dc-42c1-96cc-0fcc18bbaff2" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075032 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="915cd482-d3dc-42c1-96cc-0fcc18bbaff2" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: E0308 19:52:21.075041 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="618b5189-8b29-473f-b59c-e911fca71041" containerName="glance-db-sync" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075046 4885 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="618b5189-8b29-473f-b59c-e911fca71041" containerName="glance-db-sync" Mar 08 19:52:21 crc kubenswrapper[4885]: E0308 19:52:21.075059 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85d25daf-f279-4be1-be4a-75e05e47923c" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075065 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="85d25daf-f279-4be1-be4a-75e05e47923c" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: E0308 19:52:21.075073 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075079 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075211 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a7dd20b-387a-4061-ab5a-a53ee6a240ef" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075224 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="618b5189-8b29-473f-b59c-e911fca71041" containerName="glance-db-sync" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075237 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075244 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf04089-b7ff-4c4d-acad-f41d45ac6bfb" containerName="mariadb-database-create" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075251 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="85d25daf-f279-4be1-be4a-75e05e47923c" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075258 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="57032abe-6c4f-4711-9f48-5733d6a29ec3" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.075269 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="915cd482-d3dc-42c1-96cc-0fcc18bbaff2" containerName="mariadb-account-create-update" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.078032 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.092856 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-nxtzx"] Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.173861 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-config\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.173961 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-nb\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.173988 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6nhq\" (UniqueName: \"kubernetes.io/projected/c6fd8def-cffc-4f64-9805-55040dae82c6-kube-api-access-l6nhq\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.174098 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-dns-svc\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.174127 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-sb\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.275638 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-nb\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.275913 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6nhq\" (UniqueName: \"kubernetes.io/projected/c6fd8def-cffc-4f64-9805-55040dae82c6-kube-api-access-l6nhq\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.276016 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-dns-svc\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.276040 4885 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-sb\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.276080 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-config\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.277137 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-config\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.279625 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-dns-svc\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.279558 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-sb\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.279950 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-nb\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.292131 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6nhq\" (UniqueName: \"kubernetes.io/projected/c6fd8def-cffc-4f64-9805-55040dae82c6-kube-api-access-l6nhq\") pod \"dnsmasq-dns-7f58d6bb6f-nxtzx\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.493257 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.690510 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zfp8t" event={"ID":"321f89cf-ed1f-4f10-a198-e55c23171363","Type":"ContainerStarted","Data":"1f2b4371c693a384eeb8722a9c031b14ca5b16214abf49aab649bdf051aaa6a9"} Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.706013 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd"} Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.706262 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc"} Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.716486 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-zfp8t" podStartSLOduration=2.35271424 podStartE2EDuration="8.716474049s" podCreationTimestamp="2026-03-08 19:52:13 +0000 UTC" firstStartedPulling="2026-03-08 19:52:14.703507225 +0000 UTC m=+1236.099561248" lastFinishedPulling="2026-03-08 19:52:21.067267034 +0000 UTC m=+1242.463321057" observedRunningTime="2026-03-08 19:52:21.712663247 +0000 UTC m=+1243.108717270" watchObservedRunningTime="2026-03-08 19:52:21.716474049 +0000 UTC m=+1243.112528072" Mar 08 19:52:21 crc kubenswrapper[4885]: I0308 19:52:21.790964 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-nxtzx"] Mar 08 19:52:22 crc kubenswrapper[4885]: I0308 19:52:22.051098 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 19:52:22 crc kubenswrapper[4885]: I0308 19:52:22.719953 4885 generic.go:334] "Generic (PLEG): container finished" podID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerID="5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf" exitCode=0 Mar 08 19:52:22 crc kubenswrapper[4885]: I0308 19:52:22.720044 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" event={"ID":"c6fd8def-cffc-4f64-9805-55040dae82c6","Type":"ContainerDied","Data":"5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf"} Mar 08 19:52:22 crc kubenswrapper[4885]: I0308 19:52:22.720081 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" event={"ID":"c6fd8def-cffc-4f64-9805-55040dae82c6","Type":"ContainerStarted","Data":"2371621d73898b5a7be293c5570047480578f0621eb75c65a2d656b40ff52a84"} Mar 08 19:52:22 crc kubenswrapper[4885]: I0308 19:52:22.750056 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f"} Mar 08 19:52:22 crc kubenswrapper[4885]: I0308 19:52:22.750116 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81"} Mar 08 19:52:23 crc kubenswrapper[4885]: I0308 19:52:23.831014 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3"} Mar 08 19:52:23 crc kubenswrapper[4885]: I0308 19:52:23.831355 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd"} Mar 08 19:52:23 crc kubenswrapper[4885]: I0308 19:52:23.856826 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" event={"ID":"c6fd8def-cffc-4f64-9805-55040dae82c6","Type":"ContainerStarted","Data":"a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef"} Mar 08 19:52:23 crc kubenswrapper[4885]: I0308 19:52:23.857816 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:23 crc kubenswrapper[4885]: I0308 19:52:23.895406 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" podStartSLOduration=2.895386572 podStartE2EDuration="2.895386572s" podCreationTimestamp="2026-03-08 19:52:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:23.881609533 +0000 UTC m=+1245.277663556" watchObservedRunningTime="2026-03-08 19:52:23.895386572 +0000 UTC m=+1245.291440595" Mar 08 19:52:24 crc kubenswrapper[4885]: I0308 19:52:24.870192 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2"} Mar 08 19:52:24 crc kubenswrapper[4885]: I0308 19:52:24.870465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb"} Mar 08 19:52:24 crc kubenswrapper[4885]: I0308 19:52:24.870480 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca"} Mar 08 19:52:24 crc kubenswrapper[4885]: I0308 19:52:24.870488 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474"} Mar 08 19:52:24 crc kubenswrapper[4885]: I0308 19:52:24.870497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerStarted","Data":"37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a"} Mar 08 19:52:24 crc kubenswrapper[4885]: I0308 19:52:24.913132 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=35.372379833 podStartE2EDuration="45.913114598s" podCreationTimestamp="2026-03-08 19:51:39 +0000 UTC" firstStartedPulling="2026-03-08 19:52:12.789843201 +0000 UTC m=+1234.185897234" lastFinishedPulling="2026-03-08 19:52:23.330577976 +0000 UTC m=+1244.726631999" observedRunningTime="2026-03-08 
19:52:24.906300356 +0000 UTC m=+1246.302354379" watchObservedRunningTime="2026-03-08 19:52:24.913114598 +0000 UTC m=+1246.309168611" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.177626 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-nxtzx"] Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.208096 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-d9vgh"] Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.209765 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.219730 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-d9vgh"] Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.224047 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.356633 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.356814 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.356878 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cclxk\" (UniqueName: \"kubernetes.io/projected/b502bd3e-eafb-44cf-a81e-c10d647302a4-kube-api-access-cclxk\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.357046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.357111 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.357158 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-config\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.458993 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-config\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.459084 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.459167 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.459190 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cclxk\" (UniqueName: \"kubernetes.io/projected/b502bd3e-eafb-44cf-a81e-c10d647302a4-kube-api-access-cclxk\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.459210 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.459226 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.459905 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-config\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.459955 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-sb\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.460479 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-swift-storage-0\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.460969 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-svc\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.461390 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-nb\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.477693 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cclxk\" (UniqueName: \"kubernetes.io/projected/b502bd3e-eafb-44cf-a81e-c10d647302a4-kube-api-access-cclxk\") pod \"dnsmasq-dns-75c886f8b5-d9vgh\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.537885 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.887477 4885 generic.go:334] "Generic (PLEG): container finished" podID="321f89cf-ed1f-4f10-a198-e55c23171363" containerID="1f2b4371c693a384eeb8722a9c031b14ca5b16214abf49aab649bdf051aaa6a9" exitCode=0 Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.887627 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zfp8t" event={"ID":"321f89cf-ed1f-4f10-a198-e55c23171363","Type":"ContainerDied","Data":"1f2b4371c693a384eeb8722a9c031b14ca5b16214abf49aab649bdf051aaa6a9"} Mar 08 19:52:25 crc kubenswrapper[4885]: I0308 19:52:25.956723 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-d9vgh"] Mar 08 19:52:26 crc kubenswrapper[4885]: I0308 19:52:26.900810 4885 generic.go:334] "Generic (PLEG): container finished" podID="b502bd3e-eafb-44cf-a81e-c10d647302a4" containerID="bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70" exitCode=0 Mar 08 19:52:26 crc kubenswrapper[4885]: I0308 19:52:26.900894 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" event={"ID":"b502bd3e-eafb-44cf-a81e-c10d647302a4","Type":"ContainerDied","Data":"bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70"} Mar 08 19:52:26 crc kubenswrapper[4885]: I0308 19:52:26.901590 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" event={"ID":"b502bd3e-eafb-44cf-a81e-c10d647302a4","Type":"ContainerStarted","Data":"a9372b012e69a1c2d0bad407661605d16c694bf0bbe0256a7ac15648fdbf0a98"} Mar 08 19:52:26 crc kubenswrapper[4885]: I0308 19:52:26.901742 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" podUID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerName="dnsmasq-dns" containerID="cri-o://a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef" gracePeriod=10 Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.232836 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-zfp8t" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.312039 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389173 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-config-data\") pod \"321f89cf-ed1f-4f10-a198-e55c23171363\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389262 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-nb\") pod \"c6fd8def-cffc-4f64-9805-55040dae82c6\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389362 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-combined-ca-bundle\") pod \"321f89cf-ed1f-4f10-a198-e55c23171363\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389399 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6nhq\" (UniqueName: \"kubernetes.io/projected/c6fd8def-cffc-4f64-9805-55040dae82c6-kube-api-access-l6nhq\") pod \"c6fd8def-cffc-4f64-9805-55040dae82c6\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389438 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvwxr\" (UniqueName: \"kubernetes.io/projected/321f89cf-ed1f-4f10-a198-e55c23171363-kube-api-access-xvwxr\") pod \"321f89cf-ed1f-4f10-a198-e55c23171363\" (UID: \"321f89cf-ed1f-4f10-a198-e55c23171363\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389464 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-dns-svc\") pod \"c6fd8def-cffc-4f64-9805-55040dae82c6\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389522 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-sb\") pod \"c6fd8def-cffc-4f64-9805-55040dae82c6\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.389539 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-config\") pod \"c6fd8def-cffc-4f64-9805-55040dae82c6\" (UID: \"c6fd8def-cffc-4f64-9805-55040dae82c6\") " Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.395078 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/321f89cf-ed1f-4f10-a198-e55c23171363-kube-api-access-xvwxr" (OuterVolumeSpecName: "kube-api-access-xvwxr") pod "321f89cf-ed1f-4f10-a198-e55c23171363" (UID: "321f89cf-ed1f-4f10-a198-e55c23171363"). InnerVolumeSpecName "kube-api-access-xvwxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.395848 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6fd8def-cffc-4f64-9805-55040dae82c6-kube-api-access-l6nhq" (OuterVolumeSpecName: "kube-api-access-l6nhq") pod "c6fd8def-cffc-4f64-9805-55040dae82c6" (UID: "c6fd8def-cffc-4f64-9805-55040dae82c6"). InnerVolumeSpecName "kube-api-access-l6nhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.421929 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "321f89cf-ed1f-4f10-a198-e55c23171363" (UID: "321f89cf-ed1f-4f10-a198-e55c23171363"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.432622 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-config" (OuterVolumeSpecName: "config") pod "c6fd8def-cffc-4f64-9805-55040dae82c6" (UID: "c6fd8def-cffc-4f64-9805-55040dae82c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.436319 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c6fd8def-cffc-4f64-9805-55040dae82c6" (UID: "c6fd8def-cffc-4f64-9805-55040dae82c6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.438400 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c6fd8def-cffc-4f64-9805-55040dae82c6" (UID: "c6fd8def-cffc-4f64-9805-55040dae82c6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.439015 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c6fd8def-cffc-4f64-9805-55040dae82c6" (UID: "c6fd8def-cffc-4f64-9805-55040dae82c6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.445877 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-config-data" (OuterVolumeSpecName: "config-data") pod "321f89cf-ed1f-4f10-a198-e55c23171363" (UID: "321f89cf-ed1f-4f10-a198-e55c23171363"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491467 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6nhq\" (UniqueName: \"kubernetes.io/projected/c6fd8def-cffc-4f64-9805-55040dae82c6-kube-api-access-l6nhq\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491498 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvwxr\" (UniqueName: \"kubernetes.io/projected/321f89cf-ed1f-4f10-a198-e55c23171363-kube-api-access-xvwxr\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491511 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491522 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491530 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491540 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491549 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6fd8def-cffc-4f64-9805-55040dae82c6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.491557 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/321f89cf-ed1f-4f10-a198-e55c23171363-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.912089 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" event={"ID":"b502bd3e-eafb-44cf-a81e-c10d647302a4","Type":"ContainerStarted","Data":"efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920"} Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.912319 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.913880 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-zfp8t" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.914162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-zfp8t" event={"ID":"321f89cf-ed1f-4f10-a198-e55c23171363","Type":"ContainerDied","Data":"ba35ec27137480c28a7924c7547829e67cb3c39063decc95b516bf7fa2fd263c"} Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.914200 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba35ec27137480c28a7924c7547829e67cb3c39063decc95b516bf7fa2fd263c" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.915979 4885 generic.go:334] "Generic (PLEG): container finished" podID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerID="a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef" exitCode=0 Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.916020 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" event={"ID":"c6fd8def-cffc-4f64-9805-55040dae82c6","Type":"ContainerDied","Data":"a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef"} Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.916043 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" event={"ID":"c6fd8def-cffc-4f64-9805-55040dae82c6","Type":"ContainerDied","Data":"2371621d73898b5a7be293c5570047480578f0621eb75c65a2d656b40ff52a84"} Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.916063 4885 scope.go:117] "RemoveContainer" containerID="a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.916184 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f58d6bb6f-nxtzx" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.955430 4885 scope.go:117] "RemoveContainer" containerID="5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.956709 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" podStartSLOduration=2.956687492 podStartE2EDuration="2.956687492s" podCreationTimestamp="2026-03-08 19:52:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:27.94805823 +0000 UTC m=+1249.344112263" watchObservedRunningTime="2026-03-08 19:52:27.956687492 +0000 UTC m=+1249.352741525" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.976040 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-nxtzx"] Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.992381 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f58d6bb6f-nxtzx"] Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.992531 4885 scope.go:117] "RemoveContainer" containerID="a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef" Mar 08 19:52:27 crc kubenswrapper[4885]: E0308 19:52:27.993008 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef\": container with ID starting with a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef not found: ID does not exist" 
containerID="a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.993040 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef"} err="failed to get container status \"a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef\": rpc error: code = NotFound desc = could not find container \"a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef\": container with ID starting with a944e9abe1e76bc44ad781615288a705cd64966962f8bc873c45b75ef016daef not found: ID does not exist" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.993058 4885 scope.go:117] "RemoveContainer" containerID="5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf" Mar 08 19:52:27 crc kubenswrapper[4885]: E0308 19:52:27.993415 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf\": container with ID starting with 5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf not found: ID does not exist" containerID="5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf" Mar 08 19:52:27 crc kubenswrapper[4885]: I0308 19:52:27.993449 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf"} err="failed to get container status \"5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf\": rpc error: code = NotFound desc = could not find container \"5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf\": container with ID starting with 5e89a32258ba7c2730b753df36193155946a0243aa39e0568a6e07c2a1835cdf not found: ID does not exist" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.161635 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-d9vgh"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.234440 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-znfl6"] Mar 08 19:52:28 crc kubenswrapper[4885]: E0308 19:52:28.234769 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerName="init" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.234784 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerName="init" Mar 08 19:52:28 crc kubenswrapper[4885]: E0308 19:52:28.234813 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerName="dnsmasq-dns" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.234820 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerName="dnsmasq-dns" Mar 08 19:52:28 crc kubenswrapper[4885]: E0308 19:52:28.234830 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="321f89cf-ed1f-4f10-a198-e55c23171363" containerName="keystone-db-sync" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.234837 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="321f89cf-ed1f-4f10-a198-e55c23171363" containerName="keystone-db-sync" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.234994 4885 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="321f89cf-ed1f-4f10-a198-e55c23171363" containerName="keystone-db-sync" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.235014 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6fd8def-cffc-4f64-9805-55040dae82c6" containerName="dnsmasq-dns" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.235791 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.304067 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-n7295"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.306504 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.310638 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.311058 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.311103 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.311258 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nj2kw" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.311981 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.329792 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n7295"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.342269 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-znfl6"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.408726 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.408776 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.408859 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-combined-ca-bundle\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.408880 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " 
pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.408900 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8l7x\" (UniqueName: \"kubernetes.io/projected/6a7fcdff-b452-4cc7-becc-65a43827b50b-kube-api-access-c8l7x\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.408998 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-config\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.409020 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-fernet-keys\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.409049 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-svc\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.409065 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-config-data\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.409084 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-scripts\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.409101 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5qff\" (UniqueName: \"kubernetes.io/projected/991f909f-207b-4663-adea-a4f8cd0c1cb6-kube-api-access-m5qff\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.409125 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-credential-keys\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512731 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-combined-ca-bundle\") pod \"keystone-bootstrap-n7295\" (UID: 
\"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512772 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512792 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8l7x\" (UniqueName: \"kubernetes.io/projected/6a7fcdff-b452-4cc7-becc-65a43827b50b-kube-api-access-c8l7x\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512825 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-config\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512843 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-fernet-keys\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512874 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-svc\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512892 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-config-data\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512912 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-scripts\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512957 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5qff\" (UniqueName: \"kubernetes.io/projected/991f909f-207b-4663-adea-a4f8cd0c1cb6-kube-api-access-m5qff\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.512978 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-credential-keys\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 
19:52:28.513014 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.513035 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.514028 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-swift-storage-0\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.514770 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-svc\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.515808 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-nb\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.516344 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-config\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.521473 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-scripts\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.523572 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-sb\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.531763 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-config-data\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.539366 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-credential-keys\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.543549 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-fernet-keys\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.546954 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5qff\" (UniqueName: \"kubernetes.io/projected/991f909f-207b-4663-adea-a4f8cd0c1cb6-kube-api-access-m5qff\") pod \"dnsmasq-dns-5985c59c55-znfl6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.552073 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.555272 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-combined-ca-bundle\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.580454 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8l7x\" (UniqueName: \"kubernetes.io/projected/6a7fcdff-b452-4cc7-becc-65a43827b50b-kube-api-access-c8l7x\") pod \"keystone-bootstrap-n7295\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.615334 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-mp8c9"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.616337 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.628495 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.628812 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x9f48" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.628958 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.631246 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.714080 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mp8c9"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.722148 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-combined-ca-bundle\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.722265 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-config-data\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.722307 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2efc22fd-a92b-422c-876d-7b80f06928b2-etc-machine-id\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.722414 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-scripts\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.722510 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-db-sync-config-data\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.722664 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlqnv\" (UniqueName: \"kubernetes.io/projected/2efc22fd-a92b-422c-876d-7b80f06928b2-kube-api-access-wlqnv\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.745597 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-zz9c5"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.750805 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.754889 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.755074 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kg22q" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.755369 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824083 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-scripts\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824139 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-db-sync-config-data\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824178 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-combined-ca-bundle\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824204 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlqnv\" (UniqueName: \"kubernetes.io/projected/2efc22fd-a92b-422c-876d-7b80f06928b2-kube-api-access-wlqnv\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824227 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-combined-ca-bundle\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824246 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-config-data\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824267 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-logs\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824288 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-config-data\") pod \"cinder-db-sync-mp8c9\" (UID: 
\"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824314 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2efc22fd-a92b-422c-876d-7b80f06928b2-etc-machine-id\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824350 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-scripts\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.824368 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhz2b\" (UniqueName: \"kubernetes.io/projected/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-kube-api-access-mhz2b\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.825303 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zz9c5"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.825473 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2efc22fd-a92b-422c-876d-7b80f06928b2-etc-machine-id\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.857375 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-config-data\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.858435 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-scripts\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.858523 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-kpvl2"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.859749 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.860872 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-combined-ca-bundle\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.864811 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-db-sync-config-data\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.864946 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlqnv\" (UniqueName: \"kubernetes.io/projected/2efc22fd-a92b-422c-876d-7b80f06928b2-kube-api-access-wlqnv\") pod \"cinder-db-sync-mp8c9\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.872936 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-znfl6"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.874683 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.874894 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.883048 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kpvl2"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.886178 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xgk7z" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.928946 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhz2b\" (UniqueName: \"kubernetes.io/projected/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-kube-api-access-mhz2b\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.928993 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-combined-ca-bundle\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.929015 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg4r7\" (UniqueName: \"kubernetes.io/projected/b04611e7-17b5-48ae-8169-534f684a101b-kube-api-access-dg4r7\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.929053 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-scripts\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " 
pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.929103 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-combined-ca-bundle\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.929146 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-config-data\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.929165 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-logs\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.929211 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-config\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.933317 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-gscv2"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.934406 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-logs\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.934776 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.938513 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-combined-ca-bundle\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.944534 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-config-data\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.945144 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-scripts\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.960772 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhz2b\" (UniqueName: \"kubernetes.io/projected/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-kube-api-access-mhz2b\") pod \"placement-db-sync-zz9c5\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.962293 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-fwjsc"] Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.963494 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.997521 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 08 19:52:28 crc kubenswrapper[4885]: I0308 19:52:28.997805 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zrht4" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.000286 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-gscv2"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.007034 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fwjsc"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.014414 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.016652 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.021615 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.021735 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.022070 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030260 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-db-sync-config-data\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030302 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030351 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m92h4\" (UniqueName: \"kubernetes.io/projected/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-kube-api-access-m92h4\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030382 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-config\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030439 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030477 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6f64\" (UniqueName: \"kubernetes.io/projected/46290bd2-6ad7-46f4-86f4-48aa73bc304a-kube-api-access-w6f64\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030510 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-config\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030541 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030570 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030587 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-combined-ca-bundle\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030622 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg4r7\" (UniqueName: \"kubernetes.io/projected/b04611e7-17b5-48ae-8169-534f684a101b-kube-api-access-dg4r7\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.030641 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-combined-ca-bundle\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.037774 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-combined-ca-bundle\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.039709 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-config\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.055660 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg4r7\" (UniqueName: \"kubernetes.io/projected/b04611e7-17b5-48ae-8169-534f684a101b-kube-api-access-dg4r7\") pod \"neutron-db-sync-kpvl2\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.107609 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.131588 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.132395 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6f64\" (UniqueName: \"kubernetes.io/projected/46290bd2-6ad7-46f4-86f4-48aa73bc304a-kube-api-access-w6f64\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.132490 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.132801 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.132873 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-combined-ca-bundle\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.132943 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-scripts\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-run-httpd\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133071 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-db-sync-config-data\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133138 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133158 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-log-httpd\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133172 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133213 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-config-data\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133277 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m92h4\" (UniqueName: \"kubernetes.io/projected/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-kube-api-access-m92h4\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133325 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-config\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133338 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-nb\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133353 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bkw5\" (UniqueName: \"kubernetes.io/projected/624830da-2b73-4843-bb04-6db9c1a7b281-kube-api-access-5bkw5\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133394 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133414 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.133704 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-svc\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.134212 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-swift-storage-0\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.134398 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-sb\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.134986 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-config\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.139848 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-db-sync-config-data\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.152181 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6f64\" (UniqueName: \"kubernetes.io/projected/46290bd2-6ad7-46f4-86f4-48aa73bc304a-kube-api-access-w6f64\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.152531 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-combined-ca-bundle\") pod \"barbican-db-sync-fwjsc\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.165619 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m92h4\" (UniqueName: \"kubernetes.io/projected/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-kube-api-access-m92h4\") pod \"dnsmasq-dns-ccd7c9f8f-gscv2\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.193693 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.235233 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bkw5\" (UniqueName: \"kubernetes.io/projected/624830da-2b73-4843-bb04-6db9c1a7b281-kube-api-access-5bkw5\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.235285 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.235367 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-scripts\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.235396 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-run-httpd\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.235426 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-log-httpd\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.235439 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.235457 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-config-data\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.236312 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-run-httpd\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.236714 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-log-httpd\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.244956 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-config-data\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " 
pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.245062 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.245442 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.248223 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-scripts\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.257166 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bkw5\" (UniqueName: \"kubernetes.io/projected/624830da-2b73-4843-bb04-6db9c1a7b281-kube-api-access-5bkw5\") pod \"ceilometer-0\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.268317 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.270577 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-znfl6"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.295845 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.332711 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.382133 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6fd8def-cffc-4f64-9805-55040dae82c6" path="/var/lib/kubelet/pods/c6fd8def-cffc-4f64-9805-55040dae82c6/volumes" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.437573 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-n7295"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.578959 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.580509 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.586146 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.586317 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.586605 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zffrj" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.586800 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.601064 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.672035 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.674050 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.677346 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.677490 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757207 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljzt7\" (UniqueName: \"kubernetes.io/projected/138479fb-965c-4e04-bd95-c7a683a5b0be-kube-api-access-ljzt7\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757327 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-config-data\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757355 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-scripts\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757401 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757480 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: 
\"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757507 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757544 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.757565 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-logs\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.771856 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.814853 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mp8c9"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859383 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859446 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-logs\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859473 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859550 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljzt7\" (UniqueName: \"kubernetes.io/projected/138479fb-965c-4e04-bd95-c7a683a5b0be-kube-api-access-ljzt7\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859593 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-config-data\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " 
pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859618 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859638 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-scripts\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859675 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq4fc\" (UniqueName: \"kubernetes.io/projected/68f1c367-d6c5-4e77-a051-f73fddfa1085-kube-api-access-kq4fc\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859700 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-config-data\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859715 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-scripts\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859826 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859871 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859912 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-logs\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" 
Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859954 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.859971 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.860374 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.860414 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-logs\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.860607 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.865631 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.866760 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.875363 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-config-data\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.879529 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-scripts\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.892792 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljzt7\" (UniqueName: \"kubernetes.io/projected/138479fb-965c-4e04-bd95-c7a683a5b0be-kube-api-access-ljzt7\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.915279 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.942587 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-zz9c5"] Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.961650 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.961720 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-logs\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.961738 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.961779 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.961829 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-config-data\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.961853 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.961878 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-scripts\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: 
I0308 19:52:29.961900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq4fc\" (UniqueName: \"kubernetes.io/projected/68f1c367-d6c5-4e77-a051-f73fddfa1085-kube-api-access-kq4fc\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.963395 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-logs\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.963465 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.963531 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.970534 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-scripts\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.975146 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.977379 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.980580 4885 generic.go:334] "Generic (PLEG): container finished" podID="991f909f-207b-4663-adea-a4f8cd0c1cb6" containerID="24dcdf1aa7d0ce84609119c93df1bffbdbadae5f02169074317cbae4aeab74f6" exitCode=0 Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.980867 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-znfl6" event={"ID":"991f909f-207b-4663-adea-a4f8cd0c1cb6","Type":"ContainerDied","Data":"24dcdf1aa7d0ce84609119c93df1bffbdbadae5f02169074317cbae4aeab74f6"} Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.980896 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-znfl6" event={"ID":"991f909f-207b-4663-adea-a4f8cd0c1cb6","Type":"ContainerStarted","Data":"cf44c30d4d8bfe81037b523dba452fde19d3e3d6ea7ee6356c19f638cd1925fd"} Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.987829 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.991422 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-config-data\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.991420 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zz9c5" event={"ID":"3dcb0d23-1927-4f70-ac45-bcc01f9a081a","Type":"ContainerStarted","Data":"25f4ab49981792217a76b7fdf7f3c26a7cebd2e01dffa1cebd3dfdfec45e537a"} Mar 08 19:52:29 crc kubenswrapper[4885]: I0308 19:52:29.995444 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq4fc\" (UniqueName: \"kubernetes.io/projected/68f1c367-d6c5-4e77-a051-f73fddfa1085-kube-api-access-kq4fc\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.000207 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n7295" event={"ID":"6a7fcdff-b452-4cc7-becc-65a43827b50b","Type":"ContainerStarted","Data":"74b67198f27ef75d7d190ca202d1aaac73c4511ab6579ab6cc6cd813c7ff04f7"} Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.000255 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n7295" event={"ID":"6a7fcdff-b452-4cc7-becc-65a43827b50b","Type":"ContainerStarted","Data":"9cbc3dacc4f542eaf92c4ec2cfcb1c4cb3f323ffff727ffe3e01a0953b94c96a"} Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.002083 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.013175 4885 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" podUID="b502bd3e-eafb-44cf-a81e-c10d647302a4" containerName="dnsmasq-dns" containerID="cri-o://efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920" gracePeriod=10 Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.013287 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mp8c9" event={"ID":"2efc22fd-a92b-422c-876d-7b80f06928b2","Type":"ContainerStarted","Data":"d6ed11f43445c4cfcc1c38c863e1b91670c997a17a500e0e306eca10269bbe60"} Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.030630 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-n7295" podStartSLOduration=2.030612484 podStartE2EDuration="2.030612484s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:30.028442376 +0000 UTC m=+1251.424496399" watchObservedRunningTime="2026-03-08 19:52:30.030612484 +0000 UTC m=+1251.426666507" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.085747 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kpvl2"] Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.126658 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.189866 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-fwjsc"] Mar 08 19:52:30 crc kubenswrapper[4885]: W0308 19:52:30.204537 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46290bd2_6ad7_46f4_86f4_48aa73bc304a.slice/crio-e55e97244abd013b52340deec77649890bb853264595abb22df0c82e67349268 WatchSource:0}: Error finding container e55e97244abd013b52340deec77649890bb853264595abb22df0c82e67349268: Status 404 returned error can't find the container with id e55e97244abd013b52340deec77649890bb853264595abb22df0c82e67349268 Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.338634 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-gscv2"] Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.368972 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.505074 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.539913 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.582266 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-sb\") pod \"991f909f-207b-4663-adea-a4f8cd0c1cb6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.582612 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-swift-storage-0\") pod \"991f909f-207b-4663-adea-a4f8cd0c1cb6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.582642 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-svc\") pod \"991f909f-207b-4663-adea-a4f8cd0c1cb6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.582723 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-nb\") pod \"991f909f-207b-4663-adea-a4f8cd0c1cb6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.584444 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5qff\" (UniqueName: \"kubernetes.io/projected/991f909f-207b-4663-adea-a4f8cd0c1cb6-kube-api-access-m5qff\") pod \"991f909f-207b-4663-adea-a4f8cd0c1cb6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.584569 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-config\") pod \"991f909f-207b-4663-adea-a4f8cd0c1cb6\" (UID: \"991f909f-207b-4663-adea-a4f8cd0c1cb6\") " Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.591809 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/991f909f-207b-4663-adea-a4f8cd0c1cb6-kube-api-access-m5qff" (OuterVolumeSpecName: "kube-api-access-m5qff") pod "991f909f-207b-4663-adea-a4f8cd0c1cb6" (UID: "991f909f-207b-4663-adea-a4f8cd0c1cb6"). InnerVolumeSpecName "kube-api-access-m5qff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.631164 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.640209 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "991f909f-207b-4663-adea-a4f8cd0c1cb6" (UID: "991f909f-207b-4663-adea-a4f8cd0c1cb6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.690080 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.690114 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5qff\" (UniqueName: \"kubernetes.io/projected/991f909f-207b-4663-adea-a4f8cd0c1cb6-kube-api-access-m5qff\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.698485 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "991f909f-207b-4663-adea-a4f8cd0c1cb6" (UID: "991f909f-207b-4663-adea-a4f8cd0c1cb6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.706457 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "991f909f-207b-4663-adea-a4f8cd0c1cb6" (UID: "991f909f-207b-4663-adea-a4f8cd0c1cb6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.742014 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "991f909f-207b-4663-adea-a4f8cd0c1cb6" (UID: "991f909f-207b-4663-adea-a4f8cd0c1cb6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.749442 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-config" (OuterVolumeSpecName: "config") pod "991f909f-207b-4663-adea-a4f8cd0c1cb6" (UID: "991f909f-207b-4663-adea-a4f8cd0c1cb6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.771325 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.793480 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.793503 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.793543 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.793556 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/991f909f-207b-4663-adea-a4f8cd0c1cb6-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.809260 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:52:30 crc kubenswrapper[4885]: W0308 19:52:30.846311 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod138479fb_965c_4e04_bd95_c7a683a5b0be.slice/crio-7bb80c6ac30c275247f5237a85b2e40b7de54586f2af99cfba7b3c794012d9a1 WatchSource:0}: Error finding container 7bb80c6ac30c275247f5237a85b2e40b7de54586f2af99cfba7b3c794012d9a1: Status 404 returned error can't find the container with id 7bb80c6ac30c275247f5237a85b2e40b7de54586f2af99cfba7b3c794012d9a1 Mar 08 19:52:30 crc kubenswrapper[4885]: I0308 19:52:30.920950 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.031267 4885 generic.go:334] "Generic (PLEG): container finished" podID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerID="1640371a030e2f7def6f8a64bd3eab03cae3c318d2de3e683f5f0df21e52c94a" exitCode=0 Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.031330 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" event={"ID":"4e1d0a39-d199-43d0-bdea-33c0ecfff06f","Type":"ContainerDied","Data":"1640371a030e2f7def6f8a64bd3eab03cae3c318d2de3e683f5f0df21e52c94a"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.031355 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" event={"ID":"4e1d0a39-d199-43d0-bdea-33c0ecfff06f","Type":"ContainerStarted","Data":"3afeb0eb326591c1d8125d846debd099f1fede59cbbe242947add0ce27db5e6c"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.042166 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerStarted","Data":"7df05fec0614fac93aab8ae29777fc59abe79ffd7aca5937f60cd7d723e3ec63"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.053770 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.082361 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fwjsc" event={"ID":"46290bd2-6ad7-46f4-86f4-48aa73bc304a","Type":"ContainerStarted","Data":"e55e97244abd013b52340deec77649890bb853264595abb22df0c82e67349268"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.097306 4885 generic.go:334] "Generic (PLEG): container finished" podID="b502bd3e-eafb-44cf-a81e-c10d647302a4" containerID="efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920" exitCode=0 Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.097358 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" event={"ID":"b502bd3e-eafb-44cf-a81e-c10d647302a4","Type":"ContainerDied","Data":"efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.097383 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" event={"ID":"b502bd3e-eafb-44cf-a81e-c10d647302a4","Type":"ContainerDied","Data":"a9372b012e69a1c2d0bad407661605d16c694bf0bbe0256a7ac15648fdbf0a98"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.097400 4885 scope.go:117] "RemoveContainer" containerID="efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.097513 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75c886f8b5-d9vgh" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.099027 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-sb\") pod \"b502bd3e-eafb-44cf-a81e-c10d647302a4\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.099090 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-config\") pod \"b502bd3e-eafb-44cf-a81e-c10d647302a4\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.099122 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-nb\") pod \"b502bd3e-eafb-44cf-a81e-c10d647302a4\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.099152 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-svc\") pod \"b502bd3e-eafb-44cf-a81e-c10d647302a4\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.099198 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-swift-storage-0\") pod \"b502bd3e-eafb-44cf-a81e-c10d647302a4\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.099274 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cclxk\" (UniqueName: \"kubernetes.io/projected/b502bd3e-eafb-44cf-a81e-c10d647302a4-kube-api-access-cclxk\") pod \"b502bd3e-eafb-44cf-a81e-c10d647302a4\" (UID: \"b502bd3e-eafb-44cf-a81e-c10d647302a4\") " Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.107621 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5985c59c55-znfl6" event={"ID":"991f909f-207b-4663-adea-a4f8cd0c1cb6","Type":"ContainerDied","Data":"cf44c30d4d8bfe81037b523dba452fde19d3e3d6ea7ee6356c19f638cd1925fd"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.107757 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5985c59c55-znfl6" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.110675 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b502bd3e-eafb-44cf-a81e-c10d647302a4-kube-api-access-cclxk" (OuterVolumeSpecName: "kube-api-access-cclxk") pod "b502bd3e-eafb-44cf-a81e-c10d647302a4" (UID: "b502bd3e-eafb-44cf-a81e-c10d647302a4"). InnerVolumeSpecName "kube-api-access-cclxk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.121070 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kpvl2" event={"ID":"b04611e7-17b5-48ae-8169-534f684a101b","Type":"ContainerStarted","Data":"f13c64b3a8cac3c8bcb02e4b62a77799ac33e44598003cd3a842dd2a34fd0963"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.121116 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kpvl2" event={"ID":"b04611e7-17b5-48ae-8169-534f684a101b","Type":"ContainerStarted","Data":"a46189fa57c904c31a1baeb88b028a78fe837ba28e4a7c2248f80625bc82f539"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.125717 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"138479fb-965c-4e04-bd95-c7a683a5b0be","Type":"ContainerStarted","Data":"7bb80c6ac30c275247f5237a85b2e40b7de54586f2af99cfba7b3c794012d9a1"} Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.151623 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b502bd3e-eafb-44cf-a81e-c10d647302a4" (UID: "b502bd3e-eafb-44cf-a81e-c10d647302a4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.166818 4885 scope.go:117] "RemoveContainer" containerID="bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.202137 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-kpvl2" podStartSLOduration=3.18691909 podStartE2EDuration="3.18691909s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:31.14024914 +0000 UTC m=+1252.536303173" watchObservedRunningTime="2026-03-08 19:52:31.18691909 +0000 UTC m=+1252.582973113" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.206709 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cclxk\" (UniqueName: \"kubernetes.io/projected/b502bd3e-eafb-44cf-a81e-c10d647302a4-kube-api-access-cclxk\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.206752 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.236047 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b502bd3e-eafb-44cf-a81e-c10d647302a4" (UID: "b502bd3e-eafb-44cf-a81e-c10d647302a4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.248458 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b502bd3e-eafb-44cf-a81e-c10d647302a4" (UID: "b502bd3e-eafb-44cf-a81e-c10d647302a4"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.283619 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-config" (OuterVolumeSpecName: "config") pod "b502bd3e-eafb-44cf-a81e-c10d647302a4" (UID: "b502bd3e-eafb-44cf-a81e-c10d647302a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.309034 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-znfl6"] Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.309414 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.309442 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.309454 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.319042 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5985c59c55-znfl6"] Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.329529 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b502bd3e-eafb-44cf-a81e-c10d647302a4" (UID: "b502bd3e-eafb-44cf-a81e-c10d647302a4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.384664 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="991f909f-207b-4663-adea-a4f8cd0c1cb6" path="/var/lib/kubelet/pods/991f909f-207b-4663-adea-a4f8cd0c1cb6/volumes" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.423244 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b502bd3e-eafb-44cf-a81e-c10d647302a4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.446134 4885 scope.go:117] "RemoveContainer" containerID="efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920" Mar 08 19:52:31 crc kubenswrapper[4885]: E0308 19:52:31.495677 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920\": container with ID starting with efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920 not found: ID does not exist" containerID="efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.495736 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920"} err="failed to get container status \"efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920\": rpc error: code = NotFound desc = could not find container \"efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920\": container with ID starting with efef484aa7d059483ac1f0c639ab5675651a87a0ae5cf52bf9c8467bba20c920 not found: ID does not exist" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.495763 4885 scope.go:117] "RemoveContainer" containerID="bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70" Mar 08 19:52:31 crc kubenswrapper[4885]: E0308 19:52:31.505228 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70\": container with ID starting with bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70 not found: ID does not exist" containerID="bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.505506 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70"} err="failed to get container status \"bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70\": rpc error: code = NotFound desc = could not find container \"bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70\": container with ID starting with bc41fc22b6cadcd1a77358b4b7f79ad04d2c4d44a04496f4e19e8f27a62a2a70 not found: ID does not exist" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.505927 4885 scope.go:117] "RemoveContainer" containerID="24dcdf1aa7d0ce84609119c93df1bffbdbadae5f02169074317cbae4aeab74f6" Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.816595 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-d9vgh"] Mar 08 19:52:31 crc kubenswrapper[4885]: I0308 19:52:31.826282 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75c886f8b5-d9vgh"] Mar 08 
19:52:31 crc kubenswrapper[4885]: E0308 19:52:31.888746 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb502bd3e_eafb_44cf_a81e_c10d647302a4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb502bd3e_eafb_44cf_a81e_c10d647302a4.slice/crio-a9372b012e69a1c2d0bad407661605d16c694bf0bbe0256a7ac15648fdbf0a98\": RecentStats: unable to find data in memory cache]" Mar 08 19:52:32 crc kubenswrapper[4885]: I0308 19:52:32.163408 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" event={"ID":"4e1d0a39-d199-43d0-bdea-33c0ecfff06f","Type":"ContainerStarted","Data":"7cc6d42eb38cb1c77825df2d6223ceee7a5e25787b09d1e978bf98a5ce1ee38e"} Mar 08 19:52:32 crc kubenswrapper[4885]: I0308 19:52:32.163792 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:32 crc kubenswrapper[4885]: I0308 19:52:32.171180 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68f1c367-d6c5-4e77-a051-f73fddfa1085","Type":"ContainerStarted","Data":"34d894a56c80dc928bc7836837d51c9e6fa877a5201b35158a80f5ce7bf422e1"} Mar 08 19:52:32 crc kubenswrapper[4885]: I0308 19:52:32.206674 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"138479fb-965c-4e04-bd95-c7a683a5b0be","Type":"ContainerStarted","Data":"911fa3cdbcf965e7428fcda4a5efd3eb7bd888422b953698cab04537d5e51708"} Mar 08 19:52:33 crc kubenswrapper[4885]: I0308 19:52:33.220643 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"138479fb-965c-4e04-bd95-c7a683a5b0be","Type":"ContainerStarted","Data":"523c04f0937be42a22fbca9b82de419e54c01c986998bce3144f8515d06aaa53"} Mar 08 19:52:33 crc kubenswrapper[4885]: I0308 19:52:33.220753 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-log" containerID="cri-o://911fa3cdbcf965e7428fcda4a5efd3eb7bd888422b953698cab04537d5e51708" gracePeriod=30 Mar 08 19:52:33 crc kubenswrapper[4885]: I0308 19:52:33.221019 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-httpd" containerID="cri-o://523c04f0937be42a22fbca9b82de419e54c01c986998bce3144f8515d06aaa53" gracePeriod=30 Mar 08 19:52:33 crc kubenswrapper[4885]: I0308 19:52:33.235573 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68f1c367-d6c5-4e77-a051-f73fddfa1085","Type":"ContainerStarted","Data":"f1e60f93f4471d460063b494acaa641e15b5d2ebdd2145e718bfce9a4a9245aa"} Mar 08 19:52:33 crc kubenswrapper[4885]: I0308 19:52:33.247576 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" podStartSLOduration=5.247553467 podStartE2EDuration="5.247553467s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:32.202348005 +0000 UTC m=+1253.598402028" 
watchObservedRunningTime="2026-03-08 19:52:33.247553467 +0000 UTC m=+1254.643607490" Mar 08 19:52:33 crc kubenswrapper[4885]: I0308 19:52:33.255253 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.255235133 podStartE2EDuration="5.255235133s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:33.245842331 +0000 UTC m=+1254.641896354" watchObservedRunningTime="2026-03-08 19:52:33.255235133 +0000 UTC m=+1254.651289156" Mar 08 19:52:33 crc kubenswrapper[4885]: I0308 19:52:33.389667 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b502bd3e-eafb-44cf-a81e-c10d647302a4" path="/var/lib/kubelet/pods/b502bd3e-eafb-44cf-a81e-c10d647302a4/volumes" Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.256153 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68f1c367-d6c5-4e77-a051-f73fddfa1085","Type":"ContainerStarted","Data":"25f0208e3538d6118cfee075db8e7f7b7dec77ce4a76f77fd3385ea0e30208a9"} Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.256278 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-log" containerID="cri-o://f1e60f93f4471d460063b494acaa641e15b5d2ebdd2145e718bfce9a4a9245aa" gracePeriod=30 Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.256608 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-httpd" containerID="cri-o://25f0208e3538d6118cfee075db8e7f7b7dec77ce4a76f77fd3385ea0e30208a9" gracePeriod=30 Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.262401 4885 generic.go:334] "Generic (PLEG): container finished" podID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerID="523c04f0937be42a22fbca9b82de419e54c01c986998bce3144f8515d06aaa53" exitCode=0 Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.262432 4885 generic.go:334] "Generic (PLEG): container finished" podID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerID="911fa3cdbcf965e7428fcda4a5efd3eb7bd888422b953698cab04537d5e51708" exitCode=143 Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.262453 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"138479fb-965c-4e04-bd95-c7a683a5b0be","Type":"ContainerDied","Data":"523c04f0937be42a22fbca9b82de419e54c01c986998bce3144f8515d06aaa53"} Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.262478 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"138479fb-965c-4e04-bd95-c7a683a5b0be","Type":"ContainerDied","Data":"911fa3cdbcf965e7428fcda4a5efd3eb7bd888422b953698cab04537d5e51708"} Mar 08 19:52:34 crc kubenswrapper[4885]: I0308 19:52:34.278532 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.278513138 podStartE2EDuration="6.278513138s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:34.273867173 +0000 UTC 
m=+1255.669921196" watchObservedRunningTime="2026-03-08 19:52:34.278513138 +0000 UTC m=+1255.674567161" Mar 08 19:52:35 crc kubenswrapper[4885]: I0308 19:52:35.289744 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a7fcdff-b452-4cc7-becc-65a43827b50b" containerID="74b67198f27ef75d7d190ca202d1aaac73c4511ab6579ab6cc6cd813c7ff04f7" exitCode=0 Mar 08 19:52:35 crc kubenswrapper[4885]: I0308 19:52:35.289838 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-n7295" event={"ID":"6a7fcdff-b452-4cc7-becc-65a43827b50b","Type":"ContainerDied","Data":"74b67198f27ef75d7d190ca202d1aaac73c4511ab6579ab6cc6cd813c7ff04f7"} Mar 08 19:52:35 crc kubenswrapper[4885]: I0308 19:52:35.295193 4885 generic.go:334] "Generic (PLEG): container finished" podID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerID="25f0208e3538d6118cfee075db8e7f7b7dec77ce4a76f77fd3385ea0e30208a9" exitCode=0 Mar 08 19:52:35 crc kubenswrapper[4885]: I0308 19:52:35.295263 4885 generic.go:334] "Generic (PLEG): container finished" podID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerID="f1e60f93f4471d460063b494acaa641e15b5d2ebdd2145e718bfce9a4a9245aa" exitCode=143 Mar 08 19:52:35 crc kubenswrapper[4885]: I0308 19:52:35.295322 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68f1c367-d6c5-4e77-a051-f73fddfa1085","Type":"ContainerDied","Data":"25f0208e3538d6118cfee075db8e7f7b7dec77ce4a76f77fd3385ea0e30208a9"} Mar 08 19:52:35 crc kubenswrapper[4885]: I0308 19:52:35.295399 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68f1c367-d6c5-4e77-a051-f73fddfa1085","Type":"ContainerDied","Data":"f1e60f93f4471d460063b494acaa641e15b5d2ebdd2145e718bfce9a4a9245aa"} Mar 08 19:52:35 crc kubenswrapper[4885]: I0308 19:52:35.968115 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.053533 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-combined-ca-bundle\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.053600 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-httpd-run\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.053639 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-scripts\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.053802 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljzt7\" (UniqueName: \"kubernetes.io/projected/138479fb-965c-4e04-bd95-c7a683a5b0be-kube-api-access-ljzt7\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.054259 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-logs\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.054375 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-config-data\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.054405 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.054432 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-public-tls-certs\") pod \"138479fb-965c-4e04-bd95-c7a683a5b0be\" (UID: \"138479fb-965c-4e04-bd95-c7a683a5b0be\") " Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.054093 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.055152 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-logs" (OuterVolumeSpecName: "logs") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.055557 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.055574 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/138479fb-965c-4e04-bd95-c7a683a5b0be-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.060387 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.060529 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/138479fb-965c-4e04-bd95-c7a683a5b0be-kube-api-access-ljzt7" (OuterVolumeSpecName: "kube-api-access-ljzt7") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "kube-api-access-ljzt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.060583 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-scripts" (OuterVolumeSpecName: "scripts") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.087094 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.109958 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.111573 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-config-data" (OuterVolumeSpecName: "config-data") pod "138479fb-965c-4e04-bd95-c7a683a5b0be" (UID: "138479fb-965c-4e04-bd95-c7a683a5b0be"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.157943 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.157978 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.157988 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljzt7\" (UniqueName: \"kubernetes.io/projected/138479fb-965c-4e04-bd95-c7a683a5b0be-kube-api-access-ljzt7\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.158035 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.158062 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.158073 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/138479fb-965c-4e04-bd95-c7a683a5b0be-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.178960 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.260953 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.309818 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.315231 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"138479fb-965c-4e04-bd95-c7a683a5b0be","Type":"ContainerDied","Data":"7bb80c6ac30c275247f5237a85b2e40b7de54586f2af99cfba7b3c794012d9a1"} Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.315283 4885 scope.go:117] "RemoveContainer" containerID="523c04f0937be42a22fbca9b82de419e54c01c986998bce3144f8515d06aaa53" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.362267 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.372844 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.379415 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:52:36 crc kubenswrapper[4885]: E0308 19:52:36.379807 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-httpd" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.379820 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-httpd" Mar 08 19:52:36 crc kubenswrapper[4885]: E0308 19:52:36.379843 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="991f909f-207b-4663-adea-a4f8cd0c1cb6" containerName="init" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.379849 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="991f909f-207b-4663-adea-a4f8cd0c1cb6" containerName="init" Mar 08 19:52:36 crc kubenswrapper[4885]: E0308 19:52:36.379862 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b502bd3e-eafb-44cf-a81e-c10d647302a4" containerName="dnsmasq-dns" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.379869 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b502bd3e-eafb-44cf-a81e-c10d647302a4" containerName="dnsmasq-dns" Mar 08 19:52:36 crc kubenswrapper[4885]: E0308 19:52:36.379888 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b502bd3e-eafb-44cf-a81e-c10d647302a4" containerName="init" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.379912 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b502bd3e-eafb-44cf-a81e-c10d647302a4" containerName="init" Mar 08 19:52:36 crc kubenswrapper[4885]: E0308 19:52:36.379934 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-log" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.379940 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-log" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.380099 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-log" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.380115 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="991f909f-207b-4663-adea-a4f8cd0c1cb6" containerName="init" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.380131 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b502bd3e-eafb-44cf-a81e-c10d647302a4" 
containerName="dnsmasq-dns" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.380139 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" containerName="glance-httpd" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.381431 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.383440 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.386163 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.400109 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.578577 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.586047 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.586156 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-config-data\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.586235 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.586278 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-scripts\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.586380 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-logs\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.586492 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzlbm\" (UniqueName: 
\"kubernetes.io/projected/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-kube-api-access-hzlbm\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.586539 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688377 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzlbm\" (UniqueName: \"kubernetes.io/projected/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-kube-api-access-hzlbm\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688616 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688652 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688699 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688730 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-config-data\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688758 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688776 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-scripts\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.688802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-logs\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.689024 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.689095 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-logs\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.689742 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.697721 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-scripts\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.697799 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.707806 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-config-data\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.708379 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.717997 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzlbm\" (UniqueName: \"kubernetes.io/projected/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-kube-api-access-hzlbm\") pod \"glance-default-external-api-0\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:36 crc kubenswrapper[4885]: I0308 19:52:36.788833 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: 
\"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " pod="openstack/glance-default-external-api-0" Mar 08 19:52:37 crc kubenswrapper[4885]: I0308 19:52:37.005845 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:52:37 crc kubenswrapper[4885]: I0308 19:52:37.389717 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="138479fb-965c-4e04-bd95-c7a683a5b0be" path="/var/lib/kubelet/pods/138479fb-965c-4e04-bd95-c7a683a5b0be/volumes" Mar 08 19:52:39 crc kubenswrapper[4885]: I0308 19:52:39.270133 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:52:39 crc kubenswrapper[4885]: I0308 19:52:39.329453 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-2cszd"] Mar 08 19:52:39 crc kubenswrapper[4885]: I0308 19:52:39.329726 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="dnsmasq-dns" containerID="cri-o://701f88d27bc0253f9783e551e39a9c634999051422220378d318c698c63ae0f5" gracePeriod=10 Mar 08 19:52:39 crc kubenswrapper[4885]: I0308 19:52:39.535681 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Mar 08 19:52:40 crc kubenswrapper[4885]: I0308 19:52:40.370039 4885 generic.go:334] "Generic (PLEG): container finished" podID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerID="701f88d27bc0253f9783e551e39a9c634999051422220378d318c698c63ae0f5" exitCode=0 Mar 08 19:52:40 crc kubenswrapper[4885]: I0308 19:52:40.370281 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" event={"ID":"67aa348d-fe05-4e05-af01-a0b22d170a9b","Type":"ContainerDied","Data":"701f88d27bc0253f9783e551e39a9c634999051422220378d318c698c63ae0f5"} Mar 08 19:52:41 crc kubenswrapper[4885]: I0308 19:52:41.855661 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:41 crc kubenswrapper[4885]: I0308 19:52:41.864285 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:52:41 crc kubenswrapper[4885]: I0308 19:52:41.999431 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " Mar 08 19:52:41 crc kubenswrapper[4885]: I0308 19:52:41.999503 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-credential-keys\") pod \"6a7fcdff-b452-4cc7-becc-65a43827b50b\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " Mar 08 19:52:41 crc kubenswrapper[4885]: I0308 19:52:41.999530 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-fernet-keys\") pod \"6a7fcdff-b452-4cc7-becc-65a43827b50b\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " Mar 08 19:52:41 crc kubenswrapper[4885]: I0308 19:52:41.999572 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-logs\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " Mar 08 19:52:41 crc kubenswrapper[4885]: I0308 19:52:41.999936 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-logs" (OuterVolumeSpecName: "logs") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000328 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-config-data\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000368 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-httpd-run\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000389 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-scripts\") pod \"6a7fcdff-b452-4cc7-becc-65a43827b50b\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000686 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-combined-ca-bundle\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000697 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). 
InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000726 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq4fc\" (UniqueName: \"kubernetes.io/projected/68f1c367-d6c5-4e77-a051-f73fddfa1085-kube-api-access-kq4fc\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000758 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-combined-ca-bundle\") pod \"6a7fcdff-b452-4cc7-becc-65a43827b50b\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000813 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-internal-tls-certs\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000855 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-scripts\") pod \"68f1c367-d6c5-4e77-a051-f73fddfa1085\" (UID: \"68f1c367-d6c5-4e77-a051-f73fddfa1085\") " Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000951 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-config-data\") pod \"6a7fcdff-b452-4cc7-becc-65a43827b50b\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.000989 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8l7x\" (UniqueName: \"kubernetes.io/projected/6a7fcdff-b452-4cc7-becc-65a43827b50b-kube-api-access-c8l7x\") pod \"6a7fcdff-b452-4cc7-becc-65a43827b50b\" (UID: \"6a7fcdff-b452-4cc7-becc-65a43827b50b\") " Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.005483 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-scripts" (OuterVolumeSpecName: "scripts") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.006372 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.006390 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68f1c367-d6c5-4e77-a051-f73fddfa1085-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.006401 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.009120 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.009320 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-scripts" (OuterVolumeSpecName: "scripts") pod "6a7fcdff-b452-4cc7-becc-65a43827b50b" (UID: "6a7fcdff-b452-4cc7-becc-65a43827b50b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.009393 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6a7fcdff-b452-4cc7-becc-65a43827b50b" (UID: "6a7fcdff-b452-4cc7-becc-65a43827b50b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.010855 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6a7fcdff-b452-4cc7-becc-65a43827b50b" (UID: "6a7fcdff-b452-4cc7-becc-65a43827b50b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.011287 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a7fcdff-b452-4cc7-becc-65a43827b50b-kube-api-access-c8l7x" (OuterVolumeSpecName: "kube-api-access-c8l7x") pod "6a7fcdff-b452-4cc7-becc-65a43827b50b" (UID: "6a7fcdff-b452-4cc7-becc-65a43827b50b"). InnerVolumeSpecName "kube-api-access-c8l7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.015388 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f1c367-d6c5-4e77-a051-f73fddfa1085-kube-api-access-kq4fc" (OuterVolumeSpecName: "kube-api-access-kq4fc") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). InnerVolumeSpecName "kube-api-access-kq4fc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.030231 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.032214 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a7fcdff-b452-4cc7-becc-65a43827b50b" (UID: "6a7fcdff-b452-4cc7-becc-65a43827b50b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.032838 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-config-data" (OuterVolumeSpecName: "config-data") pod "6a7fcdff-b452-4cc7-becc-65a43827b50b" (UID: "6a7fcdff-b452-4cc7-becc-65a43827b50b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.063656 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-config-data" (OuterVolumeSpecName: "config-data") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.080484 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "68f1c367-d6c5-4e77-a051-f73fddfa1085" (UID: "68f1c367-d6c5-4e77-a051-f73fddfa1085"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.107872 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.107910 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c8l7x\" (UniqueName: \"kubernetes.io/projected/6a7fcdff-b452-4cc7-becc-65a43827b50b-kube-api-access-c8l7x\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.107968 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.107982 4885 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.107995 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.108006 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.108018 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.108030 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.108040 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq4fc\" (UniqueName: \"kubernetes.io/projected/68f1c367-d6c5-4e77-a051-f73fddfa1085-kube-api-access-kq4fc\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.108050 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7fcdff-b452-4cc7-becc-65a43827b50b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.108061 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/68f1c367-d6c5-4e77-a051-f73fddfa1085-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.128157 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.209114 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.400965 4885 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-bootstrap-n7295" event={"ID":"6a7fcdff-b452-4cc7-becc-65a43827b50b","Type":"ContainerDied","Data":"9cbc3dacc4f542eaf92c4ec2cfcb1c4cb3f323ffff727ffe3e01a0953b94c96a"} Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.401013 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cbc3dacc4f542eaf92c4ec2cfcb1c4cb3f323ffff727ffe3e01a0953b94c96a" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.401046 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-n7295" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.404285 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"68f1c367-d6c5-4e77-a051-f73fddfa1085","Type":"ContainerDied","Data":"34d894a56c80dc928bc7836837d51c9e6fa877a5201b35158a80f5ce7bf422e1"} Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.404467 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.454997 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.469193 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.506215 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:52:42 crc kubenswrapper[4885]: E0308 19:52:42.512303 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a7fcdff-b452-4cc7-becc-65a43827b50b" containerName="keystone-bootstrap" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.512521 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a7fcdff-b452-4cc7-becc-65a43827b50b" containerName="keystone-bootstrap" Mar 08 19:52:42 crc kubenswrapper[4885]: E0308 19:52:42.512599 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-log" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.512605 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-log" Mar 08 19:52:42 crc kubenswrapper[4885]: E0308 19:52:42.512619 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-httpd" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.512627 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-httpd" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.512937 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-log" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.512950 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" containerName="glance-httpd" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.512966 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a7fcdff-b452-4cc7-becc-65a43827b50b" containerName="keystone-bootstrap" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.513984 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.516004 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.517479 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.522912 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.615461 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkzrv\" (UniqueName: \"kubernetes.io/projected/405b5d21-a208-4f86-b046-66968c326aa4-kube-api-access-vkzrv\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.615517 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.615549 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.615599 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.615760 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.615863 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.616005 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.616127 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-logs\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.717826 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.717896 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.717950 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.717997 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.718039 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-logs\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.718089 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkzrv\" (UniqueName: \"kubernetes.io/projected/405b5d21-a208-4f86-b046-66968c326aa4-kube-api-access-vkzrv\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.718134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.718175 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.718721 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.719495 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.719609 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-logs\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.722825 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.722862 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.723232 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.728736 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.735945 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkzrv\" (UniqueName: \"kubernetes.io/projected/405b5d21-a208-4f86-b046-66968c326aa4-kube-api-access-vkzrv\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.749821 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.835477 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.942561 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-n7295"] Mar 08 19:52:42 crc kubenswrapper[4885]: I0308 19:52:42.949951 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-n7295"] Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.027993 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-j97wh"] Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.029271 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.031692 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nj2kw" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.031743 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.031890 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.032343 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.034375 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.038622 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j97wh"] Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.125631 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-config-data\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.125725 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-fernet-keys\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.125874 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-credential-keys\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.125982 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2r5x\" (UniqueName: \"kubernetes.io/projected/43dd77c8-6951-423a-9334-502f66c3d1b5-kube-api-access-b2r5x\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.126056 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-combined-ca-bundle\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.126117 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-scripts\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.228813 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-config-data\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.228894 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-fernet-keys\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.230842 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-credential-keys\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.231044 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2r5x\" (UniqueName: \"kubernetes.io/projected/43dd77c8-6951-423a-9334-502f66c3d1b5-kube-api-access-b2r5x\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.231107 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-combined-ca-bundle\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.232317 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-scripts\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.235519 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-credential-keys\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.235849 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-fernet-keys\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " 
pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.235854 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-scripts\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.236082 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-combined-ca-bundle\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.237050 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-config-data\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.254202 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2r5x\" (UniqueName: \"kubernetes.io/projected/43dd77c8-6951-423a-9334-502f66c3d1b5-kube-api-access-b2r5x\") pod \"keystone-bootstrap-j97wh\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.352614 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.379242 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68f1c367-d6c5-4e77-a051-f73fddfa1085" path="/var/lib/kubelet/pods/68f1c367-d6c5-4e77-a051-f73fddfa1085/volumes" Mar 08 19:52:43 crc kubenswrapper[4885]: I0308 19:52:43.380109 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a7fcdff-b452-4cc7-becc-65a43827b50b" path="/var/lib/kubelet/pods/6a7fcdff-b452-4cc7-becc-65a43827b50b/volumes" Mar 08 19:52:44 crc kubenswrapper[4885]: I0308 19:52:44.536005 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Mar 08 19:52:47 crc kubenswrapper[4885]: I0308 19:52:47.458210 4885 generic.go:334] "Generic (PLEG): container finished" podID="b04611e7-17b5-48ae-8169-534f684a101b" containerID="f13c64b3a8cac3c8bcb02e4b62a77799ac33e44598003cd3a842dd2a34fd0963" exitCode=0 Mar 08 19:52:47 crc kubenswrapper[4885]: I0308 19:52:47.458528 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kpvl2" event={"ID":"b04611e7-17b5-48ae-8169-534f684a101b","Type":"ContainerDied","Data":"f13c64b3a8cac3c8bcb02e4b62a77799ac33e44598003cd3a842dd2a34fd0963"} Mar 08 19:52:49 crc kubenswrapper[4885]: E0308 19:52:49.724905 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d" Mar 08 19:52:49 crc kubenswrapper[4885]: E0308 19:52:49.725500 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w6f64,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-fwjsc_openstack(46290bd2-6ad7-46f4-86f4-48aa73bc304a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:52:49 crc kubenswrapper[4885]: E0308 19:52:49.726772 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-fwjsc" podUID="46290bd2-6ad7-46f4-86f4-48aa73bc304a" Mar 08 19:52:49 crc kubenswrapper[4885]: I0308 19:52:49.747792 4885 scope.go:117] "RemoveContainer" containerID="911fa3cdbcf965e7428fcda4a5efd3eb7bd888422b953698cab04537d5e51708" Mar 08 19:52:49 crc kubenswrapper[4885]: I0308 19:52:49.840769 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:49 crc kubenswrapper[4885]: I0308 19:52:49.973866 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg4r7\" (UniqueName: \"kubernetes.io/projected/b04611e7-17b5-48ae-8169-534f684a101b-kube-api-access-dg4r7\") pod \"b04611e7-17b5-48ae-8169-534f684a101b\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " Mar 08 19:52:49 crc kubenswrapper[4885]: I0308 19:52:49.973901 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-combined-ca-bundle\") pod \"b04611e7-17b5-48ae-8169-534f684a101b\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " Mar 08 19:52:49 crc kubenswrapper[4885]: I0308 19:52:49.974050 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-config\") pod \"b04611e7-17b5-48ae-8169-534f684a101b\" (UID: \"b04611e7-17b5-48ae-8169-534f684a101b\") " Mar 08 19:52:49 crc kubenswrapper[4885]: I0308 19:52:49.991814 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b04611e7-17b5-48ae-8169-534f684a101b-kube-api-access-dg4r7" (OuterVolumeSpecName: "kube-api-access-dg4r7") pod "b04611e7-17b5-48ae-8169-534f684a101b" (UID: "b04611e7-17b5-48ae-8169-534f684a101b"). InnerVolumeSpecName "kube-api-access-dg4r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:49 crc kubenswrapper[4885]: I0308 19:52:49.995077 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b04611e7-17b5-48ae-8169-534f684a101b" (UID: "b04611e7-17b5-48ae-8169-534f684a101b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:50 crc kubenswrapper[4885]: I0308 19:52:50.015895 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-config" (OuterVolumeSpecName: "config") pod "b04611e7-17b5-48ae-8169-534f684a101b" (UID: "b04611e7-17b5-48ae-8169-534f684a101b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:50 crc kubenswrapper[4885]: I0308 19:52:50.077294 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:50 crc kubenswrapper[4885]: I0308 19:52:50.077350 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg4r7\" (UniqueName: \"kubernetes.io/projected/b04611e7-17b5-48ae-8169-534f684a101b-kube-api-access-dg4r7\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:50 crc kubenswrapper[4885]: I0308 19:52:50.077370 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b04611e7-17b5-48ae-8169-534f684a101b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:50 crc kubenswrapper[4885]: I0308 19:52:50.510681 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kpvl2" event={"ID":"b04611e7-17b5-48ae-8169-534f684a101b","Type":"ContainerDied","Data":"a46189fa57c904c31a1baeb88b028a78fe837ba28e4a7c2248f80625bc82f539"} Mar 08 19:52:50 crc kubenswrapper[4885]: I0308 19:52:50.510731 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a46189fa57c904c31a1baeb88b028a78fe837ba28e4a7c2248f80625bc82f539" Mar 08 19:52:50 crc kubenswrapper[4885]: I0308 19:52:50.510807 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kpvl2" Mar 08 19:52:50 crc kubenswrapper[4885]: E0308 19:52:50.517087 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:2c52c0f4b4baa15796eb284522adff7fa9e5c85a2d77c2e47ef4afdf8e4a7c7d\\\"\"" pod="openstack/barbican-db-sync-fwjsc" podUID="46290bd2-6ad7-46f4-86f4-48aa73bc304a" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.103131 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-gpkjd"] Mar 08 19:52:51 crc kubenswrapper[4885]: E0308 19:52:51.103781 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b04611e7-17b5-48ae-8169-534f684a101b" containerName="neutron-db-sync" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.103793 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b04611e7-17b5-48ae-8169-534f684a101b" containerName="neutron-db-sync" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.103996 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b04611e7-17b5-48ae-8169-534f684a101b" containerName="neutron-db-sync" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.104896 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.119634 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-gpkjd"] Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.197768 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.197833 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.197861 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-svc\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.197893 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.197951 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-config\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.198220 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkbcw\" (UniqueName: \"kubernetes.io/projected/3c32ee1c-ff69-4043-a31d-92be1d77a404-kube-api-access-mkbcw\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.198842 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6bbc5d6644-tztss"] Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.200390 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.205098 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-xgk7z" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.205262 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.205318 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.205490 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.227532 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bbc5d6644-tztss"] Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.299812 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-config\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.299926 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-httpd-config\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.299962 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkbcw\" (UniqueName: \"kubernetes.io/projected/3c32ee1c-ff69-4043-a31d-92be1d77a404-kube-api-access-mkbcw\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300142 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-combined-ca-bundle\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300374 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfv9t\" (UniqueName: \"kubernetes.io/projected/de714834-e155-41c1-83fc-a050203bde75-kube-api-access-xfv9t\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300535 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300572 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-svc\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: 
\"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300588 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300626 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-ovndb-tls-certs\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300665 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300700 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-config\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.300983 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-config\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.301499 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-nb\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.301653 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-swift-storage-0\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.301789 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-sb\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.302186 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-svc\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 
19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.321841 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkbcw\" (UniqueName: \"kubernetes.io/projected/3c32ee1c-ff69-4043-a31d-92be1d77a404-kube-api-access-mkbcw\") pod \"dnsmasq-dns-7859c7799c-gpkjd\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.402499 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-ovndb-tls-certs\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.402566 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-config\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.402674 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-httpd-config\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.402716 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-combined-ca-bundle\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.402770 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfv9t\" (UniqueName: \"kubernetes.io/projected/de714834-e155-41c1-83fc-a050203bde75-kube-api-access-xfv9t\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.406131 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-config\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.406411 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-httpd-config\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.406986 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-combined-ca-bundle\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.416212 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-ovndb-tls-certs\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.431733 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfv9t\" (UniqueName: \"kubernetes.io/projected/de714834-e155-41c1-83fc-a050203bde75-kube-api-access-xfv9t\") pod \"neutron-6bbc5d6644-tztss\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.433678 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.522101 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:51 crc kubenswrapper[4885]: E0308 19:52:51.852929 4885 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838" Mar 08 19:52:51 crc kubenswrapper[4885]: E0308 19:52:51.853407 4885 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wlqnv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:
nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-mp8c9_openstack(2efc22fd-a92b-422c-876d-7b80f06928b2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 08 19:52:51 crc kubenswrapper[4885]: E0308 19:52:51.854703 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-mp8c9" podUID="2efc22fd-a92b-422c-876d-7b80f06928b2" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.884074 4885 scope.go:117] "RemoveContainer" containerID="25f0208e3538d6118cfee075db8e7f7b7dec77ce4a76f77fd3385ea0e30208a9" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.926592 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:52:51 crc kubenswrapper[4885]: I0308 19:52:51.955192 4885 scope.go:117] "RemoveContainer" containerID="f1e60f93f4471d460063b494acaa641e15b5d2ebdd2145e718bfce9a4a9245aa" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.118154 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-config\") pod \"67aa348d-fe05-4e05-af01-a0b22d170a9b\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.118200 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-dns-svc\") pod \"67aa348d-fe05-4e05-af01-a0b22d170a9b\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.118230 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq487\" (UniqueName: \"kubernetes.io/projected/67aa348d-fe05-4e05-af01-a0b22d170a9b-kube-api-access-jq487\") pod \"67aa348d-fe05-4e05-af01-a0b22d170a9b\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.118342 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-sb\") pod \"67aa348d-fe05-4e05-af01-a0b22d170a9b\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.118366 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb\") pod \"67aa348d-fe05-4e05-af01-a0b22d170a9b\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.138020 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67aa348d-fe05-4e05-af01-a0b22d170a9b-kube-api-access-jq487" (OuterVolumeSpecName: "kube-api-access-jq487") pod "67aa348d-fe05-4e05-af01-a0b22d170a9b" (UID: "67aa348d-fe05-4e05-af01-a0b22d170a9b"). InnerVolumeSpecName "kube-api-access-jq487". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.215799 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "67aa348d-fe05-4e05-af01-a0b22d170a9b" (UID: "67aa348d-fe05-4e05-af01-a0b22d170a9b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.218667 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "67aa348d-fe05-4e05-af01-a0b22d170a9b" (UID: "67aa348d-fe05-4e05-af01-a0b22d170a9b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.223874 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb\") pod \"67aa348d-fe05-4e05-af01-a0b22d170a9b\" (UID: \"67aa348d-fe05-4e05-af01-a0b22d170a9b\") " Mar 08 19:52:52 crc kubenswrapper[4885]: W0308 19:52:52.224047 4885 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/67aa348d-fe05-4e05-af01-a0b22d170a9b/volumes/kubernetes.io~configmap/ovsdbserver-nb Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.224070 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "67aa348d-fe05-4e05-af01-a0b22d170a9b" (UID: "67aa348d-fe05-4e05-af01-a0b22d170a9b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.224355 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq487\" (UniqueName: \"kubernetes.io/projected/67aa348d-fe05-4e05-af01-a0b22d170a9b-kube-api-access-jq487\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.224387 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.224401 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.235839 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "67aa348d-fe05-4e05-af01-a0b22d170a9b" (UID: "67aa348d-fe05-4e05-af01-a0b22d170a9b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.243132 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-config" (OuterVolumeSpecName: "config") pod "67aa348d-fe05-4e05-af01-a0b22d170a9b" (UID: "67aa348d-fe05-4e05-af01-a0b22d170a9b"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.325976 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.326280 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67aa348d-fe05-4e05-af01-a0b22d170a9b-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:52 crc kubenswrapper[4885]: W0308 19:52:52.456359 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43dd77c8_6951_423a_9334_502f66c3d1b5.slice/crio-147f4ea542c662381182c0111772a39e3eb3ab16d6a183baad6fafb4224edc7b WatchSource:0}: Error finding container 147f4ea542c662381182c0111772a39e3eb3ab16d6a183baad6fafb4224edc7b: Status 404 returned error can't find the container with id 147f4ea542c662381182c0111772a39e3eb3ab16d6a183baad6fafb4224edc7b Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.461363 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.468270 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.476002 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j97wh"] Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.555877 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerStarted","Data":"e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f"} Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.566025 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zz9c5" event={"ID":"3dcb0d23-1927-4f70-ac45-bcc01f9a081a","Type":"ContainerStarted","Data":"302d122e7028362942b84bde8688589c00dd224f41e987890dd32bc866af958e"} Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.572763 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" event={"ID":"67aa348d-fe05-4e05-af01-a0b22d170a9b","Type":"ContainerDied","Data":"e012096eca7d05b70c67ac3ab0c256b21f51521655fd61aa44ededf7f87ad72c"} Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.572835 4885 scope.go:117] "RemoveContainer" containerID="701f88d27bc0253f9783e551e39a9c634999051422220378d318c698c63ae0f5" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.572787 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.582530 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c045d9d-f8a0-40b9-9600-0d10d5c699e7","Type":"ContainerStarted","Data":"4a7d8fc5b878c3d0d7e4f116b765cb88438382af5862067e0b9a50b02bb40fea"} Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.586262 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j97wh" event={"ID":"43dd77c8-6951-423a-9334-502f66c3d1b5","Type":"ContainerStarted","Data":"147f4ea542c662381182c0111772a39e3eb3ab16d6a183baad6fafb4224edc7b"} Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.614963 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:52:52 crc kubenswrapper[4885]: E0308 19:52:52.620699 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:44ed1ca84e17bd0f004cfbdc3c0827d767daba52abb8e83e076bfd0e6c02f838\\\"\"" pod="openstack/cinder-db-sync-mp8c9" podUID="2efc22fd-a92b-422c-876d-7b80f06928b2" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.625573 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-zz9c5" podStartSLOduration=4.837675388 podStartE2EDuration="24.625551328s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="2026-03-08 19:52:29.962998825 +0000 UTC m=+1251.359052858" lastFinishedPulling="2026-03-08 19:52:49.750874775 +0000 UTC m=+1271.146928798" observedRunningTime="2026-03-08 19:52:52.582650211 +0000 UTC m=+1273.978704234" watchObservedRunningTime="2026-03-08 19:52:52.625551328 +0000 UTC m=+1274.021605351" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.647758 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-2cszd"] Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.648182 4885 scope.go:117] "RemoveContainer" containerID="b8826682ae559379d101397fb94513059f3cfbd38258fd8e20b0bbd2e14276d1" Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.655401 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f7dd995-2cszd"] Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.662619 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-gpkjd"] Mar 08 19:52:52 crc kubenswrapper[4885]: I0308 19:52:52.673735 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6bbc5d6644-tztss"] Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.384474 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" path="/var/lib/kubelet/pods/67aa348d-fe05-4e05-af01-a0b22d170a9b/volumes" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.598099 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j97wh" event={"ID":"43dd77c8-6951-423a-9334-502f66c3d1b5","Type":"ContainerStarted","Data":"6bdf4492dc9ff59a23eeb3289e91f60d8a1697795948d983180a4ac75c5e122e"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.600105 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"405b5d21-a208-4f86-b046-66968c326aa4","Type":"ContainerStarted","Data":"bf43cbd05abc4859f6ceeadb70f7e22ee780318d3caa3829c75db5c4ff63615b"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.600146 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"405b5d21-a208-4f86-b046-66968c326aa4","Type":"ContainerStarted","Data":"0ea29bb519254f2f6d232c0a073b9e8199e006c424f0d72c1ebe3ec8f1381dff"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.602125 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbc5d6644-tztss" event={"ID":"de714834-e155-41c1-83fc-a050203bde75","Type":"ContainerStarted","Data":"47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.602161 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbc5d6644-tztss" event={"ID":"de714834-e155-41c1-83fc-a050203bde75","Type":"ContainerStarted","Data":"5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.602172 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbc5d6644-tztss" event={"ID":"de714834-e155-41c1-83fc-a050203bde75","Type":"ContainerStarted","Data":"ce2e04e0c937557f90cd3e8f47d07607bc0f9d0c6eb93b3ee180bc8da569c97b"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.602265 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.603708 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerID="695a219704dcca2bf986cd5937fa221f84e3701eae327a3dbb20ee1df55cf8bc" exitCode=0 Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.603752 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" event={"ID":"3c32ee1c-ff69-4043-a31d-92be1d77a404","Type":"ContainerDied","Data":"695a219704dcca2bf986cd5937fa221f84e3701eae327a3dbb20ee1df55cf8bc"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.603769 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" event={"ID":"3c32ee1c-ff69-4043-a31d-92be1d77a404","Type":"ContainerStarted","Data":"d542f54b493f5cfa9eb7867bd9658f59584a89f76c6986f57f0a54179d70ccd3"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.608104 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c045d9d-f8a0-40b9-9600-0d10d5c699e7","Type":"ContainerStarted","Data":"5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144"} Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.622006 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-j97wh" podStartSLOduration=10.621990936 podStartE2EDuration="10.621990936s" podCreationTimestamp="2026-03-08 19:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:53.615272426 +0000 UTC m=+1275.011326449" watchObservedRunningTime="2026-03-08 19:52:53.621990936 +0000 UTC m=+1275.018044959" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.639380 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6bbc5d6644-tztss" podStartSLOduration=2.63936361 
podStartE2EDuration="2.63936361s" podCreationTimestamp="2026-03-08 19:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:53.63409905 +0000 UTC m=+1275.030153063" watchObservedRunningTime="2026-03-08 19:52:53.63936361 +0000 UTC m=+1275.035417633" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.988902 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-56dd4b5ff7-j89qr"] Mar 08 19:52:53 crc kubenswrapper[4885]: E0308 19:52:53.989596 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="dnsmasq-dns" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.989608 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="dnsmasq-dns" Mar 08 19:52:53 crc kubenswrapper[4885]: E0308 19:52:53.989629 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="init" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.989638 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="init" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.989797 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="dnsmasq-dns" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.990625 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.996255 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 08 19:52:53 crc kubenswrapper[4885]: I0308 19:52:53.996447 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.009986 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56dd4b5ff7-j89qr"] Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.071715 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-internal-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.071769 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-ovndb-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.071808 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-httpd-config\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.071835 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-public-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.071859 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-676dz\" (UniqueName: \"kubernetes.io/projected/4de3b511-619d-4637-ac70-f7e555976c0e-kube-api-access-676dz\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.071880 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-combined-ca-bundle\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.071961 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-config\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.173394 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-config\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.173463 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-internal-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.173493 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-ovndb-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.173529 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-httpd-config\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.173560 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-public-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.173585 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-676dz\" (UniqueName: 
\"kubernetes.io/projected/4de3b511-619d-4637-ac70-f7e555976c0e-kube-api-access-676dz\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.173612 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-combined-ca-bundle\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.178467 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-config\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.178609 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-public-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.178978 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-combined-ca-bundle\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.179506 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-internal-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.180642 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-ovndb-tls-certs\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.185254 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-httpd-config\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.192765 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-676dz\" (UniqueName: \"kubernetes.io/projected/4de3b511-619d-4637-ac70-f7e555976c0e-kube-api-access-676dz\") pod \"neutron-56dd4b5ff7-j89qr\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.371255 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.535873 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-675f7dd995-2cszd" podUID="67aa348d-fe05-4e05-af01-a0b22d170a9b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.628780 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c045d9d-f8a0-40b9-9600-0d10d5c699e7","Type":"ContainerStarted","Data":"bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2"} Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.637708 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"405b5d21-a208-4f86-b046-66968c326aa4","Type":"ContainerStarted","Data":"76f21ebe5e2a1ec7083c3221c48c316f2bf16958272fe800dd874d9d19aa99dd"} Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.640894 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerStarted","Data":"b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2"} Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.644817 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" event={"ID":"3c32ee1c-ff69-4043-a31d-92be1d77a404","Type":"ContainerStarted","Data":"054cb1ab59c45cfe6a5f78ad6e0db5b7c13aecc33b053086164f7b3a9057b5c9"} Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.645619 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.648803 4885 generic.go:334] "Generic (PLEG): container finished" podID="3dcb0d23-1927-4f70-ac45-bcc01f9a081a" containerID="302d122e7028362942b84bde8688589c00dd224f41e987890dd32bc866af958e" exitCode=0 Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.648992 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zz9c5" event={"ID":"3dcb0d23-1927-4f70-ac45-bcc01f9a081a","Type":"ContainerDied","Data":"302d122e7028362942b84bde8688589c00dd224f41e987890dd32bc866af958e"} Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.662700 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=18.662669957 podStartE2EDuration="18.662669957s" podCreationTimestamp="2026-03-08 19:52:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:54.652798672 +0000 UTC m=+1276.048852695" watchObservedRunningTime="2026-03-08 19:52:54.662669957 +0000 UTC m=+1276.058724020" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.694605 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=12.694587681 podStartE2EDuration="12.694587681s" podCreationTimestamp="2026-03-08 19:52:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:54.689264388 +0000 UTC m=+1276.085318411" watchObservedRunningTime="2026-03-08 19:52:54.694587681 +0000 UTC m=+1276.090641704" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 
19:52:54.713120 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" podStartSLOduration=3.713099216 podStartE2EDuration="3.713099216s" podCreationTimestamp="2026-03-08 19:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:54.705563285 +0000 UTC m=+1276.101617308" watchObservedRunningTime="2026-03-08 19:52:54.713099216 +0000 UTC m=+1276.109153239" Mar 08 19:52:54 crc kubenswrapper[4885]: I0308 19:52:54.915521 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-56dd4b5ff7-j89qr"] Mar 08 19:52:55 crc kubenswrapper[4885]: I0308 19:52:55.452766 4885 scope.go:117] "RemoveContainer" containerID="dd28461ac62623fc6ad7ac5f483ad81428e5d2b1b26c821a328a6729559f6fbb" Mar 08 19:52:55 crc kubenswrapper[4885]: I0308 19:52:55.668231 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd4b5ff7-j89qr" event={"ID":"4de3b511-619d-4637-ac70-f7e555976c0e","Type":"ContainerStarted","Data":"0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c"} Mar 08 19:52:55 crc kubenswrapper[4885]: I0308 19:52:55.670378 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd4b5ff7-j89qr" event={"ID":"4de3b511-619d-4637-ac70-f7e555976c0e","Type":"ContainerStarted","Data":"f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304"} Mar 08 19:52:55 crc kubenswrapper[4885]: I0308 19:52:55.670397 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd4b5ff7-j89qr" event={"ID":"4de3b511-619d-4637-ac70-f7e555976c0e","Type":"ContainerStarted","Data":"276c039d660964960e48e559165b8b647e2ab8ab57d7b59f8062379f583a0dc6"} Mar 08 19:52:55 crc kubenswrapper[4885]: I0308 19:52:55.670696 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:52:55 crc kubenswrapper[4885]: I0308 19:52:55.707307 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-56dd4b5ff7-j89qr" podStartSLOduration=2.707289623 podStartE2EDuration="2.707289623s" podCreationTimestamp="2026-03-08 19:52:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:52:55.697899522 +0000 UTC m=+1277.093953545" watchObservedRunningTime="2026-03-08 19:52:55.707289623 +0000 UTC m=+1277.103343646" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.090058 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.111623 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-combined-ca-bundle\") pod \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.111728 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-scripts\") pod \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.111751 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-config-data\") pod \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.111770 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-logs\") pod \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.111829 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhz2b\" (UniqueName: \"kubernetes.io/projected/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-kube-api-access-mhz2b\") pod \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\" (UID: \"3dcb0d23-1927-4f70-ac45-bcc01f9a081a\") " Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.113271 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-logs" (OuterVolumeSpecName: "logs") pod "3dcb0d23-1927-4f70-ac45-bcc01f9a081a" (UID: "3dcb0d23-1927-4f70-ac45-bcc01f9a081a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.120228 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-kube-api-access-mhz2b" (OuterVolumeSpecName: "kube-api-access-mhz2b") pod "3dcb0d23-1927-4f70-ac45-bcc01f9a081a" (UID: "3dcb0d23-1927-4f70-ac45-bcc01f9a081a"). InnerVolumeSpecName "kube-api-access-mhz2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.128374 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-scripts" (OuterVolumeSpecName: "scripts") pod "3dcb0d23-1927-4f70-ac45-bcc01f9a081a" (UID: "3dcb0d23-1927-4f70-ac45-bcc01f9a081a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.144587 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-config-data" (OuterVolumeSpecName: "config-data") pod "3dcb0d23-1927-4f70-ac45-bcc01f9a081a" (UID: "3dcb0d23-1927-4f70-ac45-bcc01f9a081a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.172933 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3dcb0d23-1927-4f70-ac45-bcc01f9a081a" (UID: "3dcb0d23-1927-4f70-ac45-bcc01f9a081a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.213221 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhz2b\" (UniqueName: \"kubernetes.io/projected/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-kube-api-access-mhz2b\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.213258 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.213274 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.213285 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.213299 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dcb0d23-1927-4f70-ac45-bcc01f9a081a-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.681061 4885 generic.go:334] "Generic (PLEG): container finished" podID="43dd77c8-6951-423a-9334-502f66c3d1b5" containerID="6bdf4492dc9ff59a23eeb3289e91f60d8a1697795948d983180a4ac75c5e122e" exitCode=0 Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.682418 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j97wh" event={"ID":"43dd77c8-6951-423a-9334-502f66c3d1b5","Type":"ContainerDied","Data":"6bdf4492dc9ff59a23eeb3289e91f60d8a1697795948d983180a4ac75c5e122e"} Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.687963 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-zz9c5" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.687981 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-zz9c5" event={"ID":"3dcb0d23-1927-4f70-ac45-bcc01f9a081a","Type":"ContainerDied","Data":"25f4ab49981792217a76b7fdf7f3c26a7cebd2e01dffa1cebd3dfdfec45e537a"} Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.688233 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25f4ab49981792217a76b7fdf7f3c26a7cebd2e01dffa1cebd3dfdfec45e537a" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.764121 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6b5685698-p87pb"] Mar 08 19:52:56 crc kubenswrapper[4885]: E0308 19:52:56.764555 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dcb0d23-1927-4f70-ac45-bcc01f9a081a" containerName="placement-db-sync" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.764579 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dcb0d23-1927-4f70-ac45-bcc01f9a081a" containerName="placement-db-sync" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.764809 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3dcb0d23-1927-4f70-ac45-bcc01f9a081a" containerName="placement-db-sync" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.766530 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.768160 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.768390 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.772150 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.772279 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-kg22q" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.778007 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b5685698-p87pb"] Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.779669 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.834974 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-public-tls-certs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.835047 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-scripts\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.835079 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-combined-ca-bundle\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.835103 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-internal-tls-certs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.835146 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq6bs\" (UniqueName: \"kubernetes.io/projected/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-kube-api-access-dq6bs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.835220 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-config-data\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.835277 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-logs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.936726 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq6bs\" (UniqueName: \"kubernetes.io/projected/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-kube-api-access-dq6bs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.936815 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-config-data\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.936861 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-logs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.936949 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-public-tls-certs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.936975 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-scripts\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.936997 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-combined-ca-bundle\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.937014 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-internal-tls-certs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.937948 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-logs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.941008 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-internal-tls-certs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.941707 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-public-tls-certs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.943227 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-scripts\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.943642 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-config-data\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.955986 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq6bs\" (UniqueName: \"kubernetes.io/projected/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-kube-api-access-dq6bs\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:56 crc kubenswrapper[4885]: I0308 19:52:56.956193 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-combined-ca-bundle\") pod \"placement-6b5685698-p87pb\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " 
pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.007105 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.007152 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.041059 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.082946 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.093945 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b5685698-p87pb" Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.602373 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b5685698-p87pb"] Mar 08 19:52:57 crc kubenswrapper[4885]: W0308 19:52:57.633245 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30962695_4bc8_4fd2_b6e4_5b4b1f9d75a1.slice/crio-728e2c4227d2c2ab14fe2c194baeb854bdfe4cac966691c25a22f562f3e2f82b WatchSource:0}: Error finding container 728e2c4227d2c2ab14fe2c194baeb854bdfe4cac966691c25a22f562f3e2f82b: Status 404 returned error can't find the container with id 728e2c4227d2c2ab14fe2c194baeb854bdfe4cac966691c25a22f562f3e2f82b Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.723983 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5685698-p87pb" event={"ID":"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1","Type":"ContainerStarted","Data":"728e2c4227d2c2ab14fe2c194baeb854bdfe4cac966691c25a22f562f3e2f82b"} Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.724218 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 19:52:57 crc kubenswrapper[4885]: I0308 19:52:57.724756 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 19:52:58 crc kubenswrapper[4885]: I0308 19:52:58.733131 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5685698-p87pb" event={"ID":"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1","Type":"ContainerStarted","Data":"284122f7c1790bf6573097e7966743c78c5e2016a00bf0c85b35b46fb79ec3ac"} Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.077406 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.249153 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-combined-ca-bundle\") pod \"43dd77c8-6951-423a-9334-502f66c3d1b5\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.249220 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-fernet-keys\") pod \"43dd77c8-6951-423a-9334-502f66c3d1b5\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.249262 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2r5x\" (UniqueName: \"kubernetes.io/projected/43dd77c8-6951-423a-9334-502f66c3d1b5-kube-api-access-b2r5x\") pod \"43dd77c8-6951-423a-9334-502f66c3d1b5\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.249446 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-scripts\") pod \"43dd77c8-6951-423a-9334-502f66c3d1b5\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.249844 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-credential-keys\") pod \"43dd77c8-6951-423a-9334-502f66c3d1b5\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.250372 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-config-data\") pod \"43dd77c8-6951-423a-9334-502f66c3d1b5\" (UID: \"43dd77c8-6951-423a-9334-502f66c3d1b5\") " Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.255555 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "43dd77c8-6951-423a-9334-502f66c3d1b5" (UID: "43dd77c8-6951-423a-9334-502f66c3d1b5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.255663 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-scripts" (OuterVolumeSpecName: "scripts") pod "43dd77c8-6951-423a-9334-502f66c3d1b5" (UID: "43dd77c8-6951-423a-9334-502f66c3d1b5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.267054 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "43dd77c8-6951-423a-9334-502f66c3d1b5" (UID: "43dd77c8-6951-423a-9334-502f66c3d1b5"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.274189 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43dd77c8-6951-423a-9334-502f66c3d1b5-kube-api-access-b2r5x" (OuterVolumeSpecName: "kube-api-access-b2r5x") pod "43dd77c8-6951-423a-9334-502f66c3d1b5" (UID: "43dd77c8-6951-423a-9334-502f66c3d1b5"). InnerVolumeSpecName "kube-api-access-b2r5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.291804 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-config-data" (OuterVolumeSpecName: "config-data") pod "43dd77c8-6951-423a-9334-502f66c3d1b5" (UID: "43dd77c8-6951-423a-9334-502f66c3d1b5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.305089 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43dd77c8-6951-423a-9334-502f66c3d1b5" (UID: "43dd77c8-6951-423a-9334-502f66c3d1b5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.352397 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.352441 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.352455 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2r5x\" (UniqueName: \"kubernetes.io/projected/43dd77c8-6951-423a-9334-502f66c3d1b5-kube-api-access-b2r5x\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.352467 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.352478 4885 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.352489 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43dd77c8-6951-423a-9334-502f66c3d1b5-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.518284 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.573410 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.753689 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j97wh" 
event={"ID":"43dd77c8-6951-423a-9334-502f66c3d1b5","Type":"ContainerDied","Data":"147f4ea542c662381182c0111772a39e3eb3ab16d6a183baad6fafb4224edc7b"} Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.753730 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="147f4ea542c662381182c0111772a39e3eb3ab16d6a183baad6fafb4224edc7b" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.753735 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j97wh" Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.755853 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerStarted","Data":"37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b"} Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.758227 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5685698-p87pb" event={"ID":"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1","Type":"ContainerStarted","Data":"4b8fc7ef37c75cd3a7a88b7cc2d7710779d5554badc6896eed06c03d3625b81a"} Mar 08 19:53:00 crc kubenswrapper[4885]: I0308 19:53:00.784244 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6b5685698-p87pb" podStartSLOduration=4.784211343 podStartE2EDuration="4.784211343s" podCreationTimestamp="2026-03-08 19:52:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:00.774292648 +0000 UTC m=+1282.170346671" watchObservedRunningTime="2026-03-08 19:53:00.784211343 +0000 UTC m=+1282.180265356" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.193690 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-574d5c476f-sq4hm"] Mar 08 19:53:01 crc kubenswrapper[4885]: E0308 19:53:01.194042 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43dd77c8-6951-423a-9334-502f66c3d1b5" containerName="keystone-bootstrap" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.194055 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="43dd77c8-6951-423a-9334-502f66c3d1b5" containerName="keystone-bootstrap" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.194297 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="43dd77c8-6951-423a-9334-502f66c3d1b5" containerName="keystone-bootstrap" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.194852 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.198014 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.198182 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.198439 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.198556 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.198667 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.198878 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nj2kw" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.215292 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-574d5c476f-sq4hm"] Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.370874 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-fernet-keys\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.370930 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q2mg\" (UniqueName: \"kubernetes.io/projected/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-kube-api-access-2q2mg\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.370979 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-public-tls-certs\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.371000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-config-data\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.371030 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-combined-ca-bundle\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.371078 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-internal-tls-certs\") pod \"keystone-574d5c476f-sq4hm\" 
(UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.371116 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-scripts\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.371137 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-credential-keys\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.435074 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.472896 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-scripts\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.472987 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-credential-keys\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.473502 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-fernet-keys\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.473653 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q2mg\" (UniqueName: \"kubernetes.io/projected/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-kube-api-access-2q2mg\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.473740 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-public-tls-certs\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.473767 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-config-data\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.473831 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-combined-ca-bundle\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.473858 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-internal-tls-certs\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.489373 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-fernet-keys\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.490265 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-combined-ca-bundle\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.491485 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-credential-keys\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.491679 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-internal-tls-certs\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.492282 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-config-data\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.492824 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-scripts\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.503511 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-gscv2"] Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.503731 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" podUID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerName="dnsmasq-dns" containerID="cri-o://7cc6d42eb38cb1c77825df2d6223ceee7a5e25787b09d1e978bf98a5ce1ee38e" gracePeriod=10 Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.508598 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q2mg\" (UniqueName: 
\"kubernetes.io/projected/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-kube-api-access-2q2mg\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.530054 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-public-tls-certs\") pod \"keystone-574d5c476f-sq4hm\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.782857 4885 generic.go:334] "Generic (PLEG): container finished" podID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerID="7cc6d42eb38cb1c77825df2d6223ceee7a5e25787b09d1e978bf98a5ce1ee38e" exitCode=0 Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.783696 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" event={"ID":"4e1d0a39-d199-43d0-bdea-33c0ecfff06f","Type":"ContainerDied","Data":"7cc6d42eb38cb1c77825df2d6223ceee7a5e25787b09d1e978bf98a5ce1ee38e"} Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.783723 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b5685698-p87pb" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.784064 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b5685698-p87pb" Mar 08 19:53:01 crc kubenswrapper[4885]: I0308 19:53:01.810765 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.004562 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.187054 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m92h4\" (UniqueName: \"kubernetes.io/projected/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-kube-api-access-m92h4\") pod \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.187180 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-svc\") pod \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.187216 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-nb\") pod \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.187295 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-config\") pod \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.187317 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-swift-storage-0\") pod \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.187347 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-sb\") pod \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\" (UID: \"4e1d0a39-d199-43d0-bdea-33c0ecfff06f\") " Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.199153 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-kube-api-access-m92h4" (OuterVolumeSpecName: "kube-api-access-m92h4") pod "4e1d0a39-d199-43d0-bdea-33c0ecfff06f" (UID: "4e1d0a39-d199-43d0-bdea-33c0ecfff06f"). InnerVolumeSpecName "kube-api-access-m92h4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.227483 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e1d0a39-d199-43d0-bdea-33c0ecfff06f" (UID: "4e1d0a39-d199-43d0-bdea-33c0ecfff06f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.233580 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-config" (OuterVolumeSpecName: "config") pod "4e1d0a39-d199-43d0-bdea-33c0ecfff06f" (UID: "4e1d0a39-d199-43d0-bdea-33c0ecfff06f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.233673 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e1d0a39-d199-43d0-bdea-33c0ecfff06f" (UID: "4e1d0a39-d199-43d0-bdea-33c0ecfff06f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.241298 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e1d0a39-d199-43d0-bdea-33c0ecfff06f" (UID: "4e1d0a39-d199-43d0-bdea-33c0ecfff06f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.242388 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4e1d0a39-d199-43d0-bdea-33c0ecfff06f" (UID: "4e1d0a39-d199-43d0-bdea-33c0ecfff06f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.289566 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.289596 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.289606 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.289615 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m92h4\" (UniqueName: \"kubernetes.io/projected/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-kube-api-access-m92h4\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.289623 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.289631 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1d0a39-d199-43d0-bdea-33c0ecfff06f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.331523 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-574d5c476f-sq4hm"] Mar 08 19:53:02 crc kubenswrapper[4885]: W0308 19:53:02.333167 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a28c270_c9ef_4b8c_a8e7_bcc69a1419cc.slice/crio-bb3823d25975ffc500551905492324189bd2643724049b62ce6f52a7469d0c61 WatchSource:0}: Error finding container 
bb3823d25975ffc500551905492324189bd2643724049b62ce6f52a7469d0c61: Status 404 returned error can't find the container with id bb3823d25975ffc500551905492324189bd2643724049b62ce6f52a7469d0c61 Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.794509 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-574d5c476f-sq4hm" event={"ID":"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc","Type":"ContainerStarted","Data":"0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421"} Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.794785 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-574d5c476f-sq4hm" event={"ID":"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc","Type":"ContainerStarted","Data":"bb3823d25975ffc500551905492324189bd2643724049b62ce6f52a7469d0c61"} Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.795384 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.804858 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.804902 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ccd7c9f8f-gscv2" event={"ID":"4e1d0a39-d199-43d0-bdea-33c0ecfff06f","Type":"ContainerDied","Data":"3afeb0eb326591c1d8125d846debd099f1fede59cbbe242947add0ce27db5e6c"} Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.804958 4885 scope.go:117] "RemoveContainer" containerID="7cc6d42eb38cb1c77825df2d6223ceee7a5e25787b09d1e978bf98a5ce1ee38e" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.817728 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.817781 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.822492 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-574d5c476f-sq4hm" podStartSLOduration=1.822474712 podStartE2EDuration="1.822474712s" podCreationTimestamp="2026-03-08 19:53:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:02.814073087 +0000 UTC m=+1284.210127120" watchObservedRunningTime="2026-03-08 19:53:02.822474712 +0000 UTC m=+1284.218528735" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.839574 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.839615 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.846416 4885 scope.go:117] "RemoveContainer" containerID="1640371a030e2f7def6f8a64bd3eab03cae3c318d2de3e683f5f0df21e52c94a" Mar 08 19:53:02 crc 
kubenswrapper[4885]: I0308 19:53:02.846643 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-gscv2"] Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.853889 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ccd7c9f8f-gscv2"] Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.882259 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:02 crc kubenswrapper[4885]: I0308 19:53:02.889400 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:03 crc kubenswrapper[4885]: I0308 19:53:03.409373 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" path="/var/lib/kubelet/pods/4e1d0a39-d199-43d0-bdea-33c0ecfff06f/volumes" Mar 08 19:53:03 crc kubenswrapper[4885]: I0308 19:53:03.821197 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:03 crc kubenswrapper[4885]: I0308 19:53:03.821561 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:03 crc kubenswrapper[4885]: I0308 19:53:03.940260 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b5685698-p87pb" Mar 08 19:53:04 crc kubenswrapper[4885]: I0308 19:53:04.841404 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fwjsc" event={"ID":"46290bd2-6ad7-46f4-86f4-48aa73bc304a","Type":"ContainerStarted","Data":"155731b1565c2836cebbf6fadafab50001c261430bf9d84221bbe681fb56634d"} Mar 08 19:53:04 crc kubenswrapper[4885]: I0308 19:53:04.844027 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mp8c9" event={"ID":"2efc22fd-a92b-422c-876d-7b80f06928b2","Type":"ContainerStarted","Data":"43662ed70d9fce30619b2928a293996c741d8618375e00a25c69cc3ec2f8804c"} Mar 08 19:53:04 crc kubenswrapper[4885]: I0308 19:53:04.861430 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-fwjsc" podStartSLOduration=3.224275969 podStartE2EDuration="36.861409829s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="2026-03-08 19:52:30.206775219 +0000 UTC m=+1251.602829242" lastFinishedPulling="2026-03-08 19:53:03.843909069 +0000 UTC m=+1285.239963102" observedRunningTime="2026-03-08 19:53:04.858728967 +0000 UTC m=+1286.254783010" watchObservedRunningTime="2026-03-08 19:53:04.861409829 +0000 UTC m=+1286.257463862" Mar 08 19:53:04 crc kubenswrapper[4885]: I0308 19:53:04.879698 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-mp8c9" podStartSLOduration=2.891526095 podStartE2EDuration="36.879674168s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="2026-03-08 19:52:29.816070173 +0000 UTC m=+1251.212124196" lastFinishedPulling="2026-03-08 19:53:03.804218246 +0000 UTC m=+1285.200272269" observedRunningTime="2026-03-08 19:53:04.874682934 +0000 UTC m=+1286.270736977" watchObservedRunningTime="2026-03-08 19:53:04.879674168 +0000 UTC m=+1286.275728191" Mar 08 19:53:05 crc kubenswrapper[4885]: I0308 19:53:05.861195 4885 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 19:53:05 crc kubenswrapper[4885]: I0308 19:53:05.861453 4885 prober_manager.go:312] "Failed 
to trigger a manual run" probe="Readiness" Mar 08 19:53:06 crc kubenswrapper[4885]: I0308 19:53:06.022254 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:06 crc kubenswrapper[4885]: I0308 19:53:06.023523 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 19:53:06 crc kubenswrapper[4885]: I0308 19:53:06.873319 4885 generic.go:334] "Generic (PLEG): container finished" podID="46290bd2-6ad7-46f4-86f4-48aa73bc304a" containerID="155731b1565c2836cebbf6fadafab50001c261430bf9d84221bbe681fb56634d" exitCode=0 Mar 08 19:53:06 crc kubenswrapper[4885]: I0308 19:53:06.873414 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fwjsc" event={"ID":"46290bd2-6ad7-46f4-86f4-48aa73bc304a","Type":"ContainerDied","Data":"155731b1565c2836cebbf6fadafab50001c261430bf9d84221bbe681fb56634d"} Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.655590 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.704858 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6f64\" (UniqueName: \"kubernetes.io/projected/46290bd2-6ad7-46f4-86f4-48aa73bc304a-kube-api-access-w6f64\") pod \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.704946 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-db-sync-config-data\") pod \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.705046 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-combined-ca-bundle\") pod \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\" (UID: \"46290bd2-6ad7-46f4-86f4-48aa73bc304a\") " Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.714033 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "46290bd2-6ad7-46f4-86f4-48aa73bc304a" (UID: "46290bd2-6ad7-46f4-86f4-48aa73bc304a"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.714091 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46290bd2-6ad7-46f4-86f4-48aa73bc304a-kube-api-access-w6f64" (OuterVolumeSpecName: "kube-api-access-w6f64") pod "46290bd2-6ad7-46f4-86f4-48aa73bc304a" (UID: "46290bd2-6ad7-46f4-86f4-48aa73bc304a"). InnerVolumeSpecName "kube-api-access-w6f64". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.734857 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46290bd2-6ad7-46f4-86f4-48aa73bc304a" (UID: "46290bd2-6ad7-46f4-86f4-48aa73bc304a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.807868 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6f64\" (UniqueName: \"kubernetes.io/projected/46290bd2-6ad7-46f4-86f4-48aa73bc304a-kube-api-access-w6f64\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.807907 4885 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.807936 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46290bd2-6ad7-46f4-86f4-48aa73bc304a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.929387 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-fwjsc" event={"ID":"46290bd2-6ad7-46f4-86f4-48aa73bc304a","Type":"ContainerDied","Data":"e55e97244abd013b52340deec77649890bb853264595abb22df0c82e67349268"} Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.929437 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e55e97244abd013b52340deec77649890bb853264595abb22df0c82e67349268" Mar 08 19:53:08 crc kubenswrapper[4885]: I0308 19:53:08.929500 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-fwjsc" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.059049 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5b88496c9d-2g95h"] Mar 08 19:53:09 crc kubenswrapper[4885]: E0308 19:53:09.059946 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46290bd2-6ad7-46f4-86f4-48aa73bc304a" containerName="barbican-db-sync" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.059960 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="46290bd2-6ad7-46f4-86f4-48aa73bc304a" containerName="barbican-db-sync" Mar 08 19:53:09 crc kubenswrapper[4885]: E0308 19:53:09.059997 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerName="dnsmasq-dns" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.060004 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerName="dnsmasq-dns" Mar 08 19:53:09 crc kubenswrapper[4885]: E0308 19:53:09.060029 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerName="init" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.060036 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerName="init" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.060368 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1d0a39-d199-43d0-bdea-33c0ecfff06f" containerName="dnsmasq-dns" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.060389 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="46290bd2-6ad7-46f4-86f4-48aa73bc304a" containerName="barbican-db-sync" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.063389 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.070485 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.070980 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.070195 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-zrht4" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.080094 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7dfc6b7fcc-dpq7t"] Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.112818 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.115819 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-combined-ca-bundle\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.115867 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data-custom\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.115903 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7268474-e124-4139-bf24-6b3f605b9511-logs\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.115954 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.115983 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td4rc\" (UniqueName: \"kubernetes.io/projected/a7268474-e124-4139-bf24-6b3f605b9511-kube-api-access-td4rc\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.119489 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.128030 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7dfc6b7fcc-dpq7t"] Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.140016 
4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b88496c9d-2g95h"] Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.176335 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-qf59z"] Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.178446 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.183960 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-qf59z"] Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217346 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217386 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217418 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td4rc\" (UniqueName: \"kubernetes.io/projected/a7268474-e124-4139-bf24-6b3f605b9511-kube-api-access-td4rc\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217442 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-config\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217462 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfdxd\" (UniqueName: \"kubernetes.io/projected/2a083cf5-4ca2-440c-840a-6b159151609f-kube-api-access-gfdxd\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217502 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217516 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 
19:53:09.217529 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a083cf5-4ca2-440c-840a-6b159151609f-logs\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217553 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-combined-ca-bundle\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217571 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpmg8\" (UniqueName: \"kubernetes.io/projected/42432d22-20ca-464e-be0b-e881c9ef89a7-kube-api-access-mpmg8\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217599 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data-custom\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217627 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-combined-ca-bundle\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217650 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data-custom\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217675 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217697 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.217717 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7268474-e124-4139-bf24-6b3f605b9511-logs\") pod 
\"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.218107 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7268474-e124-4139-bf24-6b3f605b9511-logs\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.225036 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-combined-ca-bundle\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.225563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data-custom\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.226573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.239405 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td4rc\" (UniqueName: \"kubernetes.io/projected/a7268474-e124-4139-bf24-6b3f605b9511-kube-api-access-td4rc\") pod \"barbican-keystone-listener-5b88496c9d-2g95h\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320700 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320742 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320763 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a083cf5-4ca2-440c-840a-6b159151609f-logs\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-combined-ca-bundle\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320829 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpmg8\" (UniqueName: \"kubernetes.io/projected/42432d22-20ca-464e-be0b-e881c9ef89a7-kube-api-access-mpmg8\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320873 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data-custom\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320961 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.320988 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.321054 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.321103 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-config\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.321131 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfdxd\" (UniqueName: \"kubernetes.io/projected/2a083cf5-4ca2-440c-840a-6b159151609f-kube-api-access-gfdxd\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.330895 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-svc\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.330963 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-swift-storage-0\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.334192 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data-custom\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.351309 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a083cf5-4ca2-440c-840a-6b159151609f-logs\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.351527 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-config\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.351685 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-sb\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.351691 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-nb\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.353580 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-86998568fb-9gsxz"] Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.355162 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.355374 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.358537 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.370996 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfdxd\" (UniqueName: \"kubernetes.io/projected/2a083cf5-4ca2-440c-840a-6b159151609f-kube-api-access-gfdxd\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.372407 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpmg8\" (UniqueName: \"kubernetes.io/projected/42432d22-20ca-464e-be0b-e881c9ef89a7-kube-api-access-mpmg8\") pod \"dnsmasq-dns-8449d68f4f-qf59z\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.375477 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-combined-ca-bundle\") pod \"barbican-worker-7dfc6b7fcc-dpq7t\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.390818 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86998568fb-9gsxz"] Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.423033 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-combined-ca-bundle\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.423113 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data-custom\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.423152 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nclf\" (UniqueName: \"kubernetes.io/projected/5d5033f1-b303-4891-875f-8f9bcb7585c0-kube-api-access-5nclf\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.423195 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d5033f1-b303-4891-875f-8f9bcb7585c0-logs\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.423225 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data\") pod \"barbican-api-86998568fb-9gsxz\" (UID: 
\"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.442435 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.461582 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.520401 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.524759 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-combined-ca-bundle\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.524830 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data-custom\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.524873 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nclf\" (UniqueName: \"kubernetes.io/projected/5d5033f1-b303-4891-875f-8f9bcb7585c0-kube-api-access-5nclf\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.524977 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d5033f1-b303-4891-875f-8f9bcb7585c0-logs\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.525022 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.565300 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d5033f1-b303-4891-875f-8f9bcb7585c0-logs\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.568619 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-combined-ca-bundle\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.568988 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data-custom\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.570763 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nclf\" (UniqueName: \"kubernetes.io/projected/5d5033f1-b303-4891-875f-8f9bcb7585c0-kube-api-access-5nclf\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.572632 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data\") pod \"barbican-api-86998568fb-9gsxz\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.682525 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.943503 4885 generic.go:334] "Generic (PLEG): container finished" podID="2efc22fd-a92b-422c-876d-7b80f06928b2" containerID="43662ed70d9fce30619b2928a293996c741d8618375e00a25c69cc3ec2f8804c" exitCode=0 Mar 08 19:53:09 crc kubenswrapper[4885]: I0308 19:53:09.943542 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mp8c9" event={"ID":"2efc22fd-a92b-422c-876d-7b80f06928b2","Type":"ContainerDied","Data":"43662ed70d9fce30619b2928a293996c741d8618375e00a25c69cc3ec2f8804c"} Mar 08 19:53:10 crc kubenswrapper[4885]: I0308 19:53:10.956525 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerStarted","Data":"777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d"} Mar 08 19:53:10 crc kubenswrapper[4885]: I0308 19:53:10.956632 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-central-agent" containerID="cri-o://e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f" gracePeriod=30 Mar 08 19:53:10 crc kubenswrapper[4885]: I0308 19:53:10.957121 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="proxy-httpd" containerID="cri-o://777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d" gracePeriod=30 Mar 08 19:53:10 crc kubenswrapper[4885]: I0308 19:53:10.957145 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-notification-agent" containerID="cri-o://b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2" gracePeriod=30 Mar 08 19:53:10 crc kubenswrapper[4885]: I0308 19:53:10.957199 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="sg-core" containerID="cri-o://37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b" gracePeriod=30 Mar 08 19:53:10 crc kubenswrapper[4885]: I0308 19:53:10.987722 4885 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.739569477 podStartE2EDuration="42.987707053s" podCreationTimestamp="2026-03-08 19:52:28 +0000 UTC" firstStartedPulling="2026-03-08 19:52:30.375314509 +0000 UTC m=+1251.771368532" lastFinishedPulling="2026-03-08 19:53:10.623452075 +0000 UTC m=+1292.019506108" observedRunningTime="2026-03-08 19:53:10.986687396 +0000 UTC m=+1292.382741419" watchObservedRunningTime="2026-03-08 19:53:10.987707053 +0000 UTC m=+1292.383761076" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.094846 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5b88496c9d-2g95h"] Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.137740 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-qf59z"] Mar 08 19:53:11 crc kubenswrapper[4885]: W0308 19:53:11.156877 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42432d22_20ca_464e_be0b_e881c9ef89a7.slice/crio-eb4101a356a90503cf71164dfd919b65584b1da7619bca006da1e2b73f5cb2e2 WatchSource:0}: Error finding container eb4101a356a90503cf71164dfd919b65584b1da7619bca006da1e2b73f5cb2e2: Status 404 returned error can't find the container with id eb4101a356a90503cf71164dfd919b65584b1da7619bca006da1e2b73f5cb2e2 Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.202827 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7dfc6b7fcc-dpq7t"] Mar 08 19:53:11 crc kubenswrapper[4885]: W0308 19:53:11.207436 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a083cf5_4ca2_440c_840a_6b159151609f.slice/crio-7e0b4a7b5579c233c2e49f4395b1b83ad7591cab769b791a33fa19e09b808340 WatchSource:0}: Error finding container 7e0b4a7b5579c233c2e49f4395b1b83ad7591cab769b791a33fa19e09b808340: Status 404 returned error can't find the container with id 7e0b4a7b5579c233c2e49f4395b1b83ad7591cab769b791a33fa19e09b808340 Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.258721 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.322736 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86998568fb-9gsxz"] Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.463532 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlqnv\" (UniqueName: \"kubernetes.io/projected/2efc22fd-a92b-422c-876d-7b80f06928b2-kube-api-access-wlqnv\") pod \"2efc22fd-a92b-422c-876d-7b80f06928b2\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.463644 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2efc22fd-a92b-422c-876d-7b80f06928b2-etc-machine-id\") pod \"2efc22fd-a92b-422c-876d-7b80f06928b2\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.463678 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-db-sync-config-data\") pod \"2efc22fd-a92b-422c-876d-7b80f06928b2\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.463703 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-scripts\") pod \"2efc22fd-a92b-422c-876d-7b80f06928b2\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.463763 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2efc22fd-a92b-422c-876d-7b80f06928b2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2efc22fd-a92b-422c-876d-7b80f06928b2" (UID: "2efc22fd-a92b-422c-876d-7b80f06928b2"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.463940 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-config-data\") pod \"2efc22fd-a92b-422c-876d-7b80f06928b2\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.464029 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-combined-ca-bundle\") pod \"2efc22fd-a92b-422c-876d-7b80f06928b2\" (UID: \"2efc22fd-a92b-422c-876d-7b80f06928b2\") " Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.464700 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2efc22fd-a92b-422c-876d-7b80f06928b2-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.467808 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-scripts" (OuterVolumeSpecName: "scripts") pod "2efc22fd-a92b-422c-876d-7b80f06928b2" (UID: "2efc22fd-a92b-422c-876d-7b80f06928b2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.469230 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2efc22fd-a92b-422c-876d-7b80f06928b2-kube-api-access-wlqnv" (OuterVolumeSpecName: "kube-api-access-wlqnv") pod "2efc22fd-a92b-422c-876d-7b80f06928b2" (UID: "2efc22fd-a92b-422c-876d-7b80f06928b2"). InnerVolumeSpecName "kube-api-access-wlqnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.473205 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "2efc22fd-a92b-422c-876d-7b80f06928b2" (UID: "2efc22fd-a92b-422c-876d-7b80f06928b2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.490257 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2efc22fd-a92b-422c-876d-7b80f06928b2" (UID: "2efc22fd-a92b-422c-876d-7b80f06928b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.518500 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-config-data" (OuterVolumeSpecName: "config-data") pod "2efc22fd-a92b-422c-876d-7b80f06928b2" (UID: "2efc22fd-a92b-422c-876d-7b80f06928b2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.566187 4885 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.566225 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.566238 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.566248 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efc22fd-a92b-422c-876d-7b80f06928b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.566259 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlqnv\" (UniqueName: \"kubernetes.io/projected/2efc22fd-a92b-422c-876d-7b80f06928b2-kube-api-access-wlqnv\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.968382 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86998568fb-9gsxz" event={"ID":"5d5033f1-b303-4891-875f-8f9bcb7585c0","Type":"ContainerStarted","Data":"49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f"} Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.968419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86998568fb-9gsxz" event={"ID":"5d5033f1-b303-4891-875f-8f9bcb7585c0","Type":"ContainerStarted","Data":"aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd"} Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.968428 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86998568fb-9gsxz" event={"ID":"5d5033f1-b303-4891-875f-8f9bcb7585c0","Type":"ContainerStarted","Data":"004645ddace08c8f81341f8f25e364cdb5c96da8e77e37b02915c3180098f6f0"} Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.970240 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.970470 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.979500 4885 generic.go:334] "Generic (PLEG): container finished" podID="42432d22-20ca-464e-be0b-e881c9ef89a7" containerID="a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf" exitCode=0 Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.979580 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" event={"ID":"42432d22-20ca-464e-be0b-e881c9ef89a7","Type":"ContainerDied","Data":"a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf"} Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.979603 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" 
event={"ID":"42432d22-20ca-464e-be0b-e881c9ef89a7","Type":"ContainerStarted","Data":"eb4101a356a90503cf71164dfd919b65584b1da7619bca006da1e2b73f5cb2e2"} Mar 08 19:53:11 crc kubenswrapper[4885]: I0308 19:53:11.984210 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" event={"ID":"2a083cf5-4ca2-440c-840a-6b159151609f","Type":"ContainerStarted","Data":"7e0b4a7b5579c233c2e49f4395b1b83ad7591cab769b791a33fa19e09b808340"} Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.005293 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mp8c9" event={"ID":"2efc22fd-a92b-422c-876d-7b80f06928b2","Type":"ContainerDied","Data":"d6ed11f43445c4cfcc1c38c863e1b91670c997a17a500e0e306eca10269bbe60"} Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.005339 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6ed11f43445c4cfcc1c38c863e1b91670c997a17a500e0e306eca10269bbe60" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.005428 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mp8c9" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.011325 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" event={"ID":"a7268474-e124-4139-bf24-6b3f605b9511","Type":"ContainerStarted","Data":"7d126567b856b73925e9e50b783a515a23fdff84d4ca27fd2089e38d86b58980"} Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.014531 4885 generic.go:334] "Generic (PLEG): container finished" podID="624830da-2b73-4843-bb04-6db9c1a7b281" containerID="37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b" exitCode=2 Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.014561 4885 generic.go:334] "Generic (PLEG): container finished" podID="624830da-2b73-4843-bb04-6db9c1a7b281" containerID="e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f" exitCode=0 Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.014580 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerDied","Data":"37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b"} Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.014601 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerDied","Data":"e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f"} Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.015496 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-86998568fb-9gsxz" podStartSLOduration=3.015470899 podStartE2EDuration="3.015470899s" podCreationTimestamp="2026-03-08 19:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:12.001953527 +0000 UTC m=+1293.398007550" watchObservedRunningTime="2026-03-08 19:53:12.015470899 +0000 UTC m=+1293.411524922" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.210005 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:53:12 crc kubenswrapper[4885]: E0308 19:53:12.210351 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efc22fd-a92b-422c-876d-7b80f06928b2" containerName="cinder-db-sync" Mar 08 19:53:12 crc 
kubenswrapper[4885]: I0308 19:53:12.210367 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efc22fd-a92b-422c-876d-7b80f06928b2" containerName="cinder-db-sync" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.210551 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2efc22fd-a92b-422c-876d-7b80f06928b2" containerName="cinder-db-sync" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.211439 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.224379 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-x9f48" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.224499 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.224637 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.224674 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.229170 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.296394 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-qf59z"] Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.328595 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"] Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.330157 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.342726 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"] Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.388329 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.388405 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.388511 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-scripts\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.388740 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.388846 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.388978 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zwcw\" (UniqueName: \"kubernetes.io/projected/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-kube-api-access-2zwcw\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.424588 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.426569 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.428356 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.448400 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492006 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492184 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-config\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492286 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-scripts\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492359 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r6vd\" (UniqueName: \"kubernetes.io/projected/d087e374-bcc9-4a44-8fbe-aee43a47115e-kube-api-access-9r6vd\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492463 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492538 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492624 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492701 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492793 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zwcw\" (UniqueName: \"kubernetes.io/projected/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-kube-api-access-2zwcw\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492889 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.492980 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.493059 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.496189 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.498342 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-scripts\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.498708 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.500866 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.501948 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.513294 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zwcw\" (UniqueName: \"kubernetes.io/projected/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-kube-api-access-2zwcw\") pod 
\"cinder-scheduler-0\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.533629 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595175 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595223 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd91d80d-b465-47bc-ab15-cc9281dbb198-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595243 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd91d80d-b465-47bc-ab15-cc9281dbb198-logs\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595265 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58gz4\" (UniqueName: \"kubernetes.io/projected/bd91d80d-b465-47bc-ab15-cc9281dbb198-kube-api-access-58gz4\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595394 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595681 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595727 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595757 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595821 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-config\") pod 
\"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595860 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data-custom\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595887 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-scripts\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.595908 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r6vd\" (UniqueName: \"kubernetes.io/projected/d087e374-bcc9-4a44-8fbe-aee43a47115e-kube-api-access-9r6vd\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.596026 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.596535 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-svc\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.597166 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-nb\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.597422 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-sb\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.597684 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-swift-storage-0\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.601586 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-config\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" 
Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.613378 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r6vd\" (UniqueName: \"kubernetes.io/projected/d087e374-bcc9-4a44-8fbe-aee43a47115e-kube-api-access-9r6vd\") pod \"dnsmasq-dns-7b8fcc65cc-lpg8x\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.662720 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.697746 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.697813 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data-custom\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.697843 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-scripts\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.697906 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd91d80d-b465-47bc-ab15-cc9281dbb198-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.697941 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd91d80d-b465-47bc-ab15-cc9281dbb198-logs\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.697961 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58gz4\" (UniqueName: \"kubernetes.io/projected/bd91d80d-b465-47bc-ab15-cc9281dbb198-kube-api-access-58gz4\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.697988 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.698278 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd91d80d-b465-47bc-ab15-cc9281dbb198-etc-machine-id\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.698638 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/bd91d80d-b465-47bc-ab15-cc9281dbb198-logs\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.701682 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.707366 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data-custom\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.707612 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-scripts\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.707855 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.713768 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58gz4\" (UniqueName: \"kubernetes.io/projected/bd91d80d-b465-47bc-ab15-cc9281dbb198-kube-api-access-58gz4\") pod \"cinder-api-0\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " pod="openstack/cinder-api-0" Mar 08 19:53:12 crc kubenswrapper[4885]: I0308 19:53:12.742837 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.081275 4885 generic.go:334] "Generic (PLEG): container finished" podID="624830da-2b73-4843-bb04-6db9c1a7b281" containerID="b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2" exitCode=0 Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.081603 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerDied","Data":"b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2"} Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.095895 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" event={"ID":"42432d22-20ca-464e-be0b-e881c9ef89a7","Type":"ContainerStarted","Data":"c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6"} Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.096317 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" podUID="42432d22-20ca-464e-be0b-e881c9ef89a7" containerName="dnsmasq-dns" containerID="cri-o://c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6" gracePeriod=10 Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.096422 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.121860 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" podStartSLOduration=4.121845248 podStartE2EDuration="4.121845248s" podCreationTimestamp="2026-03-08 19:53:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:13.111763918 +0000 UTC m=+1294.507817941" watchObservedRunningTime="2026-03-08 19:53:13.121845248 +0000 UTC m=+1294.517899261" Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.837418 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.930217 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-nb\") pod \"42432d22-20ca-464e-be0b-e881c9ef89a7\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.930486 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-swift-storage-0\") pod \"42432d22-20ca-464e-be0b-e881c9ef89a7\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.930513 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-sb\") pod \"42432d22-20ca-464e-be0b-e881c9ef89a7\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.930716 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpmg8\" (UniqueName: \"kubernetes.io/projected/42432d22-20ca-464e-be0b-e881c9ef89a7-kube-api-access-mpmg8\") pod \"42432d22-20ca-464e-be0b-e881c9ef89a7\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.930784 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-svc\") pod \"42432d22-20ca-464e-be0b-e881c9ef89a7\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.930861 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-config\") pod \"42432d22-20ca-464e-be0b-e881c9ef89a7\" (UID: \"42432d22-20ca-464e-be0b-e881c9ef89a7\") " Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.938505 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42432d22-20ca-464e-be0b-e881c9ef89a7-kube-api-access-mpmg8" (OuterVolumeSpecName: "kube-api-access-mpmg8") pod "42432d22-20ca-464e-be0b-e881c9ef89a7" (UID: "42432d22-20ca-464e-be0b-e881c9ef89a7"). InnerVolumeSpecName "kube-api-access-mpmg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:13 crc kubenswrapper[4885]: I0308 19:53:13.996869 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "42432d22-20ca-464e-be0b-e881c9ef89a7" (UID: "42432d22-20ca-464e-be0b-e881c9ef89a7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.006356 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "42432d22-20ca-464e-be0b-e881c9ef89a7" (UID: "42432d22-20ca-464e-be0b-e881c9ef89a7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.024594 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "42432d22-20ca-464e-be0b-e881c9ef89a7" (UID: "42432d22-20ca-464e-be0b-e881c9ef89a7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.026501 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-config" (OuterVolumeSpecName: "config") pod "42432d22-20ca-464e-be0b-e881c9ef89a7" (UID: "42432d22-20ca-464e-be0b-e881c9ef89a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.033122 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.033150 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.033161 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.033174 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpmg8\" (UniqueName: \"kubernetes.io/projected/42432d22-20ca-464e-be0b-e881c9ef89a7-kube-api-access-mpmg8\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.033184 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.035021 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:53:14 crc kubenswrapper[4885]: W0308 19:53:14.038818 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ac3e4ad_92f1_4c79_bc47_5e9707b376bf.slice/crio-b4e02e9dba863f76b95bf3b4c26ddf5112fe936b13a4f102d174a8450de9c4d1 WatchSource:0}: Error finding container b4e02e9dba863f76b95bf3b4c26ddf5112fe936b13a4f102d174a8450de9c4d1: Status 404 returned error can't find the container with id b4e02e9dba863f76b95bf3b4c26ddf5112fe936b13a4f102d174a8450de9c4d1 Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.039889 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "42432d22-20ca-464e-be0b-e881c9ef89a7" (UID: "42432d22-20ca-464e-be0b-e881c9ef89a7"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.105025 4885 generic.go:334] "Generic (PLEG): container finished" podID="42432d22-20ca-464e-be0b-e881c9ef89a7" containerID="c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6" exitCode=0 Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.105089 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" event={"ID":"42432d22-20ca-464e-be0b-e881c9ef89a7","Type":"ContainerDied","Data":"c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6"} Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.105091 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.105116 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8449d68f4f-qf59z" event={"ID":"42432d22-20ca-464e-be0b-e881c9ef89a7","Type":"ContainerDied","Data":"eb4101a356a90503cf71164dfd919b65584b1da7619bca006da1e2b73f5cb2e2"} Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.105132 4885 scope.go:117] "RemoveContainer" containerID="c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.107723 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" event={"ID":"2a083cf5-4ca2-440c-840a-6b159151609f","Type":"ContainerStarted","Data":"d5cd5c3527dc17515d5a33bed3c5118e0fcbd6d15187bcfb409883f29afc80a6"} Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.111490 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" event={"ID":"a7268474-e124-4139-bf24-6b3f605b9511","Type":"ContainerStarted","Data":"3a20cf21bbfb4da8c71131e4075d64b83bae96d5c5020bc3cfadcf8d7226f8bc"} Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.113019 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf","Type":"ContainerStarted","Data":"b4e02e9dba863f76b95bf3b4c26ddf5112fe936b13a4f102d174a8450de9c4d1"} Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.134836 4885 scope.go:117] "RemoveContainer" containerID="a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.136266 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42432d22-20ca-464e-be0b-e881c9ef89a7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.155979 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.159314 4885 scope.go:117] "RemoveContainer" containerID="c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6" Mar 08 19:53:14 crc kubenswrapper[4885]: E0308 19:53:14.161184 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6\": container with ID starting with c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6 not found: ID does not exist" containerID="c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 
19:53:14.161235 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6"} err="failed to get container status \"c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6\": rpc error: code = NotFound desc = could not find container \"c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6\": container with ID starting with c1399a68c0f9c091e2f3f66e0283deb94b9895d40d54fbd30c95d809cded19f6 not found: ID does not exist" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.161271 4885 scope.go:117] "RemoveContainer" containerID="a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf" Mar 08 19:53:14 crc kubenswrapper[4885]: E0308 19:53:14.161673 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf\": container with ID starting with a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf not found: ID does not exist" containerID="a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf" Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.161700 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf"} err="failed to get container status \"a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf\": rpc error: code = NotFound desc = could not find container \"a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf\": container with ID starting with a47310515e5cd1e5c84a70976413338e90dbd1eb75be38805b3b108e19950fbf not found: ID does not exist" Mar 08 19:53:14 crc kubenswrapper[4885]: W0308 19:53:14.163381 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd91d80d_b465_47bc_ab15_cc9281dbb198.slice/crio-6ccedb26b689eacc5264cde7ce6de809146d5707c5200202b877132d6cb530ed WatchSource:0}: Error finding container 6ccedb26b689eacc5264cde7ce6de809146d5707c5200202b877132d6cb530ed: Status 404 returned error can't find the container with id 6ccedb26b689eacc5264cde7ce6de809146d5707c5200202b877132d6cb530ed Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.164640 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-qf59z"] Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.179123 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8449d68f4f-qf59z"] Mar 08 19:53:14 crc kubenswrapper[4885]: I0308 19:53:14.202576 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"] Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.123293 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" event={"ID":"a7268474-e124-4139-bf24-6b3f605b9511","Type":"ContainerStarted","Data":"6e7be97046549290741b9a7850306bb8d9be298e24617283ccb5d04dda12497f"} Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.125142 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bd91d80d-b465-47bc-ab15-cc9281dbb198","Type":"ContainerStarted","Data":"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9"} Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.125185 4885 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cinder-api-0" event={"ID":"bd91d80d-b465-47bc-ab15-cc9281dbb198","Type":"ContainerStarted","Data":"6ccedb26b689eacc5264cde7ce6de809146d5707c5200202b877132d6cb530ed"} Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.128194 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" event={"ID":"2a083cf5-4ca2-440c-840a-6b159151609f","Type":"ContainerStarted","Data":"b8d53aa1399bba98dc12433735d0a8b3cb69b3036f3c8fb648dbc900fdb658b2"} Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.130606 4885 generic.go:334] "Generic (PLEG): container finished" podID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerID="41bd07d83b5b4958e58a7473f1f938d73689ec0cd631180b50c3f160c3251d1c" exitCode=0 Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.130660 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" event={"ID":"d087e374-bcc9-4a44-8fbe-aee43a47115e","Type":"ContainerDied","Data":"41bd07d83b5b4958e58a7473f1f938d73689ec0cd631180b50c3f160c3251d1c"} Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.130718 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" event={"ID":"d087e374-bcc9-4a44-8fbe-aee43a47115e","Type":"ContainerStarted","Data":"4547972efa3892226729dccf70e00d854a9c1e79c44132cd7c28be08c974628a"} Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.154814 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" podStartSLOduration=3.775041356 podStartE2EDuration="6.154795324s" podCreationTimestamp="2026-03-08 19:53:09 +0000 UTC" firstStartedPulling="2026-03-08 19:53:11.167135365 +0000 UTC m=+1292.563189388" lastFinishedPulling="2026-03-08 19:53:13.546889323 +0000 UTC m=+1294.942943356" observedRunningTime="2026-03-08 19:53:15.147276684 +0000 UTC m=+1296.543330727" watchObservedRunningTime="2026-03-08 19:53:15.154795324 +0000 UTC m=+1296.550849347" Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.198116 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" podStartSLOduration=3.874428516 podStartE2EDuration="6.198100873s" podCreationTimestamp="2026-03-08 19:53:09 +0000 UTC" firstStartedPulling="2026-03-08 19:53:11.218618193 +0000 UTC m=+1292.614672216" lastFinishedPulling="2026-03-08 19:53:13.54229054 +0000 UTC m=+1294.938344573" observedRunningTime="2026-03-08 19:53:15.193253764 +0000 UTC m=+1296.589307787" watchObservedRunningTime="2026-03-08 19:53:15.198100873 +0000 UTC m=+1296.594154896" Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.396237 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42432d22-20ca-464e-be0b-e881c9ef89a7" path="/var/lib/kubelet/pods/42432d22-20ca-464e-be0b-e881c9ef89a7/volumes" Mar 08 19:53:15 crc kubenswrapper[4885]: I0308 19:53:15.831158 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.148090 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" event={"ID":"d087e374-bcc9-4a44-8fbe-aee43a47115e","Type":"ContainerStarted","Data":"88950a0ff1c85394753db02c54eed22b3bf6bb84a7de33f37b2a9f7d03adb063"} Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.148544 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" 
Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.180143 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" podStartSLOduration=4.180127915 podStartE2EDuration="4.180127915s" podCreationTimestamp="2026-03-08 19:53:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:16.177085844 +0000 UTC m=+1297.573139877" watchObservedRunningTime="2026-03-08 19:53:16.180127915 +0000 UTC m=+1297.576181928" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.469878 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-796cf584f6-dfmcm"] Mar 08 19:53:16 crc kubenswrapper[4885]: E0308 19:53:16.470253 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42432d22-20ca-464e-be0b-e881c9ef89a7" containerName="init" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.470271 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="42432d22-20ca-464e-be0b-e881c9ef89a7" containerName="init" Mar 08 19:53:16 crc kubenswrapper[4885]: E0308 19:53:16.470293 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42432d22-20ca-464e-be0b-e881c9ef89a7" containerName="dnsmasq-dns" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.470300 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="42432d22-20ca-464e-be0b-e881c9ef89a7" containerName="dnsmasq-dns" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.470476 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="42432d22-20ca-464e-be0b-e881c9ef89a7" containerName="dnsmasq-dns" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.472743 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.476108 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.476279 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.478844 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-796cf584f6-dfmcm"] Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.485777 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.485878 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data-custom\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.485971 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35e55887-f8af-4c57-820d-c46d0ee9cd9f-logs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.587651 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-combined-ca-bundle\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.587693 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-internal-tls-certs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.587733 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-public-tls-certs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.587829 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.587852 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-26jxd\" (UniqueName: \"kubernetes.io/projected/35e55887-f8af-4c57-820d-c46d0ee9cd9f-kube-api-access-26jxd\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.587875 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data-custom\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.587900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35e55887-f8af-4c57-820d-c46d0ee9cd9f-logs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.588609 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35e55887-f8af-4c57-820d-c46d0ee9cd9f-logs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.592487 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.593499 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data-custom\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.689034 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26jxd\" (UniqueName: \"kubernetes.io/projected/35e55887-f8af-4c57-820d-c46d0ee9cd9f-kube-api-access-26jxd\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.689431 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-combined-ca-bundle\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.689469 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-internal-tls-certs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.689533 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-public-tls-certs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.693215 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-internal-tls-certs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.693321 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-public-tls-certs\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.693781 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-combined-ca-bundle\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.717608 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26jxd\" (UniqueName: \"kubernetes.io/projected/35e55887-f8af-4c57-820d-c46d0ee9cd9f-kube-api-access-26jxd\") pod \"barbican-api-796cf584f6-dfmcm\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:16 crc kubenswrapper[4885]: I0308 19:53:16.791519 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.158472 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf","Type":"ContainerStarted","Data":"aedf83af60a05d52fadd9f2d996eac1d5ae1cee15c263accbc4acefecf4b6ed5"} Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.166380 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api-log" containerID="cri-o://d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9" gracePeriod=30 Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.166812 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bd91d80d-b465-47bc-ab15-cc9281dbb198","Type":"ContainerStarted","Data":"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d"} Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.166863 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.167271 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api" containerID="cri-o://3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d" gracePeriod=30 Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.191967 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.191946192 podStartE2EDuration="5.191946192s" podCreationTimestamp="2026-03-08 19:53:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:17.190580146 +0000 UTC m=+1298.586634179" watchObservedRunningTime="2026-03-08 19:53:17.191946192 +0000 UTC m=+1298.588000225" Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.332110 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-796cf584f6-dfmcm"] Mar 08 19:53:17 crc kubenswrapper[4885]: I0308 19:53:17.829637 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.012530 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd91d80d-b465-47bc-ab15-cc9281dbb198-logs\") pod \"bd91d80d-b465-47bc-ab15-cc9281dbb198\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.012595 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data\") pod \"bd91d80d-b465-47bc-ab15-cc9281dbb198\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.012655 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd91d80d-b465-47bc-ab15-cc9281dbb198-etc-machine-id\") pod \"bd91d80d-b465-47bc-ab15-cc9281dbb198\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.012794 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data-custom\") pod \"bd91d80d-b465-47bc-ab15-cc9281dbb198\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.012820 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58gz4\" (UniqueName: \"kubernetes.io/projected/bd91d80d-b465-47bc-ab15-cc9281dbb198-kube-api-access-58gz4\") pod \"bd91d80d-b465-47bc-ab15-cc9281dbb198\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.012842 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-scripts\") pod \"bd91d80d-b465-47bc-ab15-cc9281dbb198\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.012871 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-combined-ca-bundle\") pod \"bd91d80d-b465-47bc-ab15-cc9281dbb198\" (UID: \"bd91d80d-b465-47bc-ab15-cc9281dbb198\") " Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.013661 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bd91d80d-b465-47bc-ab15-cc9281dbb198-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bd91d80d-b465-47bc-ab15-cc9281dbb198" (UID: "bd91d80d-b465-47bc-ab15-cc9281dbb198"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.014010 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd91d80d-b465-47bc-ab15-cc9281dbb198-logs" (OuterVolumeSpecName: "logs") pod "bd91d80d-b465-47bc-ab15-cc9281dbb198" (UID: "bd91d80d-b465-47bc-ab15-cc9281dbb198"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.018767 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-scripts" (OuterVolumeSpecName: "scripts") pod "bd91d80d-b465-47bc-ab15-cc9281dbb198" (UID: "bd91d80d-b465-47bc-ab15-cc9281dbb198"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.018875 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bd91d80d-b465-47bc-ab15-cc9281dbb198" (UID: "bd91d80d-b465-47bc-ab15-cc9281dbb198"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.037328 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd91d80d-b465-47bc-ab15-cc9281dbb198-kube-api-access-58gz4" (OuterVolumeSpecName: "kube-api-access-58gz4") pod "bd91d80d-b465-47bc-ab15-cc9281dbb198" (UID: "bd91d80d-b465-47bc-ab15-cc9281dbb198"). InnerVolumeSpecName "kube-api-access-58gz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.041371 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd91d80d-b465-47bc-ab15-cc9281dbb198" (UID: "bd91d80d-b465-47bc-ab15-cc9281dbb198"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.061720 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data" (OuterVolumeSpecName: "config-data") pod "bd91d80d-b465-47bc-ab15-cc9281dbb198" (UID: "bd91d80d-b465-47bc-ab15-cc9281dbb198"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.114583 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.114611 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58gz4\" (UniqueName: \"kubernetes.io/projected/bd91d80d-b465-47bc-ab15-cc9281dbb198-kube-api-access-58gz4\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.114620 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.114630 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.114639 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd91d80d-b465-47bc-ab15-cc9281dbb198-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.114648 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd91d80d-b465-47bc-ab15-cc9281dbb198-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.114657 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bd91d80d-b465-47bc-ab15-cc9281dbb198-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.208085 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf","Type":"ContainerStarted","Data":"dffb5abae31631b6e6c2de1c5ddd900437c42ccc0290d24bea54ec81cbab75b2"} Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.215908 4885 generic.go:334] "Generic (PLEG): container finished" podID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerID="3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d" exitCode=0 Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.215965 4885 generic.go:334] "Generic (PLEG): container finished" podID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerID="d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9" exitCode=143 Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.215982 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.216013 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bd91d80d-b465-47bc-ab15-cc9281dbb198","Type":"ContainerDied","Data":"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d"} Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.216042 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bd91d80d-b465-47bc-ab15-cc9281dbb198","Type":"ContainerDied","Data":"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9"} Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.216052 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"bd91d80d-b465-47bc-ab15-cc9281dbb198","Type":"ContainerDied","Data":"6ccedb26b689eacc5264cde7ce6de809146d5707c5200202b877132d6cb530ed"} Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.216074 4885 scope.go:117] "RemoveContainer" containerID="3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.221763 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-796cf584f6-dfmcm" event={"ID":"35e55887-f8af-4c57-820d-c46d0ee9cd9f","Type":"ContainerStarted","Data":"786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44"} Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.221866 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-796cf584f6-dfmcm" event={"ID":"35e55887-f8af-4c57-820d-c46d0ee9cd9f","Type":"ContainerStarted","Data":"d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b"} Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.221970 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-796cf584f6-dfmcm" event={"ID":"35e55887-f8af-4c57-820d-c46d0ee9cd9f","Type":"ContainerStarted","Data":"9666e26b13c4933935ee0abcb40c76da8cace1d3e077db5278af8135676f6e1f"} Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.222061 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.222218 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.241451 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.630458131 podStartE2EDuration="6.241433117s" podCreationTimestamp="2026-03-08 19:53:12 +0000 UTC" firstStartedPulling="2026-03-08 19:53:14.041281844 +0000 UTC m=+1295.437335867" lastFinishedPulling="2026-03-08 19:53:15.65225683 +0000 UTC m=+1297.048310853" observedRunningTime="2026-03-08 19:53:18.226197209 +0000 UTC m=+1299.622251232" watchObservedRunningTime="2026-03-08 19:53:18.241433117 +0000 UTC m=+1299.637487140" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.245777 4885 scope.go:117] "RemoveContainer" containerID="d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.272653 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-796cf584f6-dfmcm" podStartSLOduration=2.27263041 podStartE2EDuration="2.27263041s" podCreationTimestamp="2026-03-08 19:53:16 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:18.250197111 +0000 UTC m=+1299.646251124" watchObservedRunningTime="2026-03-08 19:53:18.27263041 +0000 UTC m=+1299.668684433" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.280137 4885 scope.go:117] "RemoveContainer" containerID="3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d" Mar 08 19:53:18 crc kubenswrapper[4885]: E0308 19:53:18.285565 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d\": container with ID starting with 3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d not found: ID does not exist" containerID="3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.285610 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d"} err="failed to get container status \"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d\": rpc error: code = NotFound desc = could not find container \"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d\": container with ID starting with 3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d not found: ID does not exist" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.285719 4885 scope.go:117] "RemoveContainer" containerID="d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9" Mar 08 19:53:18 crc kubenswrapper[4885]: E0308 19:53:18.288480 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9\": container with ID starting with d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9 not found: ID does not exist" containerID="d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.288535 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9"} err="failed to get container status \"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9\": rpc error: code = NotFound desc = could not find container \"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9\": container with ID starting with d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9 not found: ID does not exist" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.288565 4885 scope.go:117] "RemoveContainer" containerID="3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.289177 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d"} err="failed to get container status \"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d\": rpc error: code = NotFound desc = could not find container \"3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d\": container with ID starting with 3fdf3f57be787eab451e7af1e7d33ceb16cb4ea5446147af982b7d88017a2c6d not found: ID does not exist" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.289202 4885 
scope.go:117] "RemoveContainer" containerID="d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.289582 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9"} err="failed to get container status \"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9\": rpc error: code = NotFound desc = could not find container \"d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9\": container with ID starting with d6bc57d079614606cf61145be69934635905c01d47b0d807e3e1da974b6602f9 not found: ID does not exist" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.290811 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.305978 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.325745 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:53:18 crc kubenswrapper[4885]: E0308 19:53:18.326241 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.326262 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api" Mar 08 19:53:18 crc kubenswrapper[4885]: E0308 19:53:18.326294 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api-log" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.326304 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api-log" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.326521 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.326560 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" containerName="cinder-api-log" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.327674 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.329241 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.330554 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.330712 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.331147 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523195 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-scripts\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523536 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-etc-machine-id\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523571 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-public-tls-certs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523598 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data-custom\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523621 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523649 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523693 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.523747 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9zl6\" (UniqueName: 
\"kubernetes.io/projected/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-kube-api-access-f9zl6\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.524018 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-logs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625249 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625327 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9zl6\" (UniqueName: \"kubernetes.io/projected/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-kube-api-access-f9zl6\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625364 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-logs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625426 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-scripts\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625448 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-etc-machine-id\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625488 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-public-tls-certs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625525 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data-custom\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625555 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.625595 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.626177 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-logs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.626267 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-etc-machine-id\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.630306 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.630797 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.631175 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-public-tls-certs\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.631971 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data-custom\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.633318 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.633336 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-scripts\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.647854 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9zl6\" (UniqueName: \"kubernetes.io/projected/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-kube-api-access-f9zl6\") pod \"cinder-api-0\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " pod="openstack/cinder-api-0" Mar 08 19:53:18 crc kubenswrapper[4885]: I0308 19:53:18.944855 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 19:53:19 crc kubenswrapper[4885]: I0308 19:53:19.378978 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd91d80d-b465-47bc-ab15-cc9281dbb198" path="/var/lib/kubelet/pods/bd91d80d-b465-47bc-ab15-cc9281dbb198/volumes" Mar 08 19:53:19 crc kubenswrapper[4885]: I0308 19:53:19.734143 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:53:20 crc kubenswrapper[4885]: I0308 19:53:20.239535 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-86998568fb-9gsxz" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 08 19:53:20 crc kubenswrapper[4885]: I0308 19:53:20.256642 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64baa35e-d1c2-48fe-a7a1-d0a4d1485908","Type":"ContainerStarted","Data":"f56864cf75c0bf77d3e8eee5fd2c82834b4c4219c0b0d60077918b5a5fcf0612"} Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.036769 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.037121 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.267675 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64baa35e-d1c2-48fe-a7a1-d0a4d1485908","Type":"ContainerStarted","Data":"7305f11ea3e6d044101cca24b88547af01f2e1506724f8e566c2a9df42c34dc6"} Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.267718 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64baa35e-d1c2-48fe-a7a1-d0a4d1485908","Type":"ContainerStarted","Data":"48e1f046f7d97f16af118173fbff33a7753d9f3ef98b111d3153850bfbfdaf65"} Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.268841 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.287232 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.287211182 podStartE2EDuration="3.287211182s" podCreationTimestamp="2026-03-08 19:53:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:21.286322798 +0000 UTC m=+1302.682376851" watchObservedRunningTime="2026-03-08 19:53:21.287211182 +0000 UTC m=+1302.683265205" Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.528898 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.771672 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56dd4b5ff7-j89qr"] Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.771993 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-56dd4b5ff7-j89qr" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-api" containerID="cri-o://f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304" gracePeriod=30 Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.772129 4885 kuberuntime_container.go:808] "Killing container with 
a grace period" pod="openstack/neutron-56dd4b5ff7-j89qr" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-httpd" containerID="cri-o://0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c" gracePeriod=30 Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.800635 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5bb5b9c587-nd8hp"] Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.801935 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.802185 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.820470 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5bb5b9c587-nd8hp"] Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.897739 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-ovndb-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.897959 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-public-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.898096 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-httpd-config\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.898239 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-internal-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.898327 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-config\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.898400 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdbrd\" (UniqueName: \"kubernetes.io/projected/d1b91750-253e-46eb-9a1c-f7208dab2496-kube-api-access-sdbrd\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:21 crc kubenswrapper[4885]: I0308 19:53:21.898478 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-combined-ca-bundle\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.000106 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-internal-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.000515 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-config\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.000544 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdbrd\" (UniqueName: \"kubernetes.io/projected/d1b91750-253e-46eb-9a1c-f7208dab2496-kube-api-access-sdbrd\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.000587 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-combined-ca-bundle\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.000625 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-ovndb-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.000686 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-public-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.000726 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-httpd-config\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.010396 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-httpd-config\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.010410 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-config\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " 
pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.015988 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-public-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.021833 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-ovndb-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.025672 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-internal-tls-certs\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.025808 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdbrd\" (UniqueName: \"kubernetes.io/projected/d1b91750-253e-46eb-9a1c-f7208dab2496-kube-api-access-sdbrd\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.027369 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-combined-ca-bundle\") pod \"neutron-5bb5b9c587-nd8hp\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.122173 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.286995 4885 generic.go:334] "Generic (PLEG): container finished" podID="4de3b511-619d-4637-ac70-f7e555976c0e" containerID="0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c" exitCode=0 Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.288040 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd4b5ff7-j89qr" event={"ID":"4de3b511-619d-4637-ac70-f7e555976c0e","Type":"ContainerDied","Data":"0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c"} Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.534518 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.665112 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.755564 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-gpkjd"] Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.755779 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" podUID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerName="dnsmasq-dns" containerID="cri-o://054cb1ab59c45cfe6a5f78ad6e0db5b7c13aecc33b053086164f7b3a9057b5c9" gracePeriod=10 Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.781873 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5bb5b9c587-nd8hp"] Mar 08 19:53:22 crc kubenswrapper[4885]: W0308 19:53:22.811301 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1b91750_253e_46eb_9a1c_f7208dab2496.slice/crio-021bd0d886601b9f55240e0e88eca80cd21300113e390352e26c76ca5a8592dc WatchSource:0}: Error finding container 021bd0d886601b9f55240e0e88eca80cd21300113e390352e26c76ca5a8592dc: Status 404 returned error can't find the container with id 021bd0d886601b9f55240e0e88eca80cd21300113e390352e26c76ca5a8592dc Mar 08 19:53:22 crc kubenswrapper[4885]: I0308 19:53:22.931327 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.355132 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb5b9c587-nd8hp" event={"ID":"d1b91750-253e-46eb-9a1c-f7208dab2496","Type":"ContainerStarted","Data":"17e37a1234fac68b042cb982b6be421ba7a3bd54c84d93b8bbb1842a9f1fa332"} Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.355454 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb5b9c587-nd8hp" event={"ID":"d1b91750-253e-46eb-9a1c-f7208dab2496","Type":"ContainerStarted","Data":"021bd0d886601b9f55240e0e88eca80cd21300113e390352e26c76ca5a8592dc"} Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.359685 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerID="054cb1ab59c45cfe6a5f78ad6e0db5b7c13aecc33b053086164f7b3a9057b5c9" exitCode=0 Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.360774 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" 
event={"ID":"3c32ee1c-ff69-4043-a31d-92be1d77a404","Type":"ContainerDied","Data":"054cb1ab59c45cfe6a5f78ad6e0db5b7c13aecc33b053086164f7b3a9057b5c9"} Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.413383 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.414059 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.551758 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-swift-storage-0\") pod \"3c32ee1c-ff69-4043-a31d-92be1d77a404\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.552054 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-nb\") pod \"3c32ee1c-ff69-4043-a31d-92be1d77a404\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.552141 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-svc\") pod \"3c32ee1c-ff69-4043-a31d-92be1d77a404\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.552159 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-sb\") pod \"3c32ee1c-ff69-4043-a31d-92be1d77a404\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.552224 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-config\") pod \"3c32ee1c-ff69-4043-a31d-92be1d77a404\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.552666 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkbcw\" (UniqueName: \"kubernetes.io/projected/3c32ee1c-ff69-4043-a31d-92be1d77a404-kube-api-access-mkbcw\") pod \"3c32ee1c-ff69-4043-a31d-92be1d77a404\" (UID: \"3c32ee1c-ff69-4043-a31d-92be1d77a404\") " Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.556569 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c32ee1c-ff69-4043-a31d-92be1d77a404-kube-api-access-mkbcw" (OuterVolumeSpecName: "kube-api-access-mkbcw") pod "3c32ee1c-ff69-4043-a31d-92be1d77a404" (UID: "3c32ee1c-ff69-4043-a31d-92be1d77a404"). InnerVolumeSpecName "kube-api-access-mkbcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.599076 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3c32ee1c-ff69-4043-a31d-92be1d77a404" (UID: "3c32ee1c-ff69-4043-a31d-92be1d77a404"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.603179 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3c32ee1c-ff69-4043-a31d-92be1d77a404" (UID: "3c32ee1c-ff69-4043-a31d-92be1d77a404"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.607497 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-config" (OuterVolumeSpecName: "config") pod "3c32ee1c-ff69-4043-a31d-92be1d77a404" (UID: "3c32ee1c-ff69-4043-a31d-92be1d77a404"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.619466 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3c32ee1c-ff69-4043-a31d-92be1d77a404" (UID: "3c32ee1c-ff69-4043-a31d-92be1d77a404"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.620123 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3c32ee1c-ff69-4043-a31d-92be1d77a404" (UID: "3c32ee1c-ff69-4043-a31d-92be1d77a404"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.655062 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkbcw\" (UniqueName: \"kubernetes.io/projected/3c32ee1c-ff69-4043-a31d-92be1d77a404-kube-api-access-mkbcw\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.655094 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.655114 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.655123 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.655133 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:23 crc kubenswrapper[4885]: I0308 19:53:23.655141 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c32ee1c-ff69-4043-a31d-92be1d77a404-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.371325 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.371318 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7859c7799c-gpkjd" event={"ID":"3c32ee1c-ff69-4043-a31d-92be1d77a404","Type":"ContainerDied","Data":"d542f54b493f5cfa9eb7867bd9658f59584a89f76c6986f57f0a54179d70ccd3"} Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.371470 4885 scope.go:117] "RemoveContainer" containerID="054cb1ab59c45cfe6a5f78ad6e0db5b7c13aecc33b053086164f7b3a9057b5c9" Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.371886 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-56dd4b5ff7-j89qr" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.157:9696/\": dial tcp 10.217.0.157:9696: connect: connection refused" Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.373490 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb5b9c587-nd8hp" event={"ID":"d1b91750-253e-46eb-9a1c-f7208dab2496","Type":"ContainerStarted","Data":"46919954f7a8695f89f60ffd2c95fd19f9f50cf97e2bbb06931bbceff7c47a47"} Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.373665 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="cinder-scheduler" containerID="cri-o://aedf83af60a05d52fadd9f2d996eac1d5ae1cee15c263accbc4acefecf4b6ed5" gracePeriod=30 Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.373730 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="probe" containerID="cri-o://dffb5abae31631b6e6c2de1c5ddd900437c42ccc0290d24bea54ec81cbab75b2" gracePeriod=30 Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.398701 4885 scope.go:117] "RemoveContainer" containerID="695a219704dcca2bf986cd5937fa221f84e3701eae327a3dbb20ee1df55cf8bc" Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.435788 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5bb5b9c587-nd8hp" podStartSLOduration=3.435764865 podStartE2EDuration="3.435764865s" podCreationTimestamp="2026-03-08 19:53:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:24.422437358 +0000 UTC m=+1305.818491391" watchObservedRunningTime="2026-03-08 19:53:24.435764865 +0000 UTC m=+1305.831818898" Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.450126 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-gpkjd"] Mar 08 19:53:24 crc kubenswrapper[4885]: I0308 19:53:24.459308 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7859c7799c-gpkjd"] Mar 08 19:53:25 crc kubenswrapper[4885]: I0308 19:53:25.378838 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c32ee1c-ff69-4043-a31d-92be1d77a404" path="/var/lib/kubelet/pods/3c32ee1c-ff69-4043-a31d-92be1d77a404/volumes" Mar 08 19:53:25 crc kubenswrapper[4885]: I0308 19:53:25.386128 4885 generic.go:334] "Generic (PLEG): container finished" podID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerID="dffb5abae31631b6e6c2de1c5ddd900437c42ccc0290d24bea54ec81cbab75b2" exitCode=0 Mar 08 19:53:25 crc kubenswrapper[4885]: I0308 
19:53:25.386198 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf","Type":"ContainerDied","Data":"dffb5abae31631b6e6c2de1c5ddd900437c42ccc0290d24bea54ec81cbab75b2"} Mar 08 19:53:25 crc kubenswrapper[4885]: I0308 19:53:25.387447 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.105336 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b5685698-p87pb" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.361415 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-58c657b6d6-r4tf7"] Mar 08 19:53:27 crc kubenswrapper[4885]: E0308 19:53:27.361802 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerName="init" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.361818 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerName="init" Mar 08 19:53:27 crc kubenswrapper[4885]: E0308 19:53:27.361847 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerName="dnsmasq-dns" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.361854 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerName="dnsmasq-dns" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.371175 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c32ee1c-ff69-4043-a31d-92be1d77a404" containerName="dnsmasq-dns" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.376363 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.387750 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58c657b6d6-r4tf7"] Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.520320 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-internal-tls-certs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.520717 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44fnl\" (UniqueName: \"kubernetes.io/projected/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-kube-api-access-44fnl\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.520749 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-scripts\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.520779 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-config-data\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.520858 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-combined-ca-bundle\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.520899 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-public-tls-certs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.521009 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-logs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.622537 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-scripts\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.622585 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-config-data\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.622645 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-combined-ca-bundle\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.622676 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-public-tls-certs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.622713 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-logs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.622782 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-internal-tls-certs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.622808 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44fnl\" (UniqueName: \"kubernetes.io/projected/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-kube-api-access-44fnl\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.623448 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-logs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.628880 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-combined-ca-bundle\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.629669 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-public-tls-certs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.631431 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-config-data\") pod \"placement-58c657b6d6-r4tf7\" 
(UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.632152 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-scripts\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.640078 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-internal-tls-certs\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.648776 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44fnl\" (UniqueName: \"kubernetes.io/projected/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-kube-api-access-44fnl\") pod \"placement-58c657b6d6-r4tf7\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:27 crc kubenswrapper[4885]: I0308 19:53:27.702980 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.242148 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-58c657b6d6-r4tf7"] Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.295526 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.437912 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-public-tls-certs\") pod \"4de3b511-619d-4637-ac70-f7e555976c0e\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.438627 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-config\") pod \"4de3b511-619d-4637-ac70-f7e555976c0e\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.438715 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-676dz\" (UniqueName: \"kubernetes.io/projected/4de3b511-619d-4637-ac70-f7e555976c0e-kube-api-access-676dz\") pod \"4de3b511-619d-4637-ac70-f7e555976c0e\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.438769 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-ovndb-tls-certs\") pod \"4de3b511-619d-4637-ac70-f7e555976c0e\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.438807 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-internal-tls-certs\") pod \"4de3b511-619d-4637-ac70-f7e555976c0e\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " Mar 08 19:53:28 crc 
kubenswrapper[4885]: I0308 19:53:28.438829 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-combined-ca-bundle\") pod \"4de3b511-619d-4637-ac70-f7e555976c0e\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.438869 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-httpd-config\") pod \"4de3b511-619d-4637-ac70-f7e555976c0e\" (UID: \"4de3b511-619d-4637-ac70-f7e555976c0e\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.443706 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.453368 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4de3b511-619d-4637-ac70-f7e555976c0e-kube-api-access-676dz" (OuterVolumeSpecName: "kube-api-access-676dz") pod "4de3b511-619d-4637-ac70-f7e555976c0e" (UID: "4de3b511-619d-4637-ac70-f7e555976c0e"). InnerVolumeSpecName "kube-api-access-676dz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.467340 4885 generic.go:334] "Generic (PLEG): container finished" podID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerID="aedf83af60a05d52fadd9f2d996eac1d5ae1cee15c263accbc4acefecf4b6ed5" exitCode=0 Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.467440 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf","Type":"ContainerDied","Data":"aedf83af60a05d52fadd9f2d996eac1d5ae1cee15c263accbc4acefecf4b6ed5"} Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.471007 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58c657b6d6-r4tf7" event={"ID":"719b68df-d1ac-49e5-ac34-dfa3ba33c97f","Type":"ContainerStarted","Data":"91a7898b581f4a0b0c09c7d67b2b320f9e2ef08425d7b081856b5b38c0f51cba"} Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.473121 4885 generic.go:334] "Generic (PLEG): container finished" podID="4de3b511-619d-4637-ac70-f7e555976c0e" containerID="f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304" exitCode=0 Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.473170 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-56dd4b5ff7-j89qr" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.473146 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd4b5ff7-j89qr" event={"ID":"4de3b511-619d-4637-ac70-f7e555976c0e","Type":"ContainerDied","Data":"f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304"} Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.473297 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-56dd4b5ff7-j89qr" event={"ID":"4de3b511-619d-4637-ac70-f7e555976c0e","Type":"ContainerDied","Data":"276c039d660964960e48e559165b8b647e2ab8ab57d7b59f8062379f583a0dc6"} Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.473340 4885 scope.go:117] "RemoveContainer" containerID="0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.489043 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4de3b511-619d-4637-ac70-f7e555976c0e" (UID: "4de3b511-619d-4637-ac70-f7e555976c0e"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.503915 4885 scope.go:117] "RemoveContainer" containerID="f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.528829 4885 scope.go:117] "RemoveContainer" containerID="0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c" Mar 08 19:53:28 crc kubenswrapper[4885]: E0308 19:53:28.529320 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c\": container with ID starting with 0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c not found: ID does not exist" containerID="0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.529361 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c"} err="failed to get container status \"0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c\": rpc error: code = NotFound desc = could not find container \"0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c\": container with ID starting with 0b4b057c9742483938bf5d9782c2acf6bf3ee3c003afa841fb92d3a612cc3c9c not found: ID does not exist" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.529385 4885 scope.go:117] "RemoveContainer" containerID="f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304" Mar 08 19:53:28 crc kubenswrapper[4885]: E0308 19:53:28.529751 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304\": container with ID starting with f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304 not found: ID does not exist" containerID="f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.529779 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304"} err="failed to get container status \"f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304\": rpc error: code = NotFound desc = could not find container \"f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304\": container with ID starting with f6c88613458476a76e682e5f580e9c0dbdab9a61a053cdcfeb8684590c97a304 not found: ID does not exist" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.540761 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.540788 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-676dz\" (UniqueName: \"kubernetes.io/projected/4de3b511-619d-4637-ac70-f7e555976c0e-kube-api-access-676dz\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.558598 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.607848 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4de3b511-619d-4637-ac70-f7e555976c0e" (UID: "4de3b511-619d-4637-ac70-f7e555976c0e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.617248 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-config" (OuterVolumeSpecName: "config") pod "4de3b511-619d-4637-ac70-f7e555976c0e" (UID: "4de3b511-619d-4637-ac70-f7e555976c0e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.624547 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.629282 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4de3b511-619d-4637-ac70-f7e555976c0e" (UID: "4de3b511-619d-4637-ac70-f7e555976c0e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.642808 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.642826 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.642834 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.651600 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4de3b511-619d-4637-ac70-f7e555976c0e" (UID: "4de3b511-619d-4637-ac70-f7e555976c0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.698587 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-86998568fb-9gsxz"] Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.698629 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4de3b511-619d-4637-ac70-f7e555976c0e" (UID: "4de3b511-619d-4637-ac70-f7e555976c0e"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.699226 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-86998568fb-9gsxz" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api-log" containerID="cri-o://aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd" gracePeriod=30 Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.699452 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-86998568fb-9gsxz" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api" containerID="cri-o://49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f" gracePeriod=30 Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.748650 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-combined-ca-bundle\") pod \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.748705 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data\") pod \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.748731 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-etc-machine-id\") pod \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.748830 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data-custom\") pod \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.748935 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-scripts\") pod \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.748970 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zwcw\" (UniqueName: \"kubernetes.io/projected/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-kube-api-access-2zwcw\") pod \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\" (UID: \"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf\") " Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.749457 4885 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.749470 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de3b511-619d-4637-ac70-f7e555976c0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.749540 4885 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" (UID: "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.757066 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-kube-api-access-2zwcw" (OuterVolumeSpecName: "kube-api-access-2zwcw") pod "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" (UID: "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf"). InnerVolumeSpecName "kube-api-access-2zwcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.765777 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" (UID: "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.766200 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-scripts" (OuterVolumeSpecName: "scripts") pod "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" (UID: "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.848891 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" (UID: "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.851157 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.851196 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.851211 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zwcw\" (UniqueName: \"kubernetes.io/projected/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-kube-api-access-2zwcw\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.851224 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.851235 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.910527 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data" (OuterVolumeSpecName: "config-data") pod "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" (UID: "6ac3e4ad-92f1-4c79-bc47-5e9707b376bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.921052 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-56dd4b5ff7-j89qr"] Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.932781 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-56dd4b5ff7-j89qr"] Mar 08 19:53:28 crc kubenswrapper[4885]: I0308 19:53:28.955519 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.333729 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.339228 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.408492 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" path="/var/lib/kubelet/pods/4de3b511-619d-4637-ac70-f7e555976c0e/volumes" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.497493 4885 generic.go:334] "Generic (PLEG): container finished" podID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerID="aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd" exitCode=143 Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.497589 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86998568fb-9gsxz" 
event={"ID":"5d5033f1-b303-4891-875f-8f9bcb7585c0","Type":"ContainerDied","Data":"aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd"} Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.502023 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58c657b6d6-r4tf7" event={"ID":"719b68df-d1ac-49e5-ac34-dfa3ba33c97f","Type":"ContainerStarted","Data":"1b8b8e4856a24e16b23d4c15ef261857dfcc94531017a1b02728028102e1d5ce"} Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.502066 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58c657b6d6-r4tf7" event={"ID":"719b68df-d1ac-49e5-ac34-dfa3ba33c97f","Type":"ContainerStarted","Data":"2ea3e6e51d477fb7795967def43fe0be522063fbf11d9053ce41fa22a8bf42b3"} Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.503490 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.503520 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.512448 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.513005 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6ac3e4ad-92f1-4c79-bc47-5e9707b376bf","Type":"ContainerDied","Data":"b4e02e9dba863f76b95bf3b4c26ddf5112fe936b13a4f102d174a8450de9c4d1"} Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.513039 4885 scope.go:117] "RemoveContainer" containerID="dffb5abae31631b6e6c2de1c5ddd900437c42ccc0290d24bea54ec81cbab75b2" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.537896 4885 scope.go:117] "RemoveContainer" containerID="aedf83af60a05d52fadd9f2d996eac1d5ae1cee15c263accbc4acefecf4b6ed5" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.540421 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-58c657b6d6-r4tf7" podStartSLOduration=2.540382194 podStartE2EDuration="2.540382194s" podCreationTimestamp="2026-03-08 19:53:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:29.529337569 +0000 UTC m=+1310.925391592" watchObservedRunningTime="2026-03-08 19:53:29.540382194 +0000 UTC m=+1310.936436217" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.573468 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.585783 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.599671 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:53:29 crc kubenswrapper[4885]: E0308 19:53:29.600080 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="probe" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.600098 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="probe" Mar 08 19:53:29 crc kubenswrapper[4885]: E0308 19:53:29.600117 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" 
containerName="neutron-api" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.600124 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-api" Mar 08 19:53:29 crc kubenswrapper[4885]: E0308 19:53:29.600140 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-httpd" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.600148 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-httpd" Mar 08 19:53:29 crc kubenswrapper[4885]: E0308 19:53:29.600166 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="cinder-scheduler" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.600172 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="cinder-scheduler" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.600330 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="cinder-scheduler" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.600342 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" containerName="probe" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.600354 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-httpd" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.600362 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de3b511-619d-4637-ac70-f7e555976c0e" containerName="neutron-api" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.601730 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.604985 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.607514 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.769323 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.769627 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13df70e2-1a9e-4d81-b23b-c461291bce93-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.769732 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-scripts\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.769849 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt46p\" (UniqueName: \"kubernetes.io/projected/13df70e2-1a9e-4d81-b23b-c461291bce93-kube-api-access-wt46p\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.769945 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.770019 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.871290 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.871390 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13df70e2-1a9e-4d81-b23b-c461291bce93-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.871438 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-scripts\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.871470 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt46p\" (UniqueName: \"kubernetes.io/projected/13df70e2-1a9e-4d81-b23b-c461291bce93-kube-api-access-wt46p\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.871489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.871514 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.874122 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13df70e2-1a9e-4d81-b23b-c461291bce93-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.878198 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.879239 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-scripts\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.880681 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.880956 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 crc kubenswrapper[4885]: I0308 19:53:29.893137 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt46p\" (UniqueName: \"kubernetes.io/projected/13df70e2-1a9e-4d81-b23b-c461291bce93-kube-api-access-wt46p\") pod \"cinder-scheduler-0\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " pod="openstack/cinder-scheduler-0" Mar 08 19:53:29 
crc kubenswrapper[4885]: I0308 19:53:29.932872 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 19:53:30 crc kubenswrapper[4885]: I0308 19:53:30.422828 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:53:30 crc kubenswrapper[4885]: I0308 19:53:30.541661 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13df70e2-1a9e-4d81-b23b-c461291bce93","Type":"ContainerStarted","Data":"4d0a6fa6c058e8e3d990a208a072ec9c1c565777e02360b62370fc36d2e37246"} Mar 08 19:53:31 crc kubenswrapper[4885]: I0308 19:53:31.234773 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 08 19:53:31 crc kubenswrapper[4885]: I0308 19:53:31.380893 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ac3e4ad-92f1-4c79-bc47-5e9707b376bf" path="/var/lib/kubelet/pods/6ac3e4ad-92f1-4c79-bc47-5e9707b376bf/volumes" Mar 08 19:53:31 crc kubenswrapper[4885]: I0308 19:53:31.551433 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13df70e2-1a9e-4d81-b23b-c461291bce93","Type":"ContainerStarted","Data":"fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12"} Mar 08 19:53:31 crc kubenswrapper[4885]: I0308 19:53:31.873000 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-86998568fb-9gsxz" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:45958->10.217.0.163:9311: read: connection reset by peer" Mar 08 19:53:31 crc kubenswrapper[4885]: I0308 19:53:31.873000 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-86998568fb-9gsxz" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:45972->10.217.0.163:9311: read: connection reset by peer" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.268459 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.473456 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-combined-ca-bundle\") pod \"5d5033f1-b303-4891-875f-8f9bcb7585c0\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.473654 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d5033f1-b303-4891-875f-8f9bcb7585c0-logs\") pod \"5d5033f1-b303-4891-875f-8f9bcb7585c0\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.473682 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nclf\" (UniqueName: \"kubernetes.io/projected/5d5033f1-b303-4891-875f-8f9bcb7585c0-kube-api-access-5nclf\") pod \"5d5033f1-b303-4891-875f-8f9bcb7585c0\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.473715 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data\") pod \"5d5033f1-b303-4891-875f-8f9bcb7585c0\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.473770 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data-custom\") pod \"5d5033f1-b303-4891-875f-8f9bcb7585c0\" (UID: \"5d5033f1-b303-4891-875f-8f9bcb7585c0\") " Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.476437 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d5033f1-b303-4891-875f-8f9bcb7585c0-logs" (OuterVolumeSpecName: "logs") pod "5d5033f1-b303-4891-875f-8f9bcb7585c0" (UID: "5d5033f1-b303-4891-875f-8f9bcb7585c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.489307 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5d5033f1-b303-4891-875f-8f9bcb7585c0" (UID: "5d5033f1-b303-4891-875f-8f9bcb7585c0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.494143 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d5033f1-b303-4891-875f-8f9bcb7585c0-kube-api-access-5nclf" (OuterVolumeSpecName: "kube-api-access-5nclf") pod "5d5033f1-b303-4891-875f-8f9bcb7585c0" (UID: "5d5033f1-b303-4891-875f-8f9bcb7585c0"). InnerVolumeSpecName "kube-api-access-5nclf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.509518 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d5033f1-b303-4891-875f-8f9bcb7585c0" (UID: "5d5033f1-b303-4891-875f-8f9bcb7585c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.539055 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data" (OuterVolumeSpecName: "config-data") pod "5d5033f1-b303-4891-875f-8f9bcb7585c0" (UID: "5d5033f1-b303-4891-875f-8f9bcb7585c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.561548 4885 generic.go:334] "Generic (PLEG): container finished" podID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerID="49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f" exitCode=0 Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.561622 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86998568fb-9gsxz" event={"ID":"5d5033f1-b303-4891-875f-8f9bcb7585c0","Type":"ContainerDied","Data":"49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f"} Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.561647 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86998568fb-9gsxz" event={"ID":"5d5033f1-b303-4891-875f-8f9bcb7585c0","Type":"ContainerDied","Data":"004645ddace08c8f81341f8f25e364cdb5c96da8e77e37b02915c3180098f6f0"} Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.561664 4885 scope.go:117] "RemoveContainer" containerID="49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.561790 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-86998568fb-9gsxz" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.563688 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13df70e2-1a9e-4d81-b23b-c461291bce93","Type":"ContainerStarted","Data":"ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5"} Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.583729 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.584247 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d5033f1-b303-4891-875f-8f9bcb7585c0-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.584265 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nclf\" (UniqueName: \"kubernetes.io/projected/5d5033f1-b303-4891-875f-8f9bcb7585c0-kube-api-access-5nclf\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.584277 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.584286 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5d5033f1-b303-4891-875f-8f9bcb7585c0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.585449 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" 
podStartSLOduration=3.58543272 podStartE2EDuration="3.58543272s" podCreationTimestamp="2026-03-08 19:53:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:32.58242372 +0000 UTC m=+1313.978477733" watchObservedRunningTime="2026-03-08 19:53:32.58543272 +0000 UTC m=+1313.981486743" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.621421 4885 scope.go:117] "RemoveContainer" containerID="aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.633409 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-86998568fb-9gsxz"] Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.637346 4885 scope.go:117] "RemoveContainer" containerID="49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f" Mar 08 19:53:32 crc kubenswrapper[4885]: E0308 19:53:32.637810 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f\": container with ID starting with 49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f not found: ID does not exist" containerID="49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.637837 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f"} err="failed to get container status \"49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f\": rpc error: code = NotFound desc = could not find container \"49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f\": container with ID starting with 49ca64d49a9e1e46e54bd37714779004f84e79f779640af5224925dd5b17977f not found: ID does not exist" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.637873 4885 scope.go:117] "RemoveContainer" containerID="aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd" Mar 08 19:53:32 crc kubenswrapper[4885]: E0308 19:53:32.638153 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd\": container with ID starting with aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd not found: ID does not exist" containerID="aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.638230 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd"} err="failed to get container status \"aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd\": rpc error: code = NotFound desc = could not find container \"aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd\": container with ID starting with aecfd7ce1800dc86ad51688344e4c2583296c3dea9d6a598d538cebfb67dbfbd not found: ID does not exist" Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.641615 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-86998568fb-9gsxz"] Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.818713 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:53:32 crc kubenswrapper[4885]: I0308 19:53:32.818770 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:53:33 crc kubenswrapper[4885]: I0308 19:53:33.407795 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" path="/var/lib/kubelet/pods/5d5033f1-b303-4891-875f-8f9bcb7585c0/volumes" Mar 08 19:53:33 crc kubenswrapper[4885]: I0308 19:53:33.491716 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:53:34 crc kubenswrapper[4885]: I0308 19:53:34.933328 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.300768 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-744484b5fc-g6mjz"] Mar 08 19:53:37 crc kubenswrapper[4885]: E0308 19:53:37.301789 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.301807 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api" Mar 08 19:53:37 crc kubenswrapper[4885]: E0308 19:53:37.301827 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api-log" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.301836 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api-log" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.302059 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api-log" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.302088 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d5033f1-b303-4891-875f-8f9bcb7585c0" containerName="barbican-api" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.303912 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.305673 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.307219 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.307469 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.335505 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-744484b5fc-g6mjz"] Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384448 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-internal-tls-certs\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384557 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-run-httpd\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384632 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-config-data\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384663 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-combined-ca-bundle\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384724 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-public-tls-certs\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384767 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-log-httpd\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384879 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-etc-swift\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " 
pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.384989 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsh5k\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-kube-api-access-jsh5k\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486505 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-public-tls-certs\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486567 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-log-httpd\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486627 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-etc-swift\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486675 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsh5k\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-kube-api-access-jsh5k\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486738 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-internal-tls-certs\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-run-httpd\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486844 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-config-data\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.486866 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-combined-ca-bundle\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " 
pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.487478 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-log-httpd\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.487496 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-run-httpd\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.494588 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-public-tls-certs\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.494858 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-internal-tls-certs\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.495373 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-etc-swift\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.496066 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-config-data\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.499774 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-combined-ca-bundle\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.510794 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsh5k\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-kube-api-access-jsh5k\") pod \"swift-proxy-744484b5fc-g6mjz\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:37 crc kubenswrapper[4885]: I0308 19:53:37.623871 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:38 crc kubenswrapper[4885]: W0308 19:53:38.176586 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60f9821e_e554_4594_bfb2_9521cd3c171a.slice/crio-a9b17f7da8fbb915380a49441fc73dc30251abdd34aa8c926a52efea3b64bcdc WatchSource:0}: Error finding container a9b17f7da8fbb915380a49441fc73dc30251abdd34aa8c926a52efea3b64bcdc: Status 404 returned error can't find the container with id a9b17f7da8fbb915380a49441fc73dc30251abdd34aa8c926a52efea3b64bcdc Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.177451 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-744484b5fc-g6mjz"] Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.408681 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.410552 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.413635 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.414528 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.415070 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-r9g5q" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.419244 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.503951 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config-secret\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.504088 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.504225 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.504335 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6ml5\" (UniqueName: \"kubernetes.io/projected/c6c74f05-881e-48c0-82d2-d90356ad15eb-kube-api-access-c6ml5\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.606321 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config-secret\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.606374 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.606425 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.606449 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6ml5\" (UniqueName: \"kubernetes.io/projected/c6c74f05-881e-48c0-82d2-d90356ad15eb-kube-api-access-c6ml5\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.607458 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.612622 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config-secret\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.612838 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-combined-ca-bundle\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.636153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6ml5\" (UniqueName: \"kubernetes.io/projected/c6c74f05-881e-48c0-82d2-d90356ad15eb-kube-api-access-c6ml5\") pod \"openstackclient\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.639303 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-744484b5fc-g6mjz" event={"ID":"60f9821e-e554-4594-bfb2-9521cd3c171a","Type":"ContainerStarted","Data":"67f1959dc61ea688f5069874c4fb20e1a6cd0f9f33725553c532e43624ce5124"} Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.639352 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-744484b5fc-g6mjz" event={"ID":"60f9821e-e554-4594-bfb2-9521cd3c171a","Type":"ContainerStarted","Data":"a9b17f7da8fbb915380a49441fc73dc30251abdd34aa8c926a52efea3b64bcdc"} Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.723087 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 08 
19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.723887 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.729192 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.900654 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.902802 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.912386 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config-secret\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.912751 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.912875 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.912990 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phmx9\" (UniqueName: \"kubernetes.io/projected/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-kube-api-access-phmx9\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:38 crc kubenswrapper[4885]: I0308 19:53:38.926101 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 19:53:38 crc kubenswrapper[4885]: E0308 19:53:38.964692 4885 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 08 19:53:38 crc kubenswrapper[4885]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_c6c74f05-881e-48c0-82d2-d90356ad15eb_0(ff48690a1408ad1bdae4ca0bcfeea04329365eff07df2ff2b5fae30426429404): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ff48690a1408ad1bdae4ca0bcfeea04329365eff07df2ff2b5fae30426429404" Netns:"/var/run/netns/2ba25179-213b-4737-9a9a-c6d6279fe0a3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=ff48690a1408ad1bdae4ca0bcfeea04329365eff07df2ff2b5fae30426429404;K8S_POD_UID=c6c74f05-881e-48c0-82d2-d90356ad15eb" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/c6c74f05-881e-48c0-82d2-d90356ad15eb]: expected pod UID "c6c74f05-881e-48c0-82d2-d90356ad15eb" but got "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" 
from Kube API Mar 08 19:53:38 crc kubenswrapper[4885]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 19:53:38 crc kubenswrapper[4885]: > Mar 08 19:53:38 crc kubenswrapper[4885]: E0308 19:53:38.964759 4885 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 08 19:53:38 crc kubenswrapper[4885]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_c6c74f05-881e-48c0-82d2-d90356ad15eb_0(ff48690a1408ad1bdae4ca0bcfeea04329365eff07df2ff2b5fae30426429404): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"ff48690a1408ad1bdae4ca0bcfeea04329365eff07df2ff2b5fae30426429404" Netns:"/var/run/netns/2ba25179-213b-4737-9a9a-c6d6279fe0a3" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=ff48690a1408ad1bdae4ca0bcfeea04329365eff07df2ff2b5fae30426429404;K8S_POD_UID=c6c74f05-881e-48c0-82d2-d90356ad15eb" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/c6c74f05-881e-48c0-82d2-d90356ad15eb]: expected pod UID "c6c74f05-881e-48c0-82d2-d90356ad15eb" but got "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" from Kube API Mar 08 19:53:38 crc kubenswrapper[4885]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 08 19:53:38 crc kubenswrapper[4885]: > pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.016289 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phmx9\" (UniqueName: \"kubernetes.io/projected/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-kube-api-access-phmx9\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.016358 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config-secret\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.016491 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.016548 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.017755 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.027805 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config-secret\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.040338 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.042702 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phmx9\" (UniqueName: \"kubernetes.io/projected/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-kube-api-access-phmx9\") pod \"openstackclient\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.223785 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.648539 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.649071 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-744484b5fc-g6mjz" event={"ID":"60f9821e-e554-4594-bfb2-9521cd3c171a","Type":"ContainerStarted","Data":"52b3a63b99137a90e03814ec54cafbe70d9ceac7f36dc9e81e362bf716f0276b"} Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.649146 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.649181 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.655192 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 19:53:39 crc kubenswrapper[4885]: W0308 19:53:39.674034 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddd62ab9_bb59_47ef_b639_fd0a0a4c4b84.slice/crio-b77651b6a5d5a919a5f8a7da9eb432b61a596a28b9541d2b704c7148835a224c WatchSource:0}: Error finding container b77651b6a5d5a919a5f8a7da9eb432b61a596a28b9541d2b704c7148835a224c: Status 404 returned error can't find the container with id b77651b6a5d5a919a5f8a7da9eb432b61a596a28b9541d2b704c7148835a224c Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.680212 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-744484b5fc-g6mjz" podStartSLOduration=2.680189896 podStartE2EDuration="2.680189896s" podCreationTimestamp="2026-03-08 19:53:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:39.675955963 +0000 UTC m=+1321.072009986" watchObservedRunningTime="2026-03-08 19:53:39.680189896 +0000 UTC m=+1321.076243919" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.699839 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.702495 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c6c74f05-881e-48c0-82d2-d90356ad15eb" podUID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.729732 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-combined-ca-bundle\") pod \"c6c74f05-881e-48c0-82d2-d90356ad15eb\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.729787 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config\") pod \"c6c74f05-881e-48c0-82d2-d90356ad15eb\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.729824 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6ml5\" (UniqueName: \"kubernetes.io/projected/c6c74f05-881e-48c0-82d2-d90356ad15eb-kube-api-access-c6ml5\") pod \"c6c74f05-881e-48c0-82d2-d90356ad15eb\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.729865 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config-secret\") pod \"c6c74f05-881e-48c0-82d2-d90356ad15eb\" (UID: \"c6c74f05-881e-48c0-82d2-d90356ad15eb\") " Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.730297 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "c6c74f05-881e-48c0-82d2-d90356ad15eb" (UID: "c6c74f05-881e-48c0-82d2-d90356ad15eb"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.731116 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.741039 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6c74f05-881e-48c0-82d2-d90356ad15eb-kube-api-access-c6ml5" (OuterVolumeSpecName: "kube-api-access-c6ml5") pod "c6c74f05-881e-48c0-82d2-d90356ad15eb" (UID: "c6c74f05-881e-48c0-82d2-d90356ad15eb"). InnerVolumeSpecName "kube-api-access-c6ml5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.752058 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6c74f05-881e-48c0-82d2-d90356ad15eb" (UID: "c6c74f05-881e-48c0-82d2-d90356ad15eb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.752111 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "c6c74f05-881e-48c0-82d2-d90356ad15eb" (UID: "c6c74f05-881e-48c0-82d2-d90356ad15eb"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.832961 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.832996 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6ml5\" (UniqueName: \"kubernetes.io/projected/c6c74f05-881e-48c0-82d2-d90356ad15eb-kube-api-access-c6ml5\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:39 crc kubenswrapper[4885]: I0308 19:53:39.833009 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/c6c74f05-881e-48c0-82d2-d90356ad15eb-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:40 crc kubenswrapper[4885]: I0308 19:53:40.157276 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 08 19:53:40 crc kubenswrapper[4885]: I0308 19:53:40.660836 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84","Type":"ContainerStarted","Data":"b77651b6a5d5a919a5f8a7da9eb432b61a596a28b9541d2b704c7148835a224c"} Mar 08 19:53:40 crc kubenswrapper[4885]: I0308 19:53:40.660884 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 19:53:40 crc kubenswrapper[4885]: I0308 19:53:40.677573 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="c6c74f05-881e-48c0-82d2-d90356ad15eb" podUID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.381351 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6c74f05-881e-48c0-82d2-d90356ad15eb" path="/var/lib/kubelet/pods/c6c74f05-881e-48c0-82d2-d90356ad15eb/volumes" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.465753 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.566331 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-sg-core-conf-yaml\") pod \"624830da-2b73-4843-bb04-6db9c1a7b281\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.566729 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-run-httpd\") pod \"624830da-2b73-4843-bb04-6db9c1a7b281\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.566954 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-log-httpd\") pod \"624830da-2b73-4843-bb04-6db9c1a7b281\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.567103 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-scripts\") pod \"624830da-2b73-4843-bb04-6db9c1a7b281\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.567198 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bkw5\" (UniqueName: \"kubernetes.io/projected/624830da-2b73-4843-bb04-6db9c1a7b281-kube-api-access-5bkw5\") pod \"624830da-2b73-4843-bb04-6db9c1a7b281\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.567293 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-config-data\") pod \"624830da-2b73-4843-bb04-6db9c1a7b281\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.567365 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-combined-ca-bundle\") pod \"624830da-2b73-4843-bb04-6db9c1a7b281\" (UID: \"624830da-2b73-4843-bb04-6db9c1a7b281\") " Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.567773 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "624830da-2b73-4843-bb04-6db9c1a7b281" (UID: "624830da-2b73-4843-bb04-6db9c1a7b281"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.567840 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "624830da-2b73-4843-bb04-6db9c1a7b281" (UID: "624830da-2b73-4843-bb04-6db9c1a7b281"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.588606 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-scripts" (OuterVolumeSpecName: "scripts") pod "624830da-2b73-4843-bb04-6db9c1a7b281" (UID: "624830da-2b73-4843-bb04-6db9c1a7b281"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.589870 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624830da-2b73-4843-bb04-6db9c1a7b281-kube-api-access-5bkw5" (OuterVolumeSpecName: "kube-api-access-5bkw5") pod "624830da-2b73-4843-bb04-6db9c1a7b281" (UID: "624830da-2b73-4843-bb04-6db9c1a7b281"). InnerVolumeSpecName "kube-api-access-5bkw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.598155 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "624830da-2b73-4843-bb04-6db9c1a7b281" (UID: "624830da-2b73-4843-bb04-6db9c1a7b281"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.650615 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "624830da-2b73-4843-bb04-6db9c1a7b281" (UID: "624830da-2b73-4843-bb04-6db9c1a7b281"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.668857 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.669253 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.669316 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/624830da-2b73-4843-bb04-6db9c1a7b281-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.669382 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.669444 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bkw5\" (UniqueName: \"kubernetes.io/projected/624830da-2b73-4843-bb04-6db9c1a7b281-kube-api-access-5bkw5\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.669503 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.676252 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="624830da-2b73-4843-bb04-6db9c1a7b281" containerID="777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d" exitCode=137 Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.676310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerDied","Data":"777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d"} Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.676368 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"624830da-2b73-4843-bb04-6db9c1a7b281","Type":"ContainerDied","Data":"7df05fec0614fac93aab8ae29777fc59abe79ffd7aca5937f60cd7d723e3ec63"} Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.676388 4885 scope.go:117] "RemoveContainer" containerID="777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.676506 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.687442 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-config-data" (OuterVolumeSpecName: "config-data") pod "624830da-2b73-4843-bb04-6db9c1a7b281" (UID: "624830da-2b73-4843-bb04-6db9c1a7b281"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.697397 4885 scope.go:117] "RemoveContainer" containerID="37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.751555 4885 scope.go:117] "RemoveContainer" containerID="b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.772150 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/624830da-2b73-4843-bb04-6db9c1a7b281-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.773739 4885 scope.go:117] "RemoveContainer" containerID="e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.793296 4885 scope.go:117] "RemoveContainer" containerID="777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d" Mar 08 19:53:41 crc kubenswrapper[4885]: E0308 19:53:41.793735 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d\": container with ID starting with 777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d not found: ID does not exist" containerID="777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.793779 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d"} err="failed to get container status \"777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d\": rpc error: code = NotFound desc = could not find container \"777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d\": container with ID starting with 777ec44eaf0048dbe4cdb216e58c2632140d011e135c32742e200184c686980d not found: ID does not exist" Mar 08 
19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.793806 4885 scope.go:117] "RemoveContainer" containerID="37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b" Mar 08 19:53:41 crc kubenswrapper[4885]: E0308 19:53:41.794174 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b\": container with ID starting with 37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b not found: ID does not exist" containerID="37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.794231 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b"} err="failed to get container status \"37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b\": rpc error: code = NotFound desc = could not find container \"37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b\": container with ID starting with 37349c5a6103ead987b55b56e2e3b5a85f6b2c961832f831a7f3db16750c050b not found: ID does not exist" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.794265 4885 scope.go:117] "RemoveContainer" containerID="b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2" Mar 08 19:53:41 crc kubenswrapper[4885]: E0308 19:53:41.794741 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2\": container with ID starting with b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2 not found: ID does not exist" containerID="b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.794774 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2"} err="failed to get container status \"b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2\": rpc error: code = NotFound desc = could not find container \"b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2\": container with ID starting with b6402a754edcca20d8ab6663f4228c2a19a524d99533f11229b860bdc77c85c2 not found: ID does not exist" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.794793 4885 scope.go:117] "RemoveContainer" containerID="e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f" Mar 08 19:53:41 crc kubenswrapper[4885]: E0308 19:53:41.795028 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f\": container with ID starting with e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f not found: ID does not exist" containerID="e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f" Mar 08 19:53:41 crc kubenswrapper[4885]: I0308 19:53:41.795059 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f"} err="failed to get container status \"e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f\": rpc error: code = NotFound desc = could not find container 
\"e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f\": container with ID starting with e09bf4de2823f503707ab8557468fc163844e4bec6b87074b4427ca38f46831f not found: ID does not exist" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.024451 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.042267 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052250 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:42 crc kubenswrapper[4885]: E0308 19:53:42.052646 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="proxy-httpd" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052666 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="proxy-httpd" Mar 08 19:53:42 crc kubenswrapper[4885]: E0308 19:53:42.052676 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="sg-core" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052682 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="sg-core" Mar 08 19:53:42 crc kubenswrapper[4885]: E0308 19:53:42.052711 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-notification-agent" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052719 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-notification-agent" Mar 08 19:53:42 crc kubenswrapper[4885]: E0308 19:53:42.052729 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-central-agent" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052734 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-central-agent" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052892 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-central-agent" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052907 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="ceilometer-notification-agent" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052933 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="sg-core" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.052947 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" containerName="proxy-httpd" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.054469 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.056757 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.056839 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.063309 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.183835 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-run-httpd\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.183992 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-scripts\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.184018 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-config-data\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.184085 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.184509 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-log-httpd\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.184590 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54bx7\" (UniqueName: \"kubernetes.io/projected/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-kube-api-access-54bx7\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.184626 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.285787 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 
19:53:42.285887 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-log-httpd\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.285913 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54bx7\" (UniqueName: \"kubernetes.io/projected/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-kube-api-access-54bx7\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.285945 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.285994 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-run-httpd\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.286033 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-scripts\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.286052 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-config-data\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.286551 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-log-httpd\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.286573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-run-httpd\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.292781 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-config-data\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.295669 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.297377 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-scripts\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.309836 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54bx7\" (UniqueName: \"kubernetes.io/projected/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-kube-api-access-54bx7\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.323492 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.418650 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:42 crc kubenswrapper[4885]: I0308 19:53:42.882277 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:43 crc kubenswrapper[4885]: I0308 19:53:43.382508 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="624830da-2b73-4843-bb04-6db9c1a7b281" path="/var/lib/kubelet/pods/624830da-2b73-4843-bb04-6db9c1a7b281/volumes" Mar 08 19:53:43 crc kubenswrapper[4885]: I0308 19:53:43.701333 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerStarted","Data":"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843"} Mar 08 19:53:43 crc kubenswrapper[4885]: I0308 19:53:43.701631 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerStarted","Data":"ff053af0de43567bfe97eb303e2667a54755259da1bbe8ed4301748389714730"} Mar 08 19:53:44 crc kubenswrapper[4885]: I0308 19:53:44.711189 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerStarted","Data":"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78"} Mar 08 19:53:47 crc kubenswrapper[4885]: I0308 19:53:47.629007 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:47 crc kubenswrapper[4885]: I0308 19:53:47.637209 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:53:48 crc kubenswrapper[4885]: I0308 19:53:48.392026 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:51 crc kubenswrapper[4885]: I0308 19:53:51.810753 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerStarted","Data":"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857"} Mar 08 19:53:51 crc kubenswrapper[4885]: I0308 19:53:51.828125 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84","Type":"ContainerStarted","Data":"6670e6817995526cd80a6c1b2064f3af999a3d367e59a87b40d4c34b2c61c6e3"} Mar 08 19:53:52 crc kubenswrapper[4885]: I0308 
19:53:52.146389 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:53:52 crc kubenswrapper[4885]: I0308 19:53:52.165347 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.065946552 podStartE2EDuration="14.165326604s" podCreationTimestamp="2026-03-08 19:53:38 +0000 UTC" firstStartedPulling="2026-03-08 19:53:39.677437793 +0000 UTC m=+1321.073491816" lastFinishedPulling="2026-03-08 19:53:50.776817845 +0000 UTC m=+1332.172871868" observedRunningTime="2026-03-08 19:53:51.849372916 +0000 UTC m=+1333.245426939" watchObservedRunningTime="2026-03-08 19:53:52.165326604 +0000 UTC m=+1333.561380627" Mar 08 19:53:52 crc kubenswrapper[4885]: I0308 19:53:52.216065 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bbc5d6644-tztss"] Mar 08 19:53:52 crc kubenswrapper[4885]: I0308 19:53:52.216291 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bbc5d6644-tztss" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-api" containerID="cri-o://5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b" gracePeriod=30 Mar 08 19:53:52 crc kubenswrapper[4885]: I0308 19:53:52.216401 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6bbc5d6644-tztss" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-httpd" containerID="cri-o://47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab" gracePeriod=30 Mar 08 19:53:52 crc kubenswrapper[4885]: I0308 19:53:52.837863 4885 generic.go:334] "Generic (PLEG): container finished" podID="de714834-e155-41c1-83fc-a050203bde75" containerID="47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab" exitCode=0 Mar 08 19:53:52 crc kubenswrapper[4885]: I0308 19:53:52.838409 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbc5d6644-tztss" event={"ID":"de714834-e155-41c1-83fc-a050203bde75","Type":"ContainerDied","Data":"47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab"} Mar 08 19:53:53 crc kubenswrapper[4885]: I0308 19:53:53.846281 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerStarted","Data":"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70"} Mar 08 19:53:53 crc kubenswrapper[4885]: I0308 19:53:53.846583 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="sg-core" containerID="cri-o://75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" gracePeriod=30 Mar 08 19:53:53 crc kubenswrapper[4885]: I0308 19:53:53.846599 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 19:53:53 crc kubenswrapper[4885]: I0308 19:53:53.846540 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="proxy-httpd" containerID="cri-o://e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" gracePeriod=30 Mar 08 19:53:53 crc kubenswrapper[4885]: I0308 19:53:53.846692 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" 
containerName="ceilometer-notification-agent" containerID="cri-o://4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" gracePeriod=30 Mar 08 19:53:53 crc kubenswrapper[4885]: I0308 19:53:53.846688 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="ceilometer-central-agent" containerID="cri-o://ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" gracePeriod=30 Mar 08 19:53:53 crc kubenswrapper[4885]: I0308 19:53:53.867366 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.028266946 podStartE2EDuration="11.867350117s" podCreationTimestamp="2026-03-08 19:53:42 +0000 UTC" firstStartedPulling="2026-03-08 19:53:42.895086895 +0000 UTC m=+1324.291140918" lastFinishedPulling="2026-03-08 19:53:52.734170066 +0000 UTC m=+1334.130224089" observedRunningTime="2026-03-08 19:53:53.864302446 +0000 UTC m=+1335.260356469" watchObservedRunningTime="2026-03-08 19:53:53.867350117 +0000 UTC m=+1335.263404130" Mar 08 19:53:53 crc kubenswrapper[4885]: E0308 19:53:53.887580 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f9ffb05_57d2_4576_ac6a_a1d29fd8cfc2.slice/crio-75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.755734 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855486 4885 generic.go:334] "Generic (PLEG): container finished" podID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerID="e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" exitCode=0 Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855518 4885 generic.go:334] "Generic (PLEG): container finished" podID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerID="75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" exitCode=2 Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855526 4885 generic.go:334] "Generic (PLEG): container finished" podID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerID="4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" exitCode=0 Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855533 4885 generic.go:334] "Generic (PLEG): container finished" podID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerID="ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" exitCode=0 Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855549 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerDied","Data":"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70"} Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855552 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855572 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerDied","Data":"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857"} Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855582 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerDied","Data":"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78"} Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855590 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerDied","Data":"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843"} Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855598 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2","Type":"ContainerDied","Data":"ff053af0de43567bfe97eb303e2667a54755259da1bbe8ed4301748389714730"} Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.855611 4885 scope.go:117] "RemoveContainer" containerID="e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.877987 4885 scope.go:117] "RemoveContainer" containerID="75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.904584 4885 scope.go:117] "RemoveContainer" containerID="4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.924698 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-scripts\") pod \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.924744 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54bx7\" (UniqueName: \"kubernetes.io/projected/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-kube-api-access-54bx7\") pod \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.924779 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-combined-ca-bundle\") pod \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.924818 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-run-httpd\") pod \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.924866 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-config-data\") pod \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " Mar 08 19:53:54 crc 
kubenswrapper[4885]: I0308 19:53:54.925015 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-sg-core-conf-yaml\") pod \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.925138 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-log-httpd\") pod \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\" (UID: \"6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2\") " Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.925739 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" (UID: "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.926137 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" (UID: "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.928198 4885 scope.go:117] "RemoveContainer" containerID="ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.930394 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-scripts" (OuterVolumeSpecName: "scripts") pod "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" (UID: "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.935232 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-kube-api-access-54bx7" (OuterVolumeSpecName: "kube-api-access-54bx7") pod "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" (UID: "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2"). InnerVolumeSpecName "kube-api-access-54bx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.957691 4885 scope.go:117] "RemoveContainer" containerID="e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" Mar 08 19:53:54 crc kubenswrapper[4885]: E0308 19:53:54.958138 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": container with ID starting with e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70 not found: ID does not exist" containerID="e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.958180 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70"} err="failed to get container status \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": rpc error: code = NotFound desc = could not find container \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": container with ID starting with e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.958201 4885 scope.go:117] "RemoveContainer" containerID="75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" Mar 08 19:53:54 crc kubenswrapper[4885]: E0308 19:53:54.958804 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": container with ID starting with 75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857 not found: ID does not exist" containerID="75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.958832 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857"} err="failed to get container status \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": rpc error: code = NotFound desc = could not find container \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": container with ID starting with 75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.958850 4885 scope.go:117] "RemoveContainer" containerID="4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" Mar 08 19:53:54 crc kubenswrapper[4885]: E0308 19:53:54.964141 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": container with ID starting with 4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78 not found: ID does not exist" containerID="4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.964174 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78"} err="failed to get container status \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": rpc error: code = NotFound desc = could not 
find container \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": container with ID starting with 4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.964193 4885 scope.go:117] "RemoveContainer" containerID="ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" Mar 08 19:53:54 crc kubenswrapper[4885]: E0308 19:53:54.964565 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": container with ID starting with ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843 not found: ID does not exist" containerID="ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.964594 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843"} err="failed to get container status \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": rpc error: code = NotFound desc = could not find container \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": container with ID starting with ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.964612 4885 scope.go:117] "RemoveContainer" containerID="e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.965389 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70"} err="failed to get container status \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": rpc error: code = NotFound desc = could not find container \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": container with ID starting with e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.965466 4885 scope.go:117] "RemoveContainer" containerID="75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.965669 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" (UID: "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.966558 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857"} err="failed to get container status \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": rpc error: code = NotFound desc = could not find container \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": container with ID starting with 75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.966587 4885 scope.go:117] "RemoveContainer" containerID="4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.967127 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78"} err="failed to get container status \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": rpc error: code = NotFound desc = could not find container \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": container with ID starting with 4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.967152 4885 scope.go:117] "RemoveContainer" containerID="ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.967521 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843"} err="failed to get container status \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": rpc error: code = NotFound desc = could not find container \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": container with ID starting with ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.967548 4885 scope.go:117] "RemoveContainer" containerID="e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.967850 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70"} err="failed to get container status \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": rpc error: code = NotFound desc = could not find container \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": container with ID starting with e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.967876 4885 scope.go:117] "RemoveContainer" containerID="75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.968180 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857"} err="failed to get container status \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": rpc error: code = NotFound desc = could not find container 
\"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": container with ID starting with 75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.968207 4885 scope.go:117] "RemoveContainer" containerID="4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.968440 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78"} err="failed to get container status \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": rpc error: code = NotFound desc = could not find container \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": container with ID starting with 4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.968464 4885 scope.go:117] "RemoveContainer" containerID="ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.968883 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843"} err="failed to get container status \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": rpc error: code = NotFound desc = could not find container \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": container with ID starting with ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.968906 4885 scope.go:117] "RemoveContainer" containerID="e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.969207 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70"} err="failed to get container status \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": rpc error: code = NotFound desc = could not find container \"e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70\": container with ID starting with e0696326932c2ba21e1b707e26ff4f192075bdd30797a72cf4aada81aa336c70 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.969299 4885 scope.go:117] "RemoveContainer" containerID="75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.969806 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857"} err="failed to get container status \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": rpc error: code = NotFound desc = could not find container \"75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857\": container with ID starting with 75da5552c3fc18f96e9868a50c3bb879a3b99bc84aa9490cd6ae324a71a89857 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.969834 4885 scope.go:117] "RemoveContainer" containerID="4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.970379 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78"} err="failed to get container status \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": rpc error: code = NotFound desc = could not find container \"4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78\": container with ID starting with 4518cf402face0e09e4c850db1ba14a8192616876157afd04653f59a9c1f9f78 not found: ID does not exist" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.970438 4885 scope.go:117] "RemoveContainer" containerID="ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843" Mar 08 19:53:54 crc kubenswrapper[4885]: I0308 19:53:54.970793 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843"} err="failed to get container status \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": rpc error: code = NotFound desc = could not find container \"ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843\": container with ID starting with ac5802adadd79d053a98f578ac634ce485af7377ed1b0a909cc7ae8894f24843 not found: ID does not exist" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.015138 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" (UID: "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.027291 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.027323 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.027333 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54bx7\" (UniqueName: \"kubernetes.io/projected/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-kube-api-access-54bx7\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.027345 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.027353 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.027363 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.040075 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-config-data" 
(OuterVolumeSpecName: "config-data") pod "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" (UID: "6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.128659 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.185715 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.192907 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.210314 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:55 crc kubenswrapper[4885]: E0308 19:53:55.210761 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="proxy-httpd" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.210783 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="proxy-httpd" Mar 08 19:53:55 crc kubenswrapper[4885]: E0308 19:53:55.210807 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="ceilometer-notification-agent" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.210815 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="ceilometer-notification-agent" Mar 08 19:53:55 crc kubenswrapper[4885]: E0308 19:53:55.210829 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="ceilometer-central-agent" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.210838 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="ceilometer-central-agent" Mar 08 19:53:55 crc kubenswrapper[4885]: E0308 19:53:55.210856 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="sg-core" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.210862 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="sg-core" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.211074 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="ceilometer-central-agent" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.211099 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="sg-core" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.211109 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="ceilometer-notification-agent" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.211124 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" containerName="proxy-httpd" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.212982 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.217304 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.217320 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.225382 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.332321 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.332644 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-config-data\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.332836 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.332985 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-log-httpd\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.333150 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-scripts\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.333192 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-run-httpd\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.333271 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72jnk\" (UniqueName: \"kubernetes.io/projected/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-kube-api-access-72jnk\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.382618 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2" path="/var/lib/kubelet/pods/6f9ffb05-57d2-4576-ac6a-a1d29fd8cfc2/volumes" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435053 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-log-httpd\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435124 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-scripts\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435142 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-run-httpd\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435710 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72jnk\" (UniqueName: \"kubernetes.io/projected/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-kube-api-access-72jnk\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-run-httpd\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435767 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-log-httpd\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435829 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435870 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-config-data\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.435952 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.440881 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.441123 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-scripts\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.441834 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-config-data\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.442582 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.465772 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72jnk\" (UniqueName: \"kubernetes.io/projected/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-kube-api-access-72jnk\") pod \"ceilometer-0\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.485934 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-b5mql"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.486951 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.501494 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-b5mql"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.575532 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.607639 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-5fh2h"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.609148 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.638407 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5fh2h"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.650502 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sqzl\" (UniqueName: \"kubernetes.io/projected/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-kube-api-access-2sqzl\") pod \"nova-api-db-create-b5mql\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.650566 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-operator-scripts\") pod \"nova-api-db-create-b5mql\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.697863 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-fdab-account-create-update-vs9sz"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.699298 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.701274 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.705714 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fdab-account-create-update-vs9sz"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.752227 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sqzl\" (UniqueName: \"kubernetes.io/projected/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-kube-api-access-2sqzl\") pod \"nova-api-db-create-b5mql\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.752279 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcwfw\" (UniqueName: \"kubernetes.io/projected/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-kube-api-access-qcwfw\") pod \"nova-cell0-db-create-5fh2h\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.752316 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-operator-scripts\") pod \"nova-api-db-create-b5mql\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.752466 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-operator-scripts\") pod \"nova-cell0-db-create-5fh2h\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.753026 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-operator-scripts\") pod \"nova-api-db-create-b5mql\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.780521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sqzl\" (UniqueName: \"kubernetes.io/projected/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-kube-api-access-2sqzl\") pod \"nova-api-db-create-b5mql\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.801508 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-hjcwx"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.802933 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.808697 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hjcwx"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.854205 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plmtf\" (UniqueName: \"kubernetes.io/projected/92191eaa-0c0a-4927-adf4-a4e386ed2552-kube-api-access-plmtf\") pod \"nova-api-fdab-account-create-update-vs9sz\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.854310 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcwfw\" (UniqueName: \"kubernetes.io/projected/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-kube-api-access-qcwfw\") pod \"nova-cell0-db-create-5fh2h\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.854346 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92191eaa-0c0a-4927-adf4-a4e386ed2552-operator-scripts\") pod \"nova-api-fdab-account-create-update-vs9sz\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.854430 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-operator-scripts\") pod \"nova-cell0-db-create-5fh2h\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.855106 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-operator-scripts\") pod \"nova-cell0-db-create-5fh2h\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.866990 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.888668 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcwfw\" (UniqueName: \"kubernetes.io/projected/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-kube-api-access-qcwfw\") pod \"nova-cell0-db-create-5fh2h\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.909608 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-qx2kg"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.910789 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.913413 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.935215 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-qx2kg"] Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.945340 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.959104 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-operator-scripts\") pod \"nova-cell1-db-create-hjcwx\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.959173 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92191eaa-0c0a-4927-adf4-a4e386ed2552-operator-scripts\") pod \"nova-api-fdab-account-create-update-vs9sz\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.959253 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4g7t\" (UniqueName: \"kubernetes.io/projected/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-kube-api-access-g4g7t\") pod \"nova-cell1-db-create-hjcwx\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.959280 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plmtf\" (UniqueName: \"kubernetes.io/projected/92191eaa-0c0a-4927-adf4-a4e386ed2552-kube-api-access-plmtf\") pod \"nova-api-fdab-account-create-update-vs9sz\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.960256 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92191eaa-0c0a-4927-adf4-a4e386ed2552-operator-scripts\") pod \"nova-api-fdab-account-create-update-vs9sz\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:55 crc kubenswrapper[4885]: I0308 19:53:55.976855 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plmtf\" (UniqueName: \"kubernetes.io/projected/92191eaa-0c0a-4927-adf4-a4e386ed2552-kube-api-access-plmtf\") pod \"nova-api-fdab-account-create-update-vs9sz\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.035935 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.063078 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4g7t\" (UniqueName: \"kubernetes.io/projected/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-kube-api-access-g4g7t\") pod \"nova-cell1-db-create-hjcwx\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.063451 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-operator-scripts\") pod \"nova-cell1-db-create-hjcwx\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.063530 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfljk\" (UniqueName: \"kubernetes.io/projected/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-kube-api-access-dfljk\") pod \"nova-cell0-1b03-account-create-update-qx2kg\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.063563 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-operator-scripts\") pod \"nova-cell0-1b03-account-create-update-qx2kg\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.064587 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-operator-scripts\") pod \"nova-cell1-db-create-hjcwx\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.083809 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4g7t\" (UniqueName: \"kubernetes.io/projected/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-kube-api-access-g4g7t\") pod \"nova-cell1-db-create-hjcwx\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.092485 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.108069 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-4khmp"] Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.109260 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.111975 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.120220 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.123215 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-4khmp"] Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.165714 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfljk\" (UniqueName: \"kubernetes.io/projected/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-kube-api-access-dfljk\") pod \"nova-cell0-1b03-account-create-update-qx2kg\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.165766 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-operator-scripts\") pod \"nova-cell0-1b03-account-create-update-qx2kg\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.166439 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-operator-scripts\") pod \"nova-cell0-1b03-account-create-update-qx2kg\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.184608 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfljk\" (UniqueName: \"kubernetes.io/projected/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-kube-api-access-dfljk\") pod \"nova-cell0-1b03-account-create-update-qx2kg\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.230844 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.267171 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11e75774-c86c-459a-9c66-eaf3c43addac-operator-scripts\") pod \"nova-cell1-cd3f-account-create-update-4khmp\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.267232 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht8t7\" (UniqueName: \"kubernetes.io/projected/11e75774-c86c-459a-9c66-eaf3c43addac-kube-api-access-ht8t7\") pod \"nova-cell1-cd3f-account-create-update-4khmp\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.368679 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11e75774-c86c-459a-9c66-eaf3c43addac-operator-scripts\") pod \"nova-cell1-cd3f-account-create-update-4khmp\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.368736 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ht8t7\" (UniqueName: \"kubernetes.io/projected/11e75774-c86c-459a-9c66-eaf3c43addac-kube-api-access-ht8t7\") pod \"nova-cell1-cd3f-account-create-update-4khmp\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.369607 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11e75774-c86c-459a-9c66-eaf3c43addac-operator-scripts\") pod \"nova-cell1-cd3f-account-create-update-4khmp\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.385306 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht8t7\" (UniqueName: \"kubernetes.io/projected/11e75774-c86c-459a-9c66-eaf3c43addac-kube-api-access-ht8t7\") pod \"nova-cell1-cd3f-account-create-update-4khmp\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.479638 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-b5mql"] Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.498363 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.599441 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-5fh2h"] Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.646379 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fdab-account-create-update-vs9sz"] Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.664736 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-hjcwx"] Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.921131 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hjcwx" event={"ID":"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c","Type":"ContainerStarted","Data":"c3c38b3a4b419b05af059ea5f575241ba3087897fe3f917341a136955ab80346"} Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.922535 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5fh2h" event={"ID":"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7","Type":"ContainerStarted","Data":"4eb69137cae07743eb1af1f6321df214566d9dbcbf6527c970e6fc1975ea80cf"} Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.923798 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fdab-account-create-update-vs9sz" event={"ID":"92191eaa-0c0a-4927-adf4-a4e386ed2552","Type":"ContainerStarted","Data":"3b62b092a692234fe276d2cdd5a40b55370dc15612fb51133501de3a4a2fb489"} Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.926389 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerStarted","Data":"34e707d299017c5d1c8910ad8b86e1920b5d7996a4afa2abaf7f5e2cb4124ea4"} Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.929028 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-b5mql" event={"ID":"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3","Type":"ContainerStarted","Data":"4cffd27a9b7724e448f78dd5d8fc02f0f0058f5575262e988c93602105c6d597"} Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.929052 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-b5mql" event={"ID":"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3","Type":"ContainerStarted","Data":"06c6027e971cc55099bbd434c3d7f4bf09a5b78fbf9c1a23eea867d1aa75409f"} Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.945347 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-b5mql" podStartSLOduration=1.9453322229999999 podStartE2EDuration="1.945332223s" podCreationTimestamp="2026-03-08 19:53:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:53:56.943674119 +0000 UTC m=+1338.339728142" watchObservedRunningTime="2026-03-08 19:53:56.945332223 +0000 UTC m=+1338.341386246" Mar 08 19:53:56 crc kubenswrapper[4885]: I0308 19:53:56.983388 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-qx2kg"] Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.144220 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-4khmp"] Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.454353 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.611821 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-ovndb-tls-certs\") pod \"de714834-e155-41c1-83fc-a050203bde75\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.611903 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-config\") pod \"de714834-e155-41c1-83fc-a050203bde75\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.611985 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-combined-ca-bundle\") pod \"de714834-e155-41c1-83fc-a050203bde75\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.612013 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfv9t\" (UniqueName: \"kubernetes.io/projected/de714834-e155-41c1-83fc-a050203bde75-kube-api-access-xfv9t\") pod \"de714834-e155-41c1-83fc-a050203bde75\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.612052 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-httpd-config\") pod \"de714834-e155-41c1-83fc-a050203bde75\" (UID: \"de714834-e155-41c1-83fc-a050203bde75\") " Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.616425 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "de714834-e155-41c1-83fc-a050203bde75" (UID: "de714834-e155-41c1-83fc-a050203bde75"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.616492 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de714834-e155-41c1-83fc-a050203bde75-kube-api-access-xfv9t" (OuterVolumeSpecName: "kube-api-access-xfv9t") pod "de714834-e155-41c1-83fc-a050203bde75" (UID: "de714834-e155-41c1-83fc-a050203bde75"). InnerVolumeSpecName "kube-api-access-xfv9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.692683 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de714834-e155-41c1-83fc-a050203bde75" (UID: "de714834-e155-41c1-83fc-a050203bde75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.694054 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-config" (OuterVolumeSpecName: "config") pod "de714834-e155-41c1-83fc-a050203bde75" (UID: "de714834-e155-41c1-83fc-a050203bde75"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.713998 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.714244 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfv9t\" (UniqueName: \"kubernetes.io/projected/de714834-e155-41c1-83fc-a050203bde75-kube-api-access-xfv9t\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.714331 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.714398 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.734610 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "de714834-e155-41c1-83fc-a050203bde75" (UID: "de714834-e155-41c1-83fc-a050203bde75"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.816253 4885 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de714834-e155-41c1-83fc-a050203bde75-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.938211 4885 generic.go:334] "Generic (PLEG): container finished" podID="8ed43ce9-6f70-49b2-aa6e-50917ea9ca41" containerID="83c825c6a12d2141eb0dfe1368babc2f8bfb90700bef146c412cb41b76f028b3" exitCode=0 Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.938280 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" event={"ID":"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41","Type":"ContainerDied","Data":"83c825c6a12d2141eb0dfe1368babc2f8bfb90700bef146c412cb41b76f028b3"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.938305 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" event={"ID":"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41","Type":"ContainerStarted","Data":"76f5bddc0468d531085855eaa891e527dd627f81f438aa98c14e0fe69b754f55"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.939369 4885 generic.go:334] "Generic (PLEG): container finished" podID="92191eaa-0c0a-4927-adf4-a4e386ed2552" containerID="9585f2e0b3d9045954e289a5ad0191eb4ab1e2632be8da00e467a511a692dd4f" exitCode=0 Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.939446 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fdab-account-create-update-vs9sz" event={"ID":"92191eaa-0c0a-4927-adf4-a4e386ed2552","Type":"ContainerDied","Data":"9585f2e0b3d9045954e289a5ad0191eb4ab1e2632be8da00e467a511a692dd4f"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.941570 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerStarted","Data":"7cf70af4753fbcc177c169967bfd0633e149f5d98df36cb2d6ff676d0a215e21"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.941617 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerStarted","Data":"db7de2bfb2402bc7c35eeb0e3a0a80a212c00dd48e7b95c320538d854040bceb"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.943396 4885 generic.go:334] "Generic (PLEG): container finished" podID="7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c" containerID="ce47c98e58f66c2a55840d70bb55bfd25b6d54a9fa04407857b7919987c1acd6" exitCode=0 Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.943454 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hjcwx" event={"ID":"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c","Type":"ContainerDied","Data":"ce47c98e58f66c2a55840d70bb55bfd25b6d54a9fa04407857b7919987c1acd6"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.945026 4885 generic.go:334] "Generic (PLEG): container finished" podID="15d49c2d-56cf-46e9-b0e9-c5aac516fdf7" containerID="eb80fb2a1922a32d725b4ee5e3cc391924d843e1dfc770a23f4293be00620e5f" exitCode=0 Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.945186 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5fh2h" event={"ID":"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7","Type":"ContainerDied","Data":"eb80fb2a1922a32d725b4ee5e3cc391924d843e1dfc770a23f4293be00620e5f"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.946837 4885 generic.go:334] "Generic (PLEG): container finished" podID="11e75774-c86c-459a-9c66-eaf3c43addac" containerID="0fd5040bc376c8f684c8ba84911a21e03723dd7d09ccc7b3d5b40d2f11712a3d" exitCode=0 Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.946972 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" event={"ID":"11e75774-c86c-459a-9c66-eaf3c43addac","Type":"ContainerDied","Data":"0fd5040bc376c8f684c8ba84911a21e03723dd7d09ccc7b3d5b40d2f11712a3d"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.947089 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" event={"ID":"11e75774-c86c-459a-9c66-eaf3c43addac","Type":"ContainerStarted","Data":"8babcab44df031a7dca85bbd1e1b25d866660a24b8048961074ce85533c9f142"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.948381 4885 generic.go:334] "Generic (PLEG): container finished" podID="e437f837-ac56-4b1a-b7ec-7a22cf98c8b3" containerID="4cffd27a9b7724e448f78dd5d8fc02f0f0058f5575262e988c93602105c6d597" exitCode=0 Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.948440 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-b5mql" event={"ID":"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3","Type":"ContainerDied","Data":"4cffd27a9b7724e448f78dd5d8fc02f0f0058f5575262e988c93602105c6d597"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.950436 4885 generic.go:334] "Generic (PLEG): container finished" podID="de714834-e155-41c1-83fc-a050203bde75" containerID="5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b" exitCode=0 Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.950468 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbc5d6644-tztss" 
event={"ID":"de714834-e155-41c1-83fc-a050203bde75","Type":"ContainerDied","Data":"5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.950487 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6bbc5d6644-tztss" event={"ID":"de714834-e155-41c1-83fc-a050203bde75","Type":"ContainerDied","Data":"ce2e04e0c937557f90cd3e8f47d07607bc0f9d0c6eb93b3ee180bc8da569c97b"} Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.950507 4885 scope.go:117] "RemoveContainer" containerID="47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab" Mar 08 19:53:57 crc kubenswrapper[4885]: I0308 19:53:57.950619 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6bbc5d6644-tztss" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.041495 4885 scope.go:117] "RemoveContainer" containerID="5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.068769 4885 scope.go:117] "RemoveContainer" containerID="47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab" Mar 08 19:53:58 crc kubenswrapper[4885]: E0308 19:53:58.069903 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab\": container with ID starting with 47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab not found: ID does not exist" containerID="47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.069950 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab"} err="failed to get container status \"47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab\": rpc error: code = NotFound desc = could not find container \"47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab\": container with ID starting with 47289b7f9941e9739271de9b35e86ea5b915a23111bd2277ded257960341e4ab not found: ID does not exist" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.069967 4885 scope.go:117] "RemoveContainer" containerID="5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b" Mar 08 19:53:58 crc kubenswrapper[4885]: E0308 19:53:58.072348 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b\": container with ID starting with 5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b not found: ID does not exist" containerID="5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.072385 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b"} err="failed to get container status \"5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b\": rpc error: code = NotFound desc = could not find container \"5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b\": container with ID starting with 5a2a9aafc8c47bff90e778ca3136c8041cd4badcd91afbb829892104a4aa701b not found: ID does not exist" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.086220 4885 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6bbc5d6644-tztss"] Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.093487 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6bbc5d6644-tztss"] Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.723861 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.765402 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.845648 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b5685698-p87pb"] Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.845896 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6b5685698-p87pb" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-log" containerID="cri-o://284122f7c1790bf6573097e7966743c78c5e2016a00bf0c85b35b46fb79ec3ac" gracePeriod=30 Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.846009 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6b5685698-p87pb" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-api" containerID="cri-o://4b8fc7ef37c75cd3a7a88b7cc2d7710779d5554badc6896eed06c03d3625b81a" gracePeriod=30 Mar 08 19:53:58 crc kubenswrapper[4885]: I0308 19:53:58.993889 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerStarted","Data":"30e722bde831d03eed4bc7a2ac2c7c561a897a0bc5aee76137806f8c867c31e9"} Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.383687 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de714834-e155-41c1-83fc-a050203bde75" path="/var/lib/kubelet/pods/de714834-e155-41c1-83fc-a050203bde75/volumes" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.399033 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.549419 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ht8t7\" (UniqueName: \"kubernetes.io/projected/11e75774-c86c-459a-9c66-eaf3c43addac-kube-api-access-ht8t7\") pod \"11e75774-c86c-459a-9c66-eaf3c43addac\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.549855 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11e75774-c86c-459a-9c66-eaf3c43addac-operator-scripts\") pod \"11e75774-c86c-459a-9c66-eaf3c43addac\" (UID: \"11e75774-c86c-459a-9c66-eaf3c43addac\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.550214 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e75774-c86c-459a-9c66-eaf3c43addac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "11e75774-c86c-459a-9c66-eaf3c43addac" (UID: "11e75774-c86c-459a-9c66-eaf3c43addac"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.552113 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11e75774-c86c-459a-9c66-eaf3c43addac-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.559144 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e75774-c86c-459a-9c66-eaf3c43addac-kube-api-access-ht8t7" (OuterVolumeSpecName: "kube-api-access-ht8t7") pod "11e75774-c86c-459a-9c66-eaf3c43addac" (UID: "11e75774-c86c-459a-9c66-eaf3c43addac"). InnerVolumeSpecName "kube-api-access-ht8t7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.656115 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ht8t7\" (UniqueName: \"kubernetes.io/projected/11e75774-c86c-459a-9c66-eaf3c43addac-kube-api-access-ht8t7\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.711199 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-b5mql" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.724034 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.733894 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.743520 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.840623 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870533 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcwfw\" (UniqueName: \"kubernetes.io/projected/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-kube-api-access-qcwfw\") pod \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870610 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-operator-scripts\") pod \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870728 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-operator-scripts\") pod \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870756 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-operator-scripts\") pod \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\" (UID: \"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870806 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4g7t\" (UniqueName: \"kubernetes.io/projected/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-kube-api-access-g4g7t\") pod \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\" (UID: \"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870880 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-operator-scripts\") pod \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870912 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sqzl\" (UniqueName: \"kubernetes.io/projected/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-kube-api-access-2sqzl\") pod \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\" (UID: \"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.870983 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfljk\" (UniqueName: \"kubernetes.io/projected/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-kube-api-access-dfljk\") pod \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\" (UID: \"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.873319 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e437f837-ac56-4b1a-b7ec-7a22cf98c8b3" (UID: "e437f837-ac56-4b1a-b7ec-7a22cf98c8b3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.873324 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "15d49c2d-56cf-46e9-b0e9-c5aac516fdf7" (UID: "15d49c2d-56cf-46e9-b0e9-c5aac516fdf7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.873639 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c" (UID: "7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.873942 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ed43ce9-6f70-49b2-aa6e-50917ea9ca41" (UID: "8ed43ce9-6f70-49b2-aa6e-50917ea9ca41"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.881445 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-kube-api-access-g4g7t" (OuterVolumeSpecName: "kube-api-access-g4g7t") pod "7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c" (UID: "7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c"). InnerVolumeSpecName "kube-api-access-g4g7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.881856 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-kube-api-access-qcwfw" (OuterVolumeSpecName: "kube-api-access-qcwfw") pod "15d49c2d-56cf-46e9-b0e9-c5aac516fdf7" (UID: "15d49c2d-56cf-46e9-b0e9-c5aac516fdf7"). InnerVolumeSpecName "kube-api-access-qcwfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.890645 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-kube-api-access-dfljk" (OuterVolumeSpecName: "kube-api-access-dfljk") pod "8ed43ce9-6f70-49b2-aa6e-50917ea9ca41" (UID: "8ed43ce9-6f70-49b2-aa6e-50917ea9ca41"). InnerVolumeSpecName "kube-api-access-dfljk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.891147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-kube-api-access-2sqzl" (OuterVolumeSpecName: "kube-api-access-2sqzl") pod "e437f837-ac56-4b1a-b7ec-7a22cf98c8b3" (UID: "e437f837-ac56-4b1a-b7ec-7a22cf98c8b3"). InnerVolumeSpecName "kube-api-access-2sqzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.973155 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plmtf\" (UniqueName: \"kubernetes.io/projected/92191eaa-0c0a-4927-adf4-a4e386ed2552-kube-api-access-plmtf\") pod \"92191eaa-0c0a-4927-adf4-a4e386ed2552\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.974164 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92191eaa-0c0a-4927-adf4-a4e386ed2552-operator-scripts\") pod \"92191eaa-0c0a-4927-adf4-a4e386ed2552\" (UID: \"92191eaa-0c0a-4927-adf4-a4e386ed2552\") " Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.976002 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.976027 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.976039 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4g7t\" (UniqueName: \"kubernetes.io/projected/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-kube-api-access-g4g7t\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.976050 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.976058 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sqzl\" (UniqueName: \"kubernetes.io/projected/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3-kube-api-access-2sqzl\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.976067 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfljk\" (UniqueName: \"kubernetes.io/projected/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41-kube-api-access-dfljk\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.976075 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcwfw\" (UniqueName: \"kubernetes.io/projected/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7-kube-api-access-qcwfw\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.976104 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.978187 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92191eaa-0c0a-4927-adf4-a4e386ed2552-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92191eaa-0c0a-4927-adf4-a4e386ed2552" (UID: "92191eaa-0c0a-4927-adf4-a4e386ed2552"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:53:59 crc kubenswrapper[4885]: I0308 19:53:59.981019 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92191eaa-0c0a-4927-adf4-a4e386ed2552-kube-api-access-plmtf" (OuterVolumeSpecName: "kube-api-access-plmtf") pod "92191eaa-0c0a-4927-adf4-a4e386ed2552" (UID: "92191eaa-0c0a-4927-adf4-a4e386ed2552"). InnerVolumeSpecName "kube-api-access-plmtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.009143 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-5fh2h" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.009142 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-5fh2h" event={"ID":"15d49c2d-56cf-46e9-b0e9-c5aac516fdf7","Type":"ContainerDied","Data":"4eb69137cae07743eb1af1f6321df214566d9dbcbf6527c970e6fc1975ea80cf"} Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.009345 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4eb69137cae07743eb1af1f6321df214566d9dbcbf6527c970e6fc1975ea80cf" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.010908 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.010938 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cd3f-account-create-update-4khmp" event={"ID":"11e75774-c86c-459a-9c66-eaf3c43addac","Type":"ContainerDied","Data":"8babcab44df031a7dca85bbd1e1b25d866660a24b8048961074ce85533c9f142"} Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.010977 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8babcab44df031a7dca85bbd1e1b25d866660a24b8048961074ce85533c9f142" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.013162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-b5mql" event={"ID":"e437f837-ac56-4b1a-b7ec-7a22cf98c8b3","Type":"ContainerDied","Data":"06c6027e971cc55099bbd434c3d7f4bf09a5b78fbf9c1a23eea867d1aa75409f"} Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.013188 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06c6027e971cc55099bbd434c3d7f4bf09a5b78fbf9c1a23eea867d1aa75409f" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.013196 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-b5mql" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.017465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" event={"ID":"8ed43ce9-6f70-49b2-aa6e-50917ea9ca41","Type":"ContainerDied","Data":"76f5bddc0468d531085855eaa891e527dd627f81f438aa98c14e0fe69b754f55"} Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.017492 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76f5bddc0468d531085855eaa891e527dd627f81f438aa98c14e0fe69b754f55" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.017568 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-qx2kg" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.029694 4885 generic.go:334] "Generic (PLEG): container finished" podID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerID="284122f7c1790bf6573097e7966743c78c5e2016a00bf0c85b35b46fb79ec3ac" exitCode=143 Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.029820 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5685698-p87pb" event={"ID":"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1","Type":"ContainerDied","Data":"284122f7c1790bf6573097e7966743c78c5e2016a00bf0c85b35b46fb79ec3ac"} Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.037455 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fdab-account-create-update-vs9sz" event={"ID":"92191eaa-0c0a-4927-adf4-a4e386ed2552","Type":"ContainerDied","Data":"3b62b092a692234fe276d2cdd5a40b55370dc15612fb51133501de3a4a2fb489"} Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.037511 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b62b092a692234fe276d2cdd5a40b55370dc15612fb51133501de3a4a2fb489" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.037578 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fdab-account-create-update-vs9sz" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.039555 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-hjcwx" event={"ID":"7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c","Type":"ContainerDied","Data":"c3c38b3a4b419b05af059ea5f575241ba3087897fe3f917341a136955ab80346"} Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.039579 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3c38b3a4b419b05af059ea5f575241ba3087897fe3f917341a136955ab80346" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.039623 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-hjcwx" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.077711 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92191eaa-0c0a-4927-adf4-a4e386ed2552-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.077785 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plmtf\" (UniqueName: \"kubernetes.io/projected/92191eaa-0c0a-4927-adf4-a4e386ed2552-kube-api-access-plmtf\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135102 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549994-6zc6p"] Mar 08 19:54:00 crc kubenswrapper[4885]: E0308 19:54:00.135461 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-httpd" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135479 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-httpd" Mar 08 19:54:00 crc kubenswrapper[4885]: E0308 19:54:00.135489 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15d49c2d-56cf-46e9-b0e9-c5aac516fdf7" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135496 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="15d49c2d-56cf-46e9-b0e9-c5aac516fdf7" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: E0308 19:54:00.135508 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92191eaa-0c0a-4927-adf4-a4e386ed2552" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135514 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="92191eaa-0c0a-4927-adf4-a4e386ed2552" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: E0308 19:54:00.135532 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135540 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: E0308 19:54:00.135549 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e437f837-ac56-4b1a-b7ec-7a22cf98c8b3" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135554 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e437f837-ac56-4b1a-b7ec-7a22cf98c8b3" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: E0308 19:54:00.135571 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e75774-c86c-459a-9c66-eaf3c43addac" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135577 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e75774-c86c-459a-9c66-eaf3c43addac" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: E0308 19:54:00.135589 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ed43ce9-6f70-49b2-aa6e-50917ea9ca41" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 
19:54:00.135595 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed43ce9-6f70-49b2-aa6e-50917ea9ca41" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: E0308 19:54:00.135607 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-api" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135613 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-api" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135761 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135774 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="92191eaa-0c0a-4927-adf4-a4e386ed2552" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135785 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e437f837-ac56-4b1a-b7ec-7a22cf98c8b3" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135797 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="11e75774-c86c-459a-9c66-eaf3c43addac" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135807 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ed43ce9-6f70-49b2-aa6e-50917ea9ca41" containerName="mariadb-account-create-update" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135817 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="15d49c2d-56cf-46e9-b0e9-c5aac516fdf7" containerName="mariadb-database-create" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135827 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-api" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.135839 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="de714834-e155-41c1-83fc-a050203bde75" containerName="neutron-httpd" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.136406 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549994-6zc6p" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.139790 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.139966 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.140590 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.147040 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549994-6zc6p"] Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.250437 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.251017 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-log" containerID="cri-o://5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144" gracePeriod=30 Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.251465 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-httpd" containerID="cri-o://bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2" gracePeriod=30 Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.281255 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng24f\" (UniqueName: \"kubernetes.io/projected/5e836afb-bb6f-4e67-9df6-5bef0273a523-kube-api-access-ng24f\") pod \"auto-csr-approver-29549994-6zc6p\" (UID: \"5e836afb-bb6f-4e67-9df6-5bef0273a523\") " pod="openshift-infra/auto-csr-approver-29549994-6zc6p" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.383029 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng24f\" (UniqueName: \"kubernetes.io/projected/5e836afb-bb6f-4e67-9df6-5bef0273a523-kube-api-access-ng24f\") pod \"auto-csr-approver-29549994-6zc6p\" (UID: \"5e836afb-bb6f-4e67-9df6-5bef0273a523\") " pod="openshift-infra/auto-csr-approver-29549994-6zc6p" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.400805 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng24f\" (UniqueName: \"kubernetes.io/projected/5e836afb-bb6f-4e67-9df6-5bef0273a523-kube-api-access-ng24f\") pod \"auto-csr-approver-29549994-6zc6p\" (UID: \"5e836afb-bb6f-4e67-9df6-5bef0273a523\") " pod="openshift-infra/auto-csr-approver-29549994-6zc6p" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.463223 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549994-6zc6p" Mar 08 19:54:00 crc kubenswrapper[4885]: I0308 19:54:00.917523 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549994-6zc6p"] Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.064529 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerStarted","Data":"9348c59006c229d5addb1edf8ed7baf6b6d89cc79e7937bbe98f9278bc9d36c3"} Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.064783 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.067023 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerID="5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144" exitCode=143 Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.067068 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c045d9d-f8a0-40b9-9600-0d10d5c699e7","Type":"ContainerDied","Data":"5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144"} Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.068981 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549994-6zc6p" event={"ID":"5e836afb-bb6f-4e67-9df6-5bef0273a523","Type":"ContainerStarted","Data":"556ae4e045c8d953629c747935f3661f6a7601c17af1a6b947e6ef3e154c9e4b"} Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.091413 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.250901076 podStartE2EDuration="6.091396461s" podCreationTimestamp="2026-03-08 19:53:55 +0000 UTC" firstStartedPulling="2026-03-08 19:53:56.126449137 +0000 UTC m=+1337.522503160" lastFinishedPulling="2026-03-08 19:53:59.966944522 +0000 UTC m=+1341.362998545" observedRunningTime="2026-03-08 19:54:01.085135153 +0000 UTC m=+1342.481189186" watchObservedRunningTime="2026-03-08 19:54:01.091396461 +0000 UTC m=+1342.487450484" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.322401 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5pkh8"] Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.323498 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.326174 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.326308 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.326351 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qrg8t" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.340072 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5pkh8"] Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.403088 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-scripts\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.403427 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-config-data\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.403452 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.403504 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66qmj\" (UniqueName: \"kubernetes.io/projected/84470b78-5e74-473c-88d3-5343943c01fb-kube-api-access-66qmj\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.504850 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-config-data\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.504888 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.504936 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66qmj\" (UniqueName: \"kubernetes.io/projected/84470b78-5e74-473c-88d3-5343943c01fb-kube-api-access-66qmj\") pod \"nova-cell0-conductor-db-sync-5pkh8\" 
(UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.505012 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-scripts\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.513758 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-config-data\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.515791 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-scripts\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.527775 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66qmj\" (UniqueName: \"kubernetes.io/projected/84470b78-5e74-473c-88d3-5343943c01fb-kube-api-access-66qmj\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.528635 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5pkh8\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.550667 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.550997 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-log" containerID="cri-o://bf43cbd05abc4859f6ceeadb70f7e22ee780318d3caa3829c75db5c4ff63615b" gracePeriod=30 Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.551079 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-httpd" containerID="cri-o://76f21ebe5e2a1ec7083c3221c48c316f2bf16958272fe800dd874d9d19aa99dd" gracePeriod=30 Mar 08 19:54:01 crc kubenswrapper[4885]: I0308 19:54:01.678459 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.027700 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.079833 4885 generic.go:334] "Generic (PLEG): container finished" podID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerID="4b8fc7ef37c75cd3a7a88b7cc2d7710779d5554badc6896eed06c03d3625b81a" exitCode=0 Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.079879 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5685698-p87pb" event={"ID":"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1","Type":"ContainerDied","Data":"4b8fc7ef37c75cd3a7a88b7cc2d7710779d5554badc6896eed06c03d3625b81a"} Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.081648 4885 generic.go:334] "Generic (PLEG): container finished" podID="405b5d21-a208-4f86-b046-66968c326aa4" containerID="bf43cbd05abc4859f6ceeadb70f7e22ee780318d3caa3829c75db5c4ff63615b" exitCode=143 Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.082126 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"405b5d21-a208-4f86-b046-66968c326aa4","Type":"ContainerDied","Data":"bf43cbd05abc4859f6ceeadb70f7e22ee780318d3caa3829c75db5c4ff63615b"} Mar 08 19:54:02 crc kubenswrapper[4885]: W0308 19:54:02.158202 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84470b78_5e74_473c_88d3_5343943c01fb.slice/crio-ce67e1273536f417a3da8fd4b46949a06744125c07e920dac8c012c6378cce25 WatchSource:0}: Error finding container ce67e1273536f417a3da8fd4b46949a06744125c07e920dac8c012c6378cce25: Status 404 returned error can't find the container with id ce67e1273536f417a3da8fd4b46949a06744125c07e920dac8c012c6378cce25 Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.158507 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5pkh8"] Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.370357 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6b5685698-p87pb" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.523048 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq6bs\" (UniqueName: \"kubernetes.io/projected/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-kube-api-access-dq6bs\") pod \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.523160 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-public-tls-certs\") pod \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.523198 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-combined-ca-bundle\") pod \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.523227 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-internal-tls-certs\") pod \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.523294 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-config-data\") pod \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.523344 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-scripts\") pod \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.523368 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-logs\") pod \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\" (UID: \"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1\") " Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.525831 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-logs" (OuterVolumeSpecName: "logs") pod "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" (UID: "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.534080 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-scripts" (OuterVolumeSpecName: "scripts") pod "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" (UID: "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.541455 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-kube-api-access-dq6bs" (OuterVolumeSpecName: "kube-api-access-dq6bs") pod "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" (UID: "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1"). InnerVolumeSpecName "kube-api-access-dq6bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.601077 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" (UID: "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.621132 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-config-data" (OuterVolumeSpecName: "config-data") pod "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" (UID: "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.627909 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.627970 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.627980 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.627989 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq6bs\" (UniqueName: \"kubernetes.io/projected/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-kube-api-access-dq6bs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.627997 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.649778 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" (UID: "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.688077 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" (UID: "30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.729496 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.729534 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.818068 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.818119 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.818157 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.818801 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c24a30299a18630f198121b61248ad8d1e3d9e8acd806e23d5c1d953fe5cfa83"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 19:54:02 crc kubenswrapper[4885]: I0308 19:54:02.818850 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://c24a30299a18630f198121b61248ad8d1e3d9e8acd806e23d5c1d953fe5cfa83" gracePeriod=600 Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.093727 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="c24a30299a18630f198121b61248ad8d1e3d9e8acd806e23d5c1d953fe5cfa83" exitCode=0 Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.093787 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"c24a30299a18630f198121b61248ad8d1e3d9e8acd806e23d5c1d953fe5cfa83"} Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.093819 4885 scope.go:117] "RemoveContainer" containerID="e6dd4ce3180e7f84da70c69f276b3e39a0d5b0c2aeeabe5c8a51dafdbeafb374" Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.095947 4885 generic.go:334] "Generic (PLEG): container finished" podID="5e836afb-bb6f-4e67-9df6-5bef0273a523" containerID="0d00454c184e09bd4a156eebaa35bb3bcacf94bedd622a0c71e0954aef720385" exitCode=0 Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.095998 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29549994-6zc6p" event={"ID":"5e836afb-bb6f-4e67-9df6-5bef0273a523","Type":"ContainerDied","Data":"0d00454c184e09bd4a156eebaa35bb3bcacf94bedd622a0c71e0954aef720385"} Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.098556 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b5685698-p87pb" event={"ID":"30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1","Type":"ContainerDied","Data":"728e2c4227d2c2ab14fe2c194baeb854bdfe4cac966691c25a22f562f3e2f82b"} Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.098592 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6b5685698-p87pb" Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.105711 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" event={"ID":"84470b78-5e74-473c-88d3-5343943c01fb","Type":"ContainerStarted","Data":"ce67e1273536f417a3da8fd4b46949a06744125c07e920dac8c012c6378cce25"} Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.105845 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="proxy-httpd" containerID="cri-o://9348c59006c229d5addb1edf8ed7baf6b6d89cc79e7937bbe98f9278bc9d36c3" gracePeriod=30 Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.105861 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-notification-agent" containerID="cri-o://7cf70af4753fbcc177c169967bfd0633e149f5d98df36cb2d6ff676d0a215e21" gracePeriod=30 Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.105885 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="sg-core" containerID="cri-o://30e722bde831d03eed4bc7a2ac2c7c561a897a0bc5aee76137806f8c867c31e9" gracePeriod=30 Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.105846 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-central-agent" containerID="cri-o://db7de2bfb2402bc7c35eeb0e3a0a80a212c00dd48e7b95c320538d854040bceb" gracePeriod=30 Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.141629 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6b5685698-p87pb"] Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.146648 4885 scope.go:117] "RemoveContainer" containerID="4b8fc7ef37c75cd3a7a88b7cc2d7710779d5554badc6896eed06c03d3625b81a" Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.147960 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6b5685698-p87pb"] Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.186717 4885 scope.go:117] "RemoveContainer" containerID="284122f7c1790bf6573097e7966743c78c5e2016a00bf0c85b35b46fb79ec3ac" Mar 08 19:54:03 crc kubenswrapper[4885]: I0308 19:54:03.380799 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" path="/var/lib/kubelet/pods/30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1/volumes" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:03.999943 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.123489 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerID="bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2" exitCode=0 Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.124056 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c045d9d-f8a0-40b9-9600-0d10d5c699e7","Type":"ContainerDied","Data":"bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2"} Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.124720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c045d9d-f8a0-40b9-9600-0d10d5c699e7","Type":"ContainerDied","Data":"4a7d8fc5b878c3d0d7e4f116b765cb88438382af5862067e0b9a50b02bb40fea"} Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.124795 4885 scope.go:117] "RemoveContainer" containerID="bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.124164 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.128340 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"b0670bdd5ae4a6193bbd77e528520a487b041af774d5305f09762277548bcda8"} Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.148726 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerID="9348c59006c229d5addb1edf8ed7baf6b6d89cc79e7937bbe98f9278bc9d36c3" exitCode=0 Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.148759 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerID="30e722bde831d03eed4bc7a2ac2c7c561a897a0bc5aee76137806f8c867c31e9" exitCode=2 Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.148768 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerID="7cf70af4753fbcc177c169967bfd0633e149f5d98df36cb2d6ff676d0a215e21" exitCode=0 Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.148774 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerDied","Data":"9348c59006c229d5addb1edf8ed7baf6b6d89cc79e7937bbe98f9278bc9d36c3"} Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.148838 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerDied","Data":"30e722bde831d03eed4bc7a2ac2c7c561a897a0bc5aee76137806f8c867c31e9"} Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.148853 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerDied","Data":"7cf70af4753fbcc177c169967bfd0633e149f5d98df36cb2d6ff676d0a215e21"} Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152270 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-config-data\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152487 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-scripts\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152513 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152534 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-combined-ca-bundle\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152565 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzlbm\" (UniqueName: \"kubernetes.io/projected/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-kube-api-access-hzlbm\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152609 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152674 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-logs\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.152707 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-httpd-run\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.153945 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-logs" (OuterVolumeSpecName: "logs") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.155325 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.158851 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-scripts" (OuterVolumeSpecName: "scripts") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.159247 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-kube-api-access-hzlbm" (OuterVolumeSpecName: "kube-api-access-hzlbm") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "kube-api-access-hzlbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.160391 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.198187 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.200832 4885 scope.go:117] "RemoveContainer" containerID="5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.212747 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-config-data" (OuterVolumeSpecName: "config-data") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.256097 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.256203 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs\") pod \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\" (UID: \"3c045d9d-f8a0-40b9-9600-0d10d5c699e7\") " Mar 08 19:54:04 crc kubenswrapper[4885]: W0308 19:54:04.256385 4885 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/3c045d9d-f8a0-40b9-9600-0d10d5c699e7/volumes/kubernetes.io~secret/public-tls-certs Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.256415 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3c045d9d-f8a0-40b9-9600-0d10d5c699e7" (UID: "3c045d9d-f8a0-40b9-9600-0d10d5c699e7"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.257731 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzlbm\" (UniqueName: \"kubernetes.io/projected/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-kube-api-access-hzlbm\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.257761 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.257771 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.258207 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.258229 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.258244 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.258265 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.258276 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c045d9d-f8a0-40b9-9600-0d10d5c699e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.265158 4885 scope.go:117] "RemoveContainer" containerID="bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2" Mar 08 19:54:04 crc kubenswrapper[4885]: E0308 19:54:04.265735 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2\": container with ID starting with bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2 not found: ID does not exist" containerID="bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.265781 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2"} err="failed to get container status \"bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2\": rpc error: code = NotFound desc = could not find container \"bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2\": container with ID starting with bc43389cc5ef59a72b20af05b2988e3deef2a13e0503efd4c530f4d4679d57f2 not found: ID does not exist" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.265834 4885 scope.go:117] "RemoveContainer" containerID="5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144" Mar 08 19:54:04 crc kubenswrapper[4885]: E0308 19:54:04.266339 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144\": container with ID starting with 5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144 not found: ID does not exist" containerID="5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.266389 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144"} err="failed to get container status \"5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144\": rpc error: code = NotFound desc = could not find container \"5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144\": container with ID starting with 5e04e2551f2b56f9d2ad540b91ac3e1c46fedde2674ee4d2a828d5dd0ca98144 not found: ID does not exist" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.286031 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.361076 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.495255 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549994-6zc6p" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.520019 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.535537 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.565534 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:54:04 crc kubenswrapper[4885]: E0308 19:54:04.565929 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-log" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.565966 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-log" Mar 08 19:54:04 crc kubenswrapper[4885]: E0308 19:54:04.565986 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-httpd" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.565992 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-httpd" Mar 08 19:54:04 crc kubenswrapper[4885]: E0308 19:54:04.566007 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-log" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566013 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-log" Mar 08 19:54:04 crc kubenswrapper[4885]: E0308 19:54:04.566023 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-api" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566028 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-api" Mar 08 19:54:04 crc kubenswrapper[4885]: E0308 19:54:04.566039 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e836afb-bb6f-4e67-9df6-5bef0273a523" containerName="oc" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566046 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e836afb-bb6f-4e67-9df6-5bef0273a523" containerName="oc" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566231 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e836afb-bb6f-4e67-9df6-5bef0273a523" containerName="oc" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566243 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-httpd" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566259 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-log" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566268 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" containerName="glance-log" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.566281 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="30962695-4bc8-4fd2-b6e4-5b4b1f9d75a1" containerName="placement-api" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 
19:54:04.566739 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng24f\" (UniqueName: \"kubernetes.io/projected/5e836afb-bb6f-4e67-9df6-5bef0273a523-kube-api-access-ng24f\") pod \"5e836afb-bb6f-4e67-9df6-5bef0273a523\" (UID: \"5e836afb-bb6f-4e67-9df6-5bef0273a523\") " Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.567994 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.570047 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.570350 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.571377 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e836afb-bb6f-4e67-9df6-5bef0273a523-kube-api-access-ng24f" (OuterVolumeSpecName: "kube-api-access-ng24f") pod "5e836afb-bb6f-4e67-9df6-5bef0273a523" (UID: "5e836afb-bb6f-4e67-9df6-5bef0273a523"). InnerVolumeSpecName "kube-api-access-ng24f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.603051 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668060 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-config-data\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzmz5\" (UniqueName: \"kubernetes.io/projected/50b429e9-fb10-48ba-b15c-ec25d57e707a-kube-api-access-hzmz5\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668138 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668307 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-logs\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668528 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc 
kubenswrapper[4885]: I0308 19:54:04.668607 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-scripts\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668765 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668853 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.668985 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng24f\" (UniqueName: \"kubernetes.io/projected/5e836afb-bb6f-4e67-9df6-5bef0273a523-kube-api-access-ng24f\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770059 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzmz5\" (UniqueName: \"kubernetes.io/projected/50b429e9-fb10-48ba-b15c-ec25d57e707a-kube-api-access-hzmz5\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770099 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770123 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-logs\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770181 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770204 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-scripts\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770247 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770272 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770309 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-config-data\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.770952 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.771486 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-logs\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.774311 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.774736 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.775149 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.780399 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-scripts\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.790504 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.793852 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzmz5\" (UniqueName: \"kubernetes.io/projected/50b429e9-fb10-48ba-b15c-ec25d57e707a-kube-api-access-hzmz5\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.797125 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " pod="openstack/glance-default-external-api-0" Mar 08 19:54:04 crc kubenswrapper[4885]: I0308 19:54:04.934349 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.156785 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549994-6zc6p" event={"ID":"5e836afb-bb6f-4e67-9df6-5bef0273a523","Type":"ContainerDied","Data":"556ae4e045c8d953629c747935f3661f6a7601c17af1a6b947e6ef3e154c9e4b"} Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.157086 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="556ae4e045c8d953629c747935f3661f6a7601c17af1a6b947e6ef3e154c9e4b" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.157018 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549994-6zc6p" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.160939 4885 generic.go:334] "Generic (PLEG): container finished" podID="405b5d21-a208-4f86-b046-66968c326aa4" containerID="76f21ebe5e2a1ec7083c3221c48c316f2bf16958272fe800dd874d9d19aa99dd" exitCode=0 Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.160966 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"405b5d21-a208-4f86-b046-66968c326aa4","Type":"ContainerDied","Data":"76f21ebe5e2a1ec7083c3221c48c316f2bf16958272fe800dd874d9d19aa99dd"} Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.164464 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285505 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-httpd-run\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285597 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-logs\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285669 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-combined-ca-bundle\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285747 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285798 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-internal-tls-certs\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285838 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkzrv\" (UniqueName: \"kubernetes.io/projected/405b5d21-a208-4f86-b046-66968c326aa4-kube-api-access-vkzrv\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285853 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-scripts\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.285910 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-config-data\") pod \"405b5d21-a208-4f86-b046-66968c326aa4\" (UID: \"405b5d21-a208-4f86-b046-66968c326aa4\") " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.288946 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-logs" (OuterVolumeSpecName: "logs") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.289123 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.292419 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-scripts" (OuterVolumeSpecName: "scripts") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.292571 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/405b5d21-a208-4f86-b046-66968c326aa4-kube-api-access-vkzrv" (OuterVolumeSpecName: "kube-api-access-vkzrv") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "kube-api-access-vkzrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.294269 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.345090 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.351678 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-config-data" (OuterVolumeSpecName: "config-data") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.375847 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "405b5d21-a208-4f86-b046-66968c326aa4" (UID: "405b5d21-a208-4f86-b046-66968c326aa4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.386340 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c045d9d-f8a0-40b9-9600-0d10d5c699e7" path="/var/lib/kubelet/pods/3c045d9d-f8a0-40b9-9600-0d10d5c699e7/volumes" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.389540 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkzrv\" (UniqueName: \"kubernetes.io/projected/405b5d21-a208-4f86-b046-66968c326aa4-kube-api-access-vkzrv\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.389567 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.389578 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.389589 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.389597 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/405b5d21-a208-4f86-b046-66968c326aa4-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.389605 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.389635 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.389644 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/405b5d21-a208-4f86-b046-66968c326aa4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.419514 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.483967 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.491438 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.580734 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549988-f7hdz"] Mar 08 19:54:05 crc kubenswrapper[4885]: I0308 19:54:05.589296 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549988-f7hdz"] Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.172966 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"50b429e9-fb10-48ba-b15c-ec25d57e707a","Type":"ContainerStarted","Data":"2d562399fb223e806d5f3ddff2425b5e427d18a16330c90ac41a561625d41719"} Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.173278 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50b429e9-fb10-48ba-b15c-ec25d57e707a","Type":"ContainerStarted","Data":"8290e58829785cfd7645e5b7ea06bfd203515f9adadd2b8e8b4383fbc9129293"} Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.175368 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"405b5d21-a208-4f86-b046-66968c326aa4","Type":"ContainerDied","Data":"0ea29bb519254f2f6d232c0a073b9e8199e006c424f0d72c1ebe3ec8f1381dff"} Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.175399 4885 scope.go:117] "RemoveContainer" containerID="76f21ebe5e2a1ec7083c3221c48c316f2bf16958272fe800dd874d9d19aa99dd" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.175488 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.208489 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.211420 4885 scope.go:117] "RemoveContainer" containerID="bf43cbd05abc4859f6ceeadb70f7e22ee780318d3caa3829c75db5c4ff63615b" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.226188 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.261332 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:54:06 crc kubenswrapper[4885]: E0308 19:54:06.262934 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-log" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.262950 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-log" Mar 08 19:54:06 crc kubenswrapper[4885]: E0308 19:54:06.262992 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-httpd" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.262999 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-httpd" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.267003 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-httpd" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.267040 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="405b5d21-a208-4f86-b046-66968c326aa4" containerName="glance-log" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.269397 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.272182 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.272279 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.292687 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408369 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408436 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408458 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408490 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408515 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408561 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408619 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.408635 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-7tpp6\" (UniqueName: \"kubernetes.io/projected/e4ca493a-f707-45c3-b457-1a1053c3dfe5-kube-api-access-7tpp6\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.509965 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510274 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510343 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510360 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tpp6\" (UniqueName: \"kubernetes.io/projected/e4ca493a-f707-45c3-b457-1a1053c3dfe5-kube-api-access-7tpp6\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510386 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510414 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510435 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510459 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.510809 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.511442 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-logs\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.511491 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.517176 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.517221 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.517947 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.526599 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.529433 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tpp6\" (UniqueName: \"kubernetes.io/projected/e4ca493a-f707-45c3-b457-1a1053c3dfe5-kube-api-access-7tpp6\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.546867 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " pod="openstack/glance-default-internal-api-0" Mar 08 19:54:06 crc kubenswrapper[4885]: I0308 19:54:06.609138 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:07 crc kubenswrapper[4885]: I0308 19:54:07.137697 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:54:07 crc kubenswrapper[4885]: I0308 19:54:07.193854 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50b429e9-fb10-48ba-b15c-ec25d57e707a","Type":"ContainerStarted","Data":"f5e5f2790360729d9c4394c0e85bb4e8ea8164ab35be9023623e79b3a117f852"} Mar 08 19:54:07 crc kubenswrapper[4885]: I0308 19:54:07.198220 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4ca493a-f707-45c3-b457-1a1053c3dfe5","Type":"ContainerStarted","Data":"d206b01c706625f3b6d24a81cff0491b35107018670692e53d89b4cfafe0b053"} Mar 08 19:54:07 crc kubenswrapper[4885]: I0308 19:54:07.220795 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.220774262 podStartE2EDuration="3.220774262s" podCreationTimestamp="2026-03-08 19:54:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:07.215331737 +0000 UTC m=+1348.611385840" watchObservedRunningTime="2026-03-08 19:54:07.220774262 +0000 UTC m=+1348.616828295" Mar 08 19:54:07 crc kubenswrapper[4885]: I0308 19:54:07.380394 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="405b5d21-a208-4f86-b046-66968c326aa4" path="/var/lib/kubelet/pods/405b5d21-a208-4f86-b046-66968c326aa4/volumes" Mar 08 19:54:07 crc kubenswrapper[4885]: I0308 19:54:07.381346 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1daba97-3389-4e45-8a6c-bf910619f315" path="/var/lib/kubelet/pods/a1daba97-3389-4e45-8a6c-bf910619f315/volumes" Mar 08 19:54:08 crc kubenswrapper[4885]: I0308 19:54:08.222468 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerID="db7de2bfb2402bc7c35eeb0e3a0a80a212c00dd48e7b95c320538d854040bceb" exitCode=0 Mar 08 19:54:08 crc kubenswrapper[4885]: I0308 19:54:08.222561 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerDied","Data":"db7de2bfb2402bc7c35eeb0e3a0a80a212c00dd48e7b95c320538d854040bceb"} Mar 08 19:54:08 crc kubenswrapper[4885]: I0308 19:54:08.225774 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4ca493a-f707-45c3-b457-1a1053c3dfe5","Type":"ContainerStarted","Data":"4e4c7d9e4404bd1a5433f1787f2f7abf1d5d2e0fd51aebb6079e0aa7c48cd16e"} Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.269155 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb","Type":"ContainerDied","Data":"34e707d299017c5d1c8910ad8b86e1920b5d7996a4afa2abaf7f5e2cb4124ea4"} Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.269690 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34e707d299017c5d1c8910ad8b86e1920b5d7996a4afa2abaf7f5e2cb4124ea4" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.309056 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.426258 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-run-httpd\") pod \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.426307 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-sg-core-conf-yaml\") pod \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.426507 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-log-httpd\") pod \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.426530 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-combined-ca-bundle\") pod \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.426573 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-scripts\") pod \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.426647 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-config-data\") pod \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.426756 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72jnk\" (UniqueName: \"kubernetes.io/projected/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-kube-api-access-72jnk\") pod \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\" (UID: \"cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb\") " Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.428131 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" (UID: "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.428707 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" (UID: "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.429381 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.429416 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.432479 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-kube-api-access-72jnk" (OuterVolumeSpecName: "kube-api-access-72jnk") pod "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" (UID: "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb"). InnerVolumeSpecName "kube-api-access-72jnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.440109 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-scripts" (OuterVolumeSpecName: "scripts") pod "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" (UID: "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.454613 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" (UID: "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.502004 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" (UID: "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.527808 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-config-data" (OuterVolumeSpecName: "config-data") pod "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" (UID: "cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.531292 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72jnk\" (UniqueName: \"kubernetes.io/projected/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-kube-api-access-72jnk\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.531325 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.531337 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.531349 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:12 crc kubenswrapper[4885]: I0308 19:54:12.531360 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.282887 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4ca493a-f707-45c3-b457-1a1053c3dfe5","Type":"ContainerStarted","Data":"41d8583c3d498141cc2e38d1ed8623082609a86faa3f087e124f2692ac0c8871"} Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.288675 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.294181 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" event={"ID":"84470b78-5e74-473c-88d3-5343943c01fb","Type":"ContainerStarted","Data":"5bc08b9d58402236103943567dfa278b8697894bc6f9fe1ef5bb281393c8f6d5"} Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.335662 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.335644207 podStartE2EDuration="7.335644207s" podCreationTimestamp="2026-03-08 19:54:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:13.324353384 +0000 UTC m=+1354.720407407" watchObservedRunningTime="2026-03-08 19:54:13.335644207 +0000 UTC m=+1354.731698220" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.361560 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.392697 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" podStartSLOduration=2.328384238 podStartE2EDuration="12.392682971s" podCreationTimestamp="2026-03-08 19:54:01 +0000 UTC" firstStartedPulling="2026-03-08 19:54:02.161944217 +0000 UTC m=+1343.557998230" lastFinishedPulling="2026-03-08 19:54:12.22624294 +0000 UTC m=+1353.622296963" observedRunningTime="2026-03-08 19:54:13.390786521 +0000 UTC m=+1354.786840554" watchObservedRunningTime="2026-03-08 19:54:13.392682971 +0000 UTC m=+1354.788736994" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.396416 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.414627 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:13 crc kubenswrapper[4885]: E0308 19:54:13.415134 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-notification-agent" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415155 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-notification-agent" Mar 08 19:54:13 crc kubenswrapper[4885]: E0308 19:54:13.415170 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-central-agent" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415177 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-central-agent" Mar 08 19:54:13 crc kubenswrapper[4885]: E0308 19:54:13.415208 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="proxy-httpd" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415216 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="proxy-httpd" Mar 08 19:54:13 crc kubenswrapper[4885]: E0308 19:54:13.415237 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="sg-core" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415244 4885 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="sg-core" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415446 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="proxy-httpd" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415465 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-notification-agent" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415483 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="ceilometer-central-agent" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.415496 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" containerName="sg-core" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.417481 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.420194 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.421517 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.456047 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.550867 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-config-data\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.551432 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5pqv\" (UniqueName: \"kubernetes.io/projected/54b37336-b51a-477c-90c6-78242b1e301a-kube-api-access-v5pqv\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.551493 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.551570 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-scripts\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.551659 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-log-httpd\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.551705 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-run-httpd\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.551732 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.652825 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.652884 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-scripts\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.652977 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-log-httpd\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.653015 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-run-httpd\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.653042 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.653091 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-config-data\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.653113 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v5pqv\" (UniqueName: \"kubernetes.io/projected/54b37336-b51a-477c-90c6-78242b1e301a-kube-api-access-v5pqv\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.653783 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-run-httpd\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.654181 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-log-httpd\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.659032 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.659262 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-scripts\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.660249 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-config-data\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.665547 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.687878 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5pqv\" (UniqueName: \"kubernetes.io/projected/54b37336-b51a-477c-90c6-78242b1e301a-kube-api-access-v5pqv\") pod \"ceilometer-0\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " pod="openstack/ceilometer-0" Mar 08 19:54:13 crc kubenswrapper[4885]: I0308 19:54:13.757651 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:14 crc kubenswrapper[4885]: I0308 19:54:14.228757 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:14 crc kubenswrapper[4885]: I0308 19:54:14.297110 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerStarted","Data":"5306967bbbb924e179bb457b15fdbc54377a2dfcd6df23e85eb070929ec038ff"} Mar 08 19:54:14 crc kubenswrapper[4885]: I0308 19:54:14.934850 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 19:54:14 crc kubenswrapper[4885]: I0308 19:54:14.935198 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 19:54:14 crc kubenswrapper[4885]: I0308 19:54:14.973437 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 19:54:14 crc kubenswrapper[4885]: I0308 19:54:14.981281 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 19:54:15 crc kubenswrapper[4885]: I0308 19:54:15.306396 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerStarted","Data":"1033d6de01346c935d13877d30ce6f5faeddb78e7018b0edc9f9cd7d22644bf2"} Mar 08 19:54:15 crc kubenswrapper[4885]: I0308 19:54:15.306637 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 19:54:15 crc kubenswrapper[4885]: I0308 19:54:15.306788 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 19:54:15 crc kubenswrapper[4885]: I0308 19:54:15.382173 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb" path="/var/lib/kubelet/pods/cf7df1e4-c89e-412a-97e3-ed4e2e4e42fb/volumes" Mar 08 19:54:15 crc kubenswrapper[4885]: I0308 19:54:15.621145 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:16 crc kubenswrapper[4885]: I0308 19:54:16.316788 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerStarted","Data":"91ab3e0d4b5038bd39e0b50ea73003debc97ce3fd7cb74f3d1703406c18f0f45"} Mar 08 19:54:16 crc kubenswrapper[4885]: I0308 19:54:16.609391 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:16 crc kubenswrapper[4885]: I0308 19:54:16.609451 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:16 crc kubenswrapper[4885]: I0308 19:54:16.648518 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:16 crc kubenswrapper[4885]: I0308 19:54:16.652169 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:17 crc kubenswrapper[4885]: I0308 19:54:17.336056 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerStarted","Data":"e760af5e1e3c9c30fd53df761c3e06a42f4c5b2005a2c182abf8652d173afbb4"} Mar 08 19:54:17 crc kubenswrapper[4885]: I0308 19:54:17.336411 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:17 crc kubenswrapper[4885]: I0308 19:54:17.336442 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:17 crc kubenswrapper[4885]: I0308 19:54:17.380979 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 19:54:17 crc kubenswrapper[4885]: I0308 19:54:17.381042 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 19:54:18 crc kubenswrapper[4885]: I0308 19:54:18.348563 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerStarted","Data":"682c312d6d039e2d50db0d6cb4756a641ecd8a4c893b0b2dd1dd818a51d8408e"} Mar 08 19:54:18 crc kubenswrapper[4885]: I0308 19:54:18.348934 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-central-agent" containerID="cri-o://1033d6de01346c935d13877d30ce6f5faeddb78e7018b0edc9f9cd7d22644bf2" gracePeriod=30 Mar 08 19:54:18 crc kubenswrapper[4885]: I0308 19:54:18.349259 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="proxy-httpd" containerID="cri-o://682c312d6d039e2d50db0d6cb4756a641ecd8a4c893b0b2dd1dd818a51d8408e" gracePeriod=30 Mar 08 19:54:18 crc kubenswrapper[4885]: I0308 19:54:18.349293 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-notification-agent" containerID="cri-o://91ab3e0d4b5038bd39e0b50ea73003debc97ce3fd7cb74f3d1703406c18f0f45" gracePeriod=30 Mar 08 19:54:18 crc kubenswrapper[4885]: I0308 19:54:18.349339 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="sg-core" containerID="cri-o://e760af5e1e3c9c30fd53df761c3e06a42f4c5b2005a2c182abf8652d173afbb4" gracePeriod=30 Mar 08 19:54:18 crc kubenswrapper[4885]: I0308 19:54:18.390010 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.816187066 podStartE2EDuration="5.389988271s" podCreationTimestamp="2026-03-08 19:54:13 +0000 UTC" firstStartedPulling="2026-03-08 19:54:14.238079188 +0000 UTC m=+1355.634133211" lastFinishedPulling="2026-03-08 19:54:17.811880393 +0000 UTC m=+1359.207934416" observedRunningTime="2026-03-08 19:54:18.376222224 +0000 UTC m=+1359.772276287" watchObservedRunningTime="2026-03-08 19:54:18.389988271 +0000 UTC m=+1359.786042304" Mar 08 19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.286993 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.293645 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 
19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.362127 4885 generic.go:334] "Generic (PLEG): container finished" podID="54b37336-b51a-477c-90c6-78242b1e301a" containerID="682c312d6d039e2d50db0d6cb4756a641ecd8a4c893b0b2dd1dd818a51d8408e" exitCode=0 Mar 08 19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.362167 4885 generic.go:334] "Generic (PLEG): container finished" podID="54b37336-b51a-477c-90c6-78242b1e301a" containerID="e760af5e1e3c9c30fd53df761c3e06a42f4c5b2005a2c182abf8652d173afbb4" exitCode=2 Mar 08 19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.362176 4885 generic.go:334] "Generic (PLEG): container finished" podID="54b37336-b51a-477c-90c6-78242b1e301a" containerID="91ab3e0d4b5038bd39e0b50ea73003debc97ce3fd7cb74f3d1703406c18f0f45" exitCode=0 Mar 08 19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.362249 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerDied","Data":"682c312d6d039e2d50db0d6cb4756a641ecd8a4c893b0b2dd1dd818a51d8408e"} Mar 08 19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.362310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerDied","Data":"e760af5e1e3c9c30fd53df761c3e06a42f4c5b2005a2c182abf8652d173afbb4"} Mar 08 19:54:19 crc kubenswrapper[4885]: I0308 19:54:19.362327 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerDied","Data":"91ab3e0d4b5038bd39e0b50ea73003debc97ce3fd7cb74f3d1703406c18f0f45"} Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.391451 4885 generic.go:334] "Generic (PLEG): container finished" podID="54b37336-b51a-477c-90c6-78242b1e301a" containerID="1033d6de01346c935d13877d30ce6f5faeddb78e7018b0edc9f9cd7d22644bf2" exitCode=0 Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.391539 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerDied","Data":"1033d6de01346c935d13877d30ce6f5faeddb78e7018b0edc9f9cd7d22644bf2"} Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.694670 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.727346 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-run-httpd\") pod \"54b37336-b51a-477c-90c6-78242b1e301a\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.727705 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "54b37336-b51a-477c-90c6-78242b1e301a" (UID: "54b37336-b51a-477c-90c6-78242b1e301a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.727880 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-sg-core-conf-yaml\") pod \"54b37336-b51a-477c-90c6-78242b1e301a\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.728626 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-combined-ca-bundle\") pod \"54b37336-b51a-477c-90c6-78242b1e301a\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.728697 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-log-httpd\") pod \"54b37336-b51a-477c-90c6-78242b1e301a\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.728737 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-config-data\") pod \"54b37336-b51a-477c-90c6-78242b1e301a\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.728758 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-scripts\") pod \"54b37336-b51a-477c-90c6-78242b1e301a\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.728803 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5pqv\" (UniqueName: \"kubernetes.io/projected/54b37336-b51a-477c-90c6-78242b1e301a-kube-api-access-v5pqv\") pod \"54b37336-b51a-477c-90c6-78242b1e301a\" (UID: \"54b37336-b51a-477c-90c6-78242b1e301a\") " Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.729524 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.729575 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "54b37336-b51a-477c-90c6-78242b1e301a" (UID: "54b37336-b51a-477c-90c6-78242b1e301a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.733083 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b37336-b51a-477c-90c6-78242b1e301a-kube-api-access-v5pqv" (OuterVolumeSpecName: "kube-api-access-v5pqv") pod "54b37336-b51a-477c-90c6-78242b1e301a" (UID: "54b37336-b51a-477c-90c6-78242b1e301a"). InnerVolumeSpecName "kube-api-access-v5pqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.733967 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-scripts" (OuterVolumeSpecName: "scripts") pod "54b37336-b51a-477c-90c6-78242b1e301a" (UID: "54b37336-b51a-477c-90c6-78242b1e301a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.772466 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "54b37336-b51a-477c-90c6-78242b1e301a" (UID: "54b37336-b51a-477c-90c6-78242b1e301a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.811091 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54b37336-b51a-477c-90c6-78242b1e301a" (UID: "54b37336-b51a-477c-90c6-78242b1e301a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.830967 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.830999 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.831012 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/54b37336-b51a-477c-90c6-78242b1e301a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.831026 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.831038 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v5pqv\" (UniqueName: \"kubernetes.io/projected/54b37336-b51a-477c-90c6-78242b1e301a-kube-api-access-v5pqv\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.833240 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-config-data" (OuterVolumeSpecName: "config-data") pod "54b37336-b51a-477c-90c6-78242b1e301a" (UID: "54b37336-b51a-477c-90c6-78242b1e301a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:22 crc kubenswrapper[4885]: I0308 19:54:22.933138 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54b37336-b51a-477c-90c6-78242b1e301a-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.414062 4885 generic.go:334] "Generic (PLEG): container finished" podID="84470b78-5e74-473c-88d3-5343943c01fb" containerID="5bc08b9d58402236103943567dfa278b8697894bc6f9fe1ef5bb281393c8f6d5" exitCode=0 Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.414163 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" event={"ID":"84470b78-5e74-473c-88d3-5343943c01fb","Type":"ContainerDied","Data":"5bc08b9d58402236103943567dfa278b8697894bc6f9fe1ef5bb281393c8f6d5"} Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.423010 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"54b37336-b51a-477c-90c6-78242b1e301a","Type":"ContainerDied","Data":"5306967bbbb924e179bb457b15fdbc54377a2dfcd6df23e85eb070929ec038ff"} Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.423084 4885 scope.go:117] "RemoveContainer" containerID="682c312d6d039e2d50db0d6cb4756a641ecd8a4c893b0b2dd1dd818a51d8408e" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.423280 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.462261 4885 scope.go:117] "RemoveContainer" containerID="e760af5e1e3c9c30fd53df761c3e06a42f4c5b2005a2c182abf8652d173afbb4" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.481687 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.518648 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.531478 4885 scope.go:117] "RemoveContainer" containerID="91ab3e0d4b5038bd39e0b50ea73003debc97ce3fd7cb74f3d1703406c18f0f45" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.552680 4885 scope.go:117] "RemoveContainer" containerID="1033d6de01346c935d13877d30ce6f5faeddb78e7018b0edc9f9cd7d22644bf2" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.554963 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:23 crc kubenswrapper[4885]: E0308 19:54:23.555489 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="sg-core" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.555507 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="sg-core" Mar 08 19:54:23 crc kubenswrapper[4885]: E0308 19:54:23.555528 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-notification-agent" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.555536 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-notification-agent" Mar 08 19:54:23 crc kubenswrapper[4885]: E0308 19:54:23.555561 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-central-agent" Mar 08 19:54:23 crc 
kubenswrapper[4885]: I0308 19:54:23.555569 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-central-agent" Mar 08 19:54:23 crc kubenswrapper[4885]: E0308 19:54:23.555595 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="proxy-httpd" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.555604 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="proxy-httpd" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.555955 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="proxy-httpd" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.555977 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-notification-agent" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.555996 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="sg-core" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.556007 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b37336-b51a-477c-90c6-78242b1e301a" containerName="ceilometer-central-agent" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.558279 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.560770 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.565021 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.574614 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.647195 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2s99\" (UniqueName: \"kubernetes.io/projected/d0b0d3fd-e485-4924-8a1b-6214b7840e52-kube-api-access-l2s99\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.647480 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-config-data\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.647523 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-log-httpd\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.647563 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 
19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.647603 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.647636 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-run-httpd\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.647661 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-scripts\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.749260 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.749366 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-run-httpd\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.749421 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-scripts\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.749532 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2s99\" (UniqueName: \"kubernetes.io/projected/d0b0d3fd-e485-4924-8a1b-6214b7840e52-kube-api-access-l2s99\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.749595 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-config-data\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.749696 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-log-httpd\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.749779 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " 
pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.751334 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-run-httpd\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.751869 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-log-httpd\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.771611 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.771623 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-scripts\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.772680 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.774522 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-config-data\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.804986 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2s99\" (UniqueName: \"kubernetes.io/projected/d0b0d3fd-e485-4924-8a1b-6214b7840e52-kube-api-access-l2s99\") pod \"ceilometer-0\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " pod="openstack/ceilometer-0" Mar 08 19:54:23 crc kubenswrapper[4885]: I0308 19:54:23.891786 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.367306 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.434505 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerStarted","Data":"031cf273d2ce6f416df1db118df4e04e2598121142c98562d1d0691ec1ae6950"} Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.815504 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.878426 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-combined-ca-bundle\") pod \"84470b78-5e74-473c-88d3-5343943c01fb\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.878710 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66qmj\" (UniqueName: \"kubernetes.io/projected/84470b78-5e74-473c-88d3-5343943c01fb-kube-api-access-66qmj\") pod \"84470b78-5e74-473c-88d3-5343943c01fb\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.878752 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-config-data\") pod \"84470b78-5e74-473c-88d3-5343943c01fb\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.878794 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-scripts\") pod \"84470b78-5e74-473c-88d3-5343943c01fb\" (UID: \"84470b78-5e74-473c-88d3-5343943c01fb\") " Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.886284 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84470b78-5e74-473c-88d3-5343943c01fb-kube-api-access-66qmj" (OuterVolumeSpecName: "kube-api-access-66qmj") pod "84470b78-5e74-473c-88d3-5343943c01fb" (UID: "84470b78-5e74-473c-88d3-5343943c01fb"). InnerVolumeSpecName "kube-api-access-66qmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.887260 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-scripts" (OuterVolumeSpecName: "scripts") pod "84470b78-5e74-473c-88d3-5343943c01fb" (UID: "84470b78-5e74-473c-88d3-5343943c01fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.912308 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84470b78-5e74-473c-88d3-5343943c01fb" (UID: "84470b78-5e74-473c-88d3-5343943c01fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.912573 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-config-data" (OuterVolumeSpecName: "config-data") pod "84470b78-5e74-473c-88d3-5343943c01fb" (UID: "84470b78-5e74-473c-88d3-5343943c01fb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.980960 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.981350 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.981376 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66qmj\" (UniqueName: \"kubernetes.io/projected/84470b78-5e74-473c-88d3-5343943c01fb-kube-api-access-66qmj\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:24 crc kubenswrapper[4885]: I0308 19:54:24.981397 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84470b78-5e74-473c-88d3-5343943c01fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.379799 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54b37336-b51a-477c-90c6-78242b1e301a" path="/var/lib/kubelet/pods/54b37336-b51a-477c-90c6-78242b1e301a/volumes" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.463059 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" event={"ID":"84470b78-5e74-473c-88d3-5343943c01fb","Type":"ContainerDied","Data":"ce67e1273536f417a3da8fd4b46949a06744125c07e920dac8c012c6378cce25"} Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.463132 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce67e1273536f417a3da8fd4b46949a06744125c07e920dac8c012c6378cce25" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.463207 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5pkh8" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.465836 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerStarted","Data":"608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa"} Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.517337 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 19:54:25 crc kubenswrapper[4885]: E0308 19:54:25.517755 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84470b78-5e74-473c-88d3-5343943c01fb" containerName="nova-cell0-conductor-db-sync" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.517773 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="84470b78-5e74-473c-88d3-5343943c01fb" containerName="nova-cell0-conductor-db-sync" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.518004 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="84470b78-5e74-473c-88d3-5343943c01fb" containerName="nova-cell0-conductor-db-sync" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.522101 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.524839 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.524928 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qrg8t" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.543907 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.592115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.592162 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwjwb\" (UniqueName: \"kubernetes.io/projected/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-kube-api-access-nwjwb\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.592265 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.693854 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.694330 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.694373 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwjwb\" (UniqueName: \"kubernetes.io/projected/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-kube-api-access-nwjwb\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.698671 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.700416 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.722414 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwjwb\" (UniqueName: \"kubernetes.io/projected/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-kube-api-access-nwjwb\") pod \"nova-cell0-conductor-0\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:25 crc kubenswrapper[4885]: I0308 19:54:25.840608 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:26 crc kubenswrapper[4885]: I0308 19:54:26.275430 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 19:54:26 crc kubenswrapper[4885]: W0308 19:54:26.278773 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedd9ad85_0e13_4d1f_ab0e_ffd5630c6197.slice/crio-e45bf247b9d279f13a7ad3c2bb090db1b20d8e474638af4ca7545e5e6c5bd1a9 WatchSource:0}: Error finding container e45bf247b9d279f13a7ad3c2bb090db1b20d8e474638af4ca7545e5e6c5bd1a9: Status 404 returned error can't find the container with id e45bf247b9d279f13a7ad3c2bb090db1b20d8e474638af4ca7545e5e6c5bd1a9 Mar 08 19:54:26 crc kubenswrapper[4885]: I0308 19:54:26.479012 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerStarted","Data":"9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1"} Mar 08 19:54:26 crc kubenswrapper[4885]: I0308 19:54:26.479281 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerStarted","Data":"640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055"} Mar 08 19:54:26 crc kubenswrapper[4885]: I0308 19:54:26.485493 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197","Type":"ContainerStarted","Data":"e45bf247b9d279f13a7ad3c2bb090db1b20d8e474638af4ca7545e5e6c5bd1a9"} Mar 08 19:54:27 crc kubenswrapper[4885]: I0308 19:54:27.500676 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197","Type":"ContainerStarted","Data":"9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b"} Mar 08 19:54:27 crc kubenswrapper[4885]: I0308 19:54:27.500987 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:27 crc kubenswrapper[4885]: I0308 19:54:27.534588 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.5345607709999998 podStartE2EDuration="2.534560771s" podCreationTimestamp="2026-03-08 19:54:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:27.52480205 +0000 UTC m=+1368.920856113" watchObservedRunningTime="2026-03-08 19:54:27.534560771 +0000 UTC m=+1368.930614834" Mar 08 19:54:28 crc kubenswrapper[4885]: I0308 19:54:28.513885 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerStarted","Data":"955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853"} Mar 08 19:54:28 crc kubenswrapper[4885]: I0308 19:54:28.514409 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 19:54:28 crc kubenswrapper[4885]: I0308 19:54:28.543704 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.231132469 podStartE2EDuration="5.543682967s" podCreationTimestamp="2026-03-08 19:54:23 +0000 UTC" firstStartedPulling="2026-03-08 19:54:24.378398891 +0000 UTC m=+1365.774452914" lastFinishedPulling="2026-03-08 19:54:27.690949379 +0000 UTC m=+1369.087003412" observedRunningTime="2026-03-08 19:54:28.533980257 +0000 UTC m=+1369.930034290" watchObservedRunningTime="2026-03-08 19:54:28.543682967 +0000 UTC m=+1369.939737000" Mar 08 19:54:35 crc kubenswrapper[4885]: I0308 19:54:35.880006 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.471577 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-pmgdm"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.473565 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.475600 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.476815 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.505063 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmgdm"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.619597 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.620776 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.624729 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.636707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-config-data\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.637198 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjvvt\" (UniqueName: \"kubernetes.io/projected/72fa5124-24e9-47b1-8522-815cfef2a86b-kube-api-access-jjvvt\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.637354 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-scripts\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.637450 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.638831 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.730619 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.732189 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.735838 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.739103 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.739159 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-config-data\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.739208 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvzmh\" (UniqueName: \"kubernetes.io/projected/699e9586-4fcb-4a93-b479-44d269162645-kube-api-access-zvzmh\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.739248 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.739291 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-config-data\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.739316 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjvvt\" (UniqueName: \"kubernetes.io/projected/72fa5124-24e9-47b1-8522-815cfef2a86b-kube-api-access-jjvvt\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.739349 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-scripts\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.747794 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-scripts\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.754082 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-config-data\") 
pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.755235 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.756138 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.756662 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.763324 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.774450 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.775531 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjvvt\" (UniqueName: \"kubernetes.io/projected/72fa5124-24e9-47b1-8522-815cfef2a86b-kube-api-access-jjvvt\") pod \"nova-cell0-cell-mapping-pmgdm\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.792101 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.824064 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.844823 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f00be59-96a4-4f2f-b319-6435aa008932-logs\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845172 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f43bb9c-a447-4ef0-8cdd-4447d8703193-logs\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845321 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845444 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvzmh\" (UniqueName: \"kubernetes.io/projected/699e9586-4fcb-4a93-b479-44d269162645-kube-api-access-zvzmh\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845562 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845650 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845747 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-config-data\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845844 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgvhx\" (UniqueName: \"kubernetes.io/projected/1f43bb9c-a447-4ef0-8cdd-4447d8703193-kube-api-access-xgvhx\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.845972 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-config-data\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.846127 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9qpj\" (UniqueName: \"kubernetes.io/projected/5f00be59-96a4-4f2f-b319-6435aa008932-kube-api-access-z9qpj\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.846218 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-config-data\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.852028 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-8dnjm"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.853628 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.865020 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.866413 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.870134 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.870953 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-config-data\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.883028 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.888067 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.888133 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-8dnjm"] Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.918693 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvzmh\" (UniqueName: \"kubernetes.io/projected/699e9586-4fcb-4a93-b479-44d269162645-kube-api-access-zvzmh\") pod \"nova-scheduler-0\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.939634 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948094 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9qpj\" (UniqueName: \"kubernetes.io/projected/5f00be59-96a4-4f2f-b319-6435aa008932-kube-api-access-z9qpj\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948642 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-config-data\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948693 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948736 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948786 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f00be59-96a4-4f2f-b319-6435aa008932-logs\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948811 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948842 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f43bb9c-a447-4ef0-8cdd-4447d8703193-logs\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948875 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948897 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948950 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcznq\" (UniqueName: \"kubernetes.io/projected/515ba29a-53ae-41dc-a444-9ffe060dc61f-kube-api-access-zcznq\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.948993 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.949035 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-config-data\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.949058 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.949080 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.949097 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-config\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.949115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frh8b\" (UniqueName: \"kubernetes.io/projected/0274624f-a49d-425f-b025-753e4e174477-kube-api-access-frh8b\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.949139 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgvhx\" (UniqueName: \"kubernetes.io/projected/1f43bb9c-a447-4ef0-8cdd-4447d8703193-kube-api-access-xgvhx\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.950946 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f00be59-96a4-4f2f-b319-6435aa008932-logs\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.952878 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1f43bb9c-a447-4ef0-8cdd-4447d8703193-logs\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.958436 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-config-data\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.959474 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-config-data\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.964437 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.969292 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.972050 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9qpj\" (UniqueName: \"kubernetes.io/projected/5f00be59-96a4-4f2f-b319-6435aa008932-kube-api-access-z9qpj\") pod \"nova-api-0\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " pod="openstack/nova-api-0" Mar 08 19:54:36 crc kubenswrapper[4885]: I0308 19:54:36.982967 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgvhx\" (UniqueName: \"kubernetes.io/projected/1f43bb9c-a447-4ef0-8cdd-4447d8703193-kube-api-access-xgvhx\") pod \"nova-metadata-0\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " pod="openstack/nova-metadata-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.018414 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.059700 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.059812 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.059868 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zcznq\" (UniqueName: \"kubernetes.io/projected/515ba29a-53ae-41dc-a444-9ffe060dc61f-kube-api-access-zcznq\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.060983 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.061042 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.061129 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-config\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.061175 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frh8b\" (UniqueName: \"kubernetes.io/projected/0274624f-a49d-425f-b025-753e4e174477-kube-api-access-frh8b\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.061352 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.061417 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc 
kubenswrapper[4885]: I0308 19:54:37.062474 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-swift-storage-0\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.065657 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-config\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.066230 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-nb\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.066272 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.066962 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-sb\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.068539 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-svc\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.070724 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.085769 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcznq\" (UniqueName: \"kubernetes.io/projected/515ba29a-53ae-41dc-a444-9ffe060dc61f-kube-api-access-zcznq\") pod \"nova-cell1-novncproxy-0\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.089210 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frh8b\" (UniqueName: \"kubernetes.io/projected/0274624f-a49d-425f-b025-753e4e174477-kube-api-access-frh8b\") pod \"dnsmasq-dns-7bd5679c8c-8dnjm\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.112766 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.153361 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.389666 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.670661 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.678260 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmgdm"] Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.891420 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:37 crc kubenswrapper[4885]: W0308 19:54:37.901335 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod515ba29a_53ae_41dc_a444_9ffe060dc61f.slice/crio-93b0b9013aeb93d202bb43a6cf260d8de9e26ddf4b972302a13aaf4bbc7119c7 WatchSource:0}: Error finding container 93b0b9013aeb93d202bb43a6cf260d8de9e26ddf4b972302a13aaf4bbc7119c7: Status 404 returned error can't find the container with id 93b0b9013aeb93d202bb43a6cf260d8de9e26ddf4b972302a13aaf4bbc7119c7 Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.901450 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.998359 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7jc2k"] Mar 08 19:54:37 crc kubenswrapper[4885]: I0308 19:54:37.999565 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.001648 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.006093 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.033999 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7jc2k"] Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.057184 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-8dnjm"] Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.065230 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:38 crc kubenswrapper[4885]: W0308 19:54:38.070298 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f43bb9c_a447_4ef0_8cdd_4447d8703193.slice/crio-8e7cdd4e54b7967bac306515e8c7e9771755d86ab276c44dacbac33677726424 WatchSource:0}: Error finding container 8e7cdd4e54b7967bac306515e8c7e9771755d86ab276c44dacbac33677726424: Status 404 returned error can't find the container with id 8e7cdd4e54b7967bac306515e8c7e9771755d86ab276c44dacbac33677726424 Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.095866 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-scripts\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.095997 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc57k\" (UniqueName: \"kubernetes.io/projected/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-kube-api-access-pc57k\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.096091 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-config-data\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.096119 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.197366 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-scripts\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: 
I0308 19:54:38.197455 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc57k\" (UniqueName: \"kubernetes.io/projected/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-kube-api-access-pc57k\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.197525 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-config-data\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.197550 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.201461 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-config-data\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.201510 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.201868 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-scripts\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.214476 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc57k\" (UniqueName: \"kubernetes.io/projected/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-kube-api-access-pc57k\") pod \"nova-cell1-conductor-db-sync-7jc2k\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.318462 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.647680 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmgdm" event={"ID":"72fa5124-24e9-47b1-8522-815cfef2a86b","Type":"ContainerStarted","Data":"6794573adf705b25644869fa29cf7b121bfba4201d8cc4c2b8d4234d9e883a1d"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.648044 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmgdm" event={"ID":"72fa5124-24e9-47b1-8522-815cfef2a86b","Type":"ContainerStarted","Data":"251a9a13da15a33f312746f93ec7e90b7cec5cac915228321a2a19b174040f9e"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.661594 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"515ba29a-53ae-41dc-a444-9ffe060dc61f","Type":"ContainerStarted","Data":"93b0b9013aeb93d202bb43a6cf260d8de9e26ddf4b972302a13aaf4bbc7119c7"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.663395 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f43bb9c-a447-4ef0-8cdd-4447d8703193","Type":"ContainerStarted","Data":"8e7cdd4e54b7967bac306515e8c7e9771755d86ab276c44dacbac33677726424"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.670837 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5f00be59-96a4-4f2f-b319-6435aa008932","Type":"ContainerStarted","Data":"44a4da7dfb75724690fd74611aac2ae815bfdeff07d9af1d06d6b4bf6e5274e1"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.672147 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-pmgdm" podStartSLOduration=2.672128622 podStartE2EDuration="2.672128622s" podCreationTimestamp="2026-03-08 19:54:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:38.666278726 +0000 UTC m=+1380.062332759" watchObservedRunningTime="2026-03-08 19:54:38.672128622 +0000 UTC m=+1380.068182645" Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.672657 4885 generic.go:334] "Generic (PLEG): container finished" podID="0274624f-a49d-425f-b025-753e4e174477" containerID="bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7" exitCode=0 Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.672824 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" event={"ID":"0274624f-a49d-425f-b025-753e4e174477","Type":"ContainerDied","Data":"bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.672846 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" event={"ID":"0274624f-a49d-425f-b025-753e4e174477","Type":"ContainerStarted","Data":"e0f14b4c223b387a85388f249c4bcd0b49cb89c791c76c20d6484eb655e195a3"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.682169 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"699e9586-4fcb-4a93-b479-44d269162645","Type":"ContainerStarted","Data":"28d9f2d30308ef7519a870cfda7992bd0f04e5904fdb1fac5d4c1b4a105e7de5"} Mar 08 19:54:38 crc kubenswrapper[4885]: I0308 19:54:38.836422 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7jc2k"] Mar 08 
19:54:38 crc kubenswrapper[4885]: W0308 19:54:38.858235 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ea65f9e_cbf1_47a6_8800_aa6b7fe9ffef.slice/crio-3b6c61eb70ff155909795960839c6c1ae45750413aff9ca488f8dad001880faa WatchSource:0}: Error finding container 3b6c61eb70ff155909795960839c6c1ae45750413aff9ca488f8dad001880faa: Status 404 returned error can't find the container with id 3b6c61eb70ff155909795960839c6c1ae45750413aff9ca488f8dad001880faa Mar 08 19:54:39 crc kubenswrapper[4885]: I0308 19:54:39.694811 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" event={"ID":"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef","Type":"ContainerStarted","Data":"72f8ee44be245d4136285cfdfda421e5c74196d06b96d20eec24f989618614f0"} Mar 08 19:54:39 crc kubenswrapper[4885]: I0308 19:54:39.695210 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" event={"ID":"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef","Type":"ContainerStarted","Data":"3b6c61eb70ff155909795960839c6c1ae45750413aff9ca488f8dad001880faa"} Mar 08 19:54:39 crc kubenswrapper[4885]: I0308 19:54:39.701366 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" event={"ID":"0274624f-a49d-425f-b025-753e4e174477","Type":"ContainerStarted","Data":"65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db"} Mar 08 19:54:39 crc kubenswrapper[4885]: I0308 19:54:39.701436 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:39 crc kubenswrapper[4885]: I0308 19:54:39.737091 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" podStartSLOduration=2.737071124 podStartE2EDuration="2.737071124s" podCreationTimestamp="2026-03-08 19:54:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:39.726269315 +0000 UTC m=+1381.122323338" watchObservedRunningTime="2026-03-08 19:54:39.737071124 +0000 UTC m=+1381.133125157" Mar 08 19:54:39 crc kubenswrapper[4885]: I0308 19:54:39.753400 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" podStartSLOduration=3.753345729 podStartE2EDuration="3.753345729s" podCreationTimestamp="2026-03-08 19:54:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:39.750310908 +0000 UTC m=+1381.146364951" watchObservedRunningTime="2026-03-08 19:54:39.753345729 +0000 UTC m=+1381.149399752" Mar 08 19:54:40 crc kubenswrapper[4885]: I0308 19:54:40.785443 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:40 crc kubenswrapper[4885]: I0308 19:54:40.810245 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.766613 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5f00be59-96a4-4f2f-b319-6435aa008932","Type":"ContainerStarted","Data":"52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d"} Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.767534 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"5f00be59-96a4-4f2f-b319-6435aa008932","Type":"ContainerStarted","Data":"1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff"} Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.768468 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"699e9586-4fcb-4a93-b479-44d269162645","Type":"ContainerStarted","Data":"eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922"} Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.771772 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"515ba29a-53ae-41dc-a444-9ffe060dc61f","Type":"ContainerStarted","Data":"4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e"} Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.771898 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="515ba29a-53ae-41dc-a444-9ffe060dc61f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e" gracePeriod=30 Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.780489 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f43bb9c-a447-4ef0-8cdd-4447d8703193","Type":"ContainerStarted","Data":"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395"} Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.780533 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f43bb9c-a447-4ef0-8cdd-4447d8703193","Type":"ContainerStarted","Data":"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556"} Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.780669 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-log" containerID="cri-o://4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556" gracePeriod=30 Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.780899 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-metadata" containerID="cri-o://5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395" gracePeriod=30 Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.799692 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.092796799 podStartE2EDuration="6.799641177s" podCreationTimestamp="2026-03-08 19:54:36 +0000 UTC" firstStartedPulling="2026-03-08 19:54:37.898909293 +0000 UTC m=+1379.294963316" lastFinishedPulling="2026-03-08 19:54:41.605753671 +0000 UTC m=+1383.001807694" observedRunningTime="2026-03-08 19:54:42.793429281 +0000 UTC m=+1384.189483314" watchObservedRunningTime="2026-03-08 19:54:42.799641177 +0000 UTC m=+1384.195695210" Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.816729 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.315089473 podStartE2EDuration="6.816705464s" podCreationTimestamp="2026-03-08 19:54:36 +0000 UTC" firstStartedPulling="2026-03-08 19:54:38.076196162 +0000 UTC m=+1379.472250185" lastFinishedPulling="2026-03-08 19:54:41.577812163 +0000 UTC m=+1382.973866176" 
observedRunningTime="2026-03-08 19:54:42.815408369 +0000 UTC m=+1384.211462432" watchObservedRunningTime="2026-03-08 19:54:42.816705464 +0000 UTC m=+1384.212759527" Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.863017 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.190156985 podStartE2EDuration="6.862996973s" podCreationTimestamp="2026-03-08 19:54:36 +0000 UTC" firstStartedPulling="2026-03-08 19:54:37.904586235 +0000 UTC m=+1379.300640258" lastFinishedPulling="2026-03-08 19:54:41.577426223 +0000 UTC m=+1382.973480246" observedRunningTime="2026-03-08 19:54:42.855102963 +0000 UTC m=+1384.251156996" watchObservedRunningTime="2026-03-08 19:54:42.862996973 +0000 UTC m=+1384.259051006" Mar 08 19:54:42 crc kubenswrapper[4885]: I0308 19:54:42.867885 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.9671046629999998 podStartE2EDuration="6.867874525s" podCreationTimestamp="2026-03-08 19:54:36 +0000 UTC" firstStartedPulling="2026-03-08 19:54:37.676433645 +0000 UTC m=+1379.072487668" lastFinishedPulling="2026-03-08 19:54:41.577203497 +0000 UTC m=+1382.973257530" observedRunningTime="2026-03-08 19:54:42.838493358 +0000 UTC m=+1384.234547391" watchObservedRunningTime="2026-03-08 19:54:42.867874525 +0000 UTC m=+1384.263928548" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.356307 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.528162 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgvhx\" (UniqueName: \"kubernetes.io/projected/1f43bb9c-a447-4ef0-8cdd-4447d8703193-kube-api-access-xgvhx\") pod \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.528320 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f43bb9c-a447-4ef0-8cdd-4447d8703193-logs\") pod \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.528420 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-combined-ca-bundle\") pod \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.528442 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-config-data\") pod \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\" (UID: \"1f43bb9c-a447-4ef0-8cdd-4447d8703193\") " Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.528691 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f43bb9c-a447-4ef0-8cdd-4447d8703193-logs" (OuterVolumeSpecName: "logs") pod "1f43bb9c-a447-4ef0-8cdd-4447d8703193" (UID: "1f43bb9c-a447-4ef0-8cdd-4447d8703193"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.534349 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f43bb9c-a447-4ef0-8cdd-4447d8703193-kube-api-access-xgvhx" (OuterVolumeSpecName: "kube-api-access-xgvhx") pod "1f43bb9c-a447-4ef0-8cdd-4447d8703193" (UID: "1f43bb9c-a447-4ef0-8cdd-4447d8703193"). InnerVolumeSpecName "kube-api-access-xgvhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.569151 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-config-data" (OuterVolumeSpecName: "config-data") pod "1f43bb9c-a447-4ef0-8cdd-4447d8703193" (UID: "1f43bb9c-a447-4ef0-8cdd-4447d8703193"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.575624 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f43bb9c-a447-4ef0-8cdd-4447d8703193" (UID: "1f43bb9c-a447-4ef0-8cdd-4447d8703193"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.630945 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgvhx\" (UniqueName: \"kubernetes.io/projected/1f43bb9c-a447-4ef0-8cdd-4447d8703193-kube-api-access-xgvhx\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.630983 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1f43bb9c-a447-4ef0-8cdd-4447d8703193-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.630998 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.631006 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f43bb9c-a447-4ef0-8cdd-4447d8703193-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.790561 4885 generic.go:334] "Generic (PLEG): container finished" podID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerID="5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395" exitCode=0 Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.790599 4885 generic.go:334] "Generic (PLEG): container finished" podID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerID="4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556" exitCode=143 Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.790619 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.790668 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f43bb9c-a447-4ef0-8cdd-4447d8703193","Type":"ContainerDied","Data":"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395"} Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.790724 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f43bb9c-a447-4ef0-8cdd-4447d8703193","Type":"ContainerDied","Data":"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556"} Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.790743 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1f43bb9c-a447-4ef0-8cdd-4447d8703193","Type":"ContainerDied","Data":"8e7cdd4e54b7967bac306515e8c7e9771755d86ab276c44dacbac33677726424"} Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.790762 4885 scope.go:117] "RemoveContainer" containerID="5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.828989 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.845554 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.858886 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:43 crc kubenswrapper[4885]: E0308 19:54:43.859299 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-metadata" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.859318 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-metadata" Mar 08 19:54:43 crc kubenswrapper[4885]: E0308 19:54:43.859332 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-log" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.859338 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-log" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.859505 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-metadata" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.859525 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" containerName="nova-metadata-log" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.860527 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.860723 4885 scope.go:117] "RemoveContainer" containerID="4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.863189 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.868058 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.872712 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.916622 4885 scope.go:117] "RemoveContainer" containerID="5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395" Mar 08 19:54:43 crc kubenswrapper[4885]: E0308 19:54:43.917432 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395\": container with ID starting with 5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395 not found: ID does not exist" containerID="5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.917468 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395"} err="failed to get container status \"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395\": rpc error: code = NotFound desc = could not find container \"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395\": container with ID starting with 5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395 not found: ID does not exist" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.917525 4885 scope.go:117] "RemoveContainer" containerID="4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556" Mar 08 19:54:43 crc kubenswrapper[4885]: E0308 19:54:43.918288 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556\": container with ID starting with 4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556 not found: ID does not exist" containerID="4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.918340 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556"} err="failed to get container status \"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556\": rpc error: code = NotFound desc = could not find container \"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556\": container with ID starting with 4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556 not found: ID does not exist" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.918408 4885 scope.go:117] "RemoveContainer" containerID="5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.919026 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395"} err="failed to get container status \"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395\": rpc error: code = NotFound desc = could not find container \"5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395\": container with ID starting with 5ff23d6e6ca45f829f68b6fb2253d4892bc0b98845735d738b8633c391892395 not found: ID does not exist" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.919077 4885 scope.go:117] "RemoveContainer" containerID="4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556" Mar 08 19:54:43 crc kubenswrapper[4885]: I0308 19:54:43.920298 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556"} err="failed to get container status \"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556\": rpc error: code = NotFound desc = could not find container \"4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556\": container with ID starting with 4341825aaf8376abb17afef586a41d7a9c9fcd0fec05c4d2524913d2ffdef556 not found: ID does not exist" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.039087 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mthg\" (UniqueName: \"kubernetes.io/projected/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-kube-api-access-2mthg\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.039133 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.039166 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-config-data\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.039210 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.039538 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-logs\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.142030 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mthg\" (UniqueName: \"kubernetes.io/projected/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-kube-api-access-2mthg\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc 
kubenswrapper[4885]: I0308 19:54:44.142101 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.142157 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-config-data\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.142233 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.142311 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-logs\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.142824 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-logs\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.145967 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.147611 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-config-data\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.159303 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.166805 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mthg\" (UniqueName: \"kubernetes.io/projected/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-kube-api-access-2mthg\") pod \"nova-metadata-0\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.202848 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.733884 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:44 crc kubenswrapper[4885]: I0308 19:54:44.807655 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7db4ab0b-c8b4-4809-8ff5-31f5175de78d","Type":"ContainerStarted","Data":"44b459aaa1e4b5454e9eaf60660b8259d3e8c050ced5809387d49579fc3941c7"} Mar 08 19:54:45 crc kubenswrapper[4885]: E0308 19:54:45.248074 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72fa5124_24e9_47b1_8522_815cfef2a86b.slice/crio-conmon-6794573adf705b25644869fa29cf7b121bfba4201d8cc4c2b8d4234d9e883a1d.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:54:45 crc kubenswrapper[4885]: I0308 19:54:45.381174 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f43bb9c-a447-4ef0-8cdd-4447d8703193" path="/var/lib/kubelet/pods/1f43bb9c-a447-4ef0-8cdd-4447d8703193/volumes" Mar 08 19:54:45 crc kubenswrapper[4885]: I0308 19:54:45.824309 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7db4ab0b-c8b4-4809-8ff5-31f5175de78d","Type":"ContainerStarted","Data":"524c8a726c09b6bf03417223c87a06d2d37402e32a2b0f4addd79d17ce8ac52c"} Mar 08 19:54:45 crc kubenswrapper[4885]: I0308 19:54:45.824463 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7db4ab0b-c8b4-4809-8ff5-31f5175de78d","Type":"ContainerStarted","Data":"0ea0d064913557c827d5cc82c34118e0a93078364e86541377cb2d10a87cbb38"} Mar 08 19:54:45 crc kubenswrapper[4885]: I0308 19:54:45.827115 4885 generic.go:334] "Generic (PLEG): container finished" podID="72fa5124-24e9-47b1-8522-815cfef2a86b" containerID="6794573adf705b25644869fa29cf7b121bfba4201d8cc4c2b8d4234d9e883a1d" exitCode=0 Mar 08 19:54:45 crc kubenswrapper[4885]: I0308 19:54:45.827232 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmgdm" event={"ID":"72fa5124-24e9-47b1-8522-815cfef2a86b","Type":"ContainerDied","Data":"6794573adf705b25644869fa29cf7b121bfba4201d8cc4c2b8d4234d9e883a1d"} Mar 08 19:54:45 crc kubenswrapper[4885]: I0308 19:54:45.858864 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.85883683 podStartE2EDuration="2.85883683s" podCreationTimestamp="2026-03-08 19:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:45.840976631 +0000 UTC m=+1387.237030654" watchObservedRunningTime="2026-03-08 19:54:45.85883683 +0000 UTC m=+1387.254890893" Mar 08 19:54:46 crc kubenswrapper[4885]: I0308 19:54:46.844843 4885 generic.go:334] "Generic (PLEG): container finished" podID="1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" containerID="72f8ee44be245d4136285cfdfda421e5c74196d06b96d20eec24f989618614f0" exitCode=0 Mar 08 19:54:46 crc kubenswrapper[4885]: I0308 19:54:46.844961 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" event={"ID":"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef","Type":"ContainerDied","Data":"72f8ee44be245d4136285cfdfda421e5c74196d06b96d20eec24f989618614f0"} Mar 08 19:54:46 crc kubenswrapper[4885]: I0308 19:54:46.941212 4885 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 19:54:46 crc kubenswrapper[4885]: I0308 19:54:46.941292 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 19:54:46 crc kubenswrapper[4885]: I0308 19:54:46.990796 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.023138 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.023200 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.114228 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.327392 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.392177 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.447319 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"] Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.449878 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerName="dnsmasq-dns" containerID="cri-o://88950a0ff1c85394753db02c54eed22b3bf6bb84a7de33f37b2a9f7d03adb063" gracePeriod=10 Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.516604 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-combined-ca-bundle\") pod \"72fa5124-24e9-47b1-8522-815cfef2a86b\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.516871 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-config-data\") pod \"72fa5124-24e9-47b1-8522-815cfef2a86b\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.516895 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-scripts\") pod \"72fa5124-24e9-47b1-8522-815cfef2a86b\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.516994 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjvvt\" (UniqueName: \"kubernetes.io/projected/72fa5124-24e9-47b1-8522-815cfef2a86b-kube-api-access-jjvvt\") pod \"72fa5124-24e9-47b1-8522-815cfef2a86b\" (UID: \"72fa5124-24e9-47b1-8522-815cfef2a86b\") " Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.523161 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-scripts" (OuterVolumeSpecName: "scripts") pod "72fa5124-24e9-47b1-8522-815cfef2a86b" (UID: 
"72fa5124-24e9-47b1-8522-815cfef2a86b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.524013 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72fa5124-24e9-47b1-8522-815cfef2a86b-kube-api-access-jjvvt" (OuterVolumeSpecName: "kube-api-access-jjvvt") pod "72fa5124-24e9-47b1-8522-815cfef2a86b" (UID: "72fa5124-24e9-47b1-8522-815cfef2a86b"). InnerVolumeSpecName "kube-api-access-jjvvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.544516 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72fa5124-24e9-47b1-8522-815cfef2a86b" (UID: "72fa5124-24e9-47b1-8522-815cfef2a86b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.557822 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-config-data" (OuterVolumeSpecName: "config-data") pod "72fa5124-24e9-47b1-8522-815cfef2a86b" (UID: "72fa5124-24e9-47b1-8522-815cfef2a86b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.622842 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjvvt\" (UniqueName: \"kubernetes.io/projected/72fa5124-24e9-47b1-8522-815cfef2a86b-kube-api-access-jjvvt\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.622945 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.622956 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.622964 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72fa5124-24e9-47b1-8522-815cfef2a86b-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.663624 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.165:5353: connect: connection refused" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.900284 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-pmgdm" event={"ID":"72fa5124-24e9-47b1-8522-815cfef2a86b","Type":"ContainerDied","Data":"251a9a13da15a33f312746f93ec7e90b7cec5cac915228321a2a19b174040f9e"} Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.900330 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="251a9a13da15a33f312746f93ec7e90b7cec5cac915228321a2a19b174040f9e" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.900424 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-pmgdm" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.912186 4885 generic.go:334] "Generic (PLEG): container finished" podID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerID="88950a0ff1c85394753db02c54eed22b3bf6bb84a7de33f37b2a9f7d03adb063" exitCode=0 Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.912441 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" event={"ID":"d087e374-bcc9-4a44-8fbe-aee43a47115e","Type":"ContainerDied","Data":"88950a0ff1c85394753db02c54eed22b3bf6bb84a7de33f37b2a9f7d03adb063"} Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.941989 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 19:54:47 crc kubenswrapper[4885]: I0308 19:54:47.980572 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.029985 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r6vd\" (UniqueName: \"kubernetes.io/projected/d087e374-bcc9-4a44-8fbe-aee43a47115e-kube-api-access-9r6vd\") pod \"d087e374-bcc9-4a44-8fbe-aee43a47115e\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.030086 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-svc\") pod \"d087e374-bcc9-4a44-8fbe-aee43a47115e\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.030115 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-swift-storage-0\") pod \"d087e374-bcc9-4a44-8fbe-aee43a47115e\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.030176 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-sb\") pod \"d087e374-bcc9-4a44-8fbe-aee43a47115e\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.030211 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-nb\") pod \"d087e374-bcc9-4a44-8fbe-aee43a47115e\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.030244 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-config\") pod \"d087e374-bcc9-4a44-8fbe-aee43a47115e\" (UID: \"d087e374-bcc9-4a44-8fbe-aee43a47115e\") " Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.074248 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d087e374-bcc9-4a44-8fbe-aee43a47115e-kube-api-access-9r6vd" (OuterVolumeSpecName: "kube-api-access-9r6vd") pod "d087e374-bcc9-4a44-8fbe-aee43a47115e" (UID: "d087e374-bcc9-4a44-8fbe-aee43a47115e"). InnerVolumeSpecName "kube-api-access-9r6vd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.110392 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.110400 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.110864 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.110904 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-log" containerID="cri-o://1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff" gracePeriod=30 Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.111038 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-api" containerID="cri-o://52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d" gracePeriod=30 Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.134106 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r6vd\" (UniqueName: \"kubernetes.io/projected/d087e374-bcc9-4a44-8fbe-aee43a47115e-kube-api-access-9r6vd\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.146580 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-config" (OuterVolumeSpecName: "config") pod "d087e374-bcc9-4a44-8fbe-aee43a47115e" (UID: "d087e374-bcc9-4a44-8fbe-aee43a47115e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.176275 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.176499 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-log" containerID="cri-o://0ea0d064913557c827d5cc82c34118e0a93078364e86541377cb2d10a87cbb38" gracePeriod=30 Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.177000 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-metadata" containerID="cri-o://524c8a726c09b6bf03417223c87a06d2d37402e32a2b0f4addd79d17ce8ac52c" gracePeriod=30 Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.235184 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.237280 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d087e374-bcc9-4a44-8fbe-aee43a47115e" (UID: "d087e374-bcc9-4a44-8fbe-aee43a47115e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.237404 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d087e374-bcc9-4a44-8fbe-aee43a47115e" (UID: "d087e374-bcc9-4a44-8fbe-aee43a47115e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.268974 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d087e374-bcc9-4a44-8fbe-aee43a47115e" (UID: "d087e374-bcc9-4a44-8fbe-aee43a47115e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.299452 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d087e374-bcc9-4a44-8fbe-aee43a47115e" (UID: "d087e374-bcc9-4a44-8fbe-aee43a47115e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.341098 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.341139 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.341154 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.341167 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d087e374-bcc9-4a44-8fbe-aee43a47115e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.427660 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.441469 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-config-data\") pod \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.441531 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc57k\" (UniqueName: \"kubernetes.io/projected/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-kube-api-access-pc57k\") pod \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.441601 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-combined-ca-bundle\") pod \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.441668 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-scripts\") pod \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\" (UID: \"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef\") " Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.448154 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-kube-api-access-pc57k" (OuterVolumeSpecName: "kube-api-access-pc57k") pod "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" (UID: "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef"). InnerVolumeSpecName "kube-api-access-pc57k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.452050 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-scripts" (OuterVolumeSpecName: "scripts") pod "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" (UID: "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.479562 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" (UID: "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.490417 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-config-data" (OuterVolumeSpecName: "config-data") pod "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" (UID: "1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.543439 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.543476 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.543487 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc57k\" (UniqueName: \"kubernetes.io/projected/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-kube-api-access-pc57k\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.543497 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.583528 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.951035 4885 generic.go:334] "Generic (PLEG): container finished" podID="5f00be59-96a4-4f2f-b319-6435aa008932" containerID="1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff" exitCode=143 Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.951301 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5f00be59-96a4-4f2f-b319-6435aa008932","Type":"ContainerDied","Data":"1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff"} Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954104 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 19:54:48 crc kubenswrapper[4885]: E0308 19:54:48.954527 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerName="dnsmasq-dns" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954544 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerName="dnsmasq-dns" Mar 08 19:54:48 crc kubenswrapper[4885]: E0308 19:54:48.954580 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72fa5124-24e9-47b1-8522-815cfef2a86b" containerName="nova-manage" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954587 4885 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="72fa5124-24e9-47b1-8522-815cfef2a86b" containerName="nova-manage" Mar 08 19:54:48 crc kubenswrapper[4885]: E0308 19:54:48.954599 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerName="init" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954605 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerName="init" Mar 08 19:54:48 crc kubenswrapper[4885]: E0308 19:54:48.954615 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" containerName="nova-cell1-conductor-db-sync" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954622 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" containerName="nova-cell1-conductor-db-sync" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954790 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="72fa5124-24e9-47b1-8522-815cfef2a86b" containerName="nova-manage" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954817 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" containerName="nova-cell1-conductor-db-sync" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.954827 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" containerName="dnsmasq-dns" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.955445 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.962388 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" event={"ID":"d087e374-bcc9-4a44-8fbe-aee43a47115e","Type":"ContainerDied","Data":"4547972efa3892226729dccf70e00d854a9c1e79c44132cd7c28be08c974628a"} Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.962432 4885 scope.go:117] "RemoveContainer" containerID="88950a0ff1c85394753db02c54eed22b3bf6bb84a7de33f37b2a9f7d03adb063" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.962561 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b8fcc65cc-lpg8x" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.976816 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.982285 4885 generic.go:334] "Generic (PLEG): container finished" podID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerID="524c8a726c09b6bf03417223c87a06d2d37402e32a2b0f4addd79d17ce8ac52c" exitCode=0 Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.982316 4885 generic.go:334] "Generic (PLEG): container finished" podID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerID="0ea0d064913557c827d5cc82c34118e0a93078364e86541377cb2d10a87cbb38" exitCode=143 Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.982356 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7db4ab0b-c8b4-4809-8ff5-31f5175de78d","Type":"ContainerDied","Data":"524c8a726c09b6bf03417223c87a06d2d37402e32a2b0f4addd79d17ce8ac52c"} Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.982381 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7db4ab0b-c8b4-4809-8ff5-31f5175de78d","Type":"ContainerDied","Data":"0ea0d064913557c827d5cc82c34118e0a93078364e86541377cb2d10a87cbb38"} Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.987593 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.994650 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-7jc2k" event={"ID":"1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef","Type":"ContainerDied","Data":"3b6c61eb70ff155909795960839c6c1ae45750413aff9ca488f8dad001880faa"} Mar 08 19:54:48 crc kubenswrapper[4885]: I0308 19:54:48.994677 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b6c61eb70ff155909795960839c6c1ae45750413aff9ca488f8dad001880faa" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.013883 4885 scope.go:117] "RemoveContainer" containerID="41bd07d83b5b4958e58a7473f1f938d73689ec0cd631180b50c3f160c3251d1c" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.031966 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"] Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.039366 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.039551 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b8fcc65cc-lpg8x"] Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.053568 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mthg\" (UniqueName: \"kubernetes.io/projected/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-kube-api-access-2mthg\") pod \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.053817 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-config-data\") pod \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.053915 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-nova-metadata-tls-certs\") pod \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.054083 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-combined-ca-bundle\") pod \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.054164 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-logs\") pod \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\" (UID: \"7db4ab0b-c8b4-4809-8ff5-31f5175de78d\") " Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.054542 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.054628 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs4qz\" (UniqueName: \"kubernetes.io/projected/9bbdf164-51e7-4faf-986b-fba5044fad2b-kube-api-access-hs4qz\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.054723 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.058414 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-logs" (OuterVolumeSpecName: "logs") pod "7db4ab0b-c8b4-4809-8ff5-31f5175de78d" (UID: "7db4ab0b-c8b4-4809-8ff5-31f5175de78d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.059118 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-kube-api-access-2mthg" (OuterVolumeSpecName: "kube-api-access-2mthg") pod "7db4ab0b-c8b4-4809-8ff5-31f5175de78d" (UID: "7db4ab0b-c8b4-4809-8ff5-31f5175de78d"). InnerVolumeSpecName "kube-api-access-2mthg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.088763 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7db4ab0b-c8b4-4809-8ff5-31f5175de78d" (UID: "7db4ab0b-c8b4-4809-8ff5-31f5175de78d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.090412 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-config-data" (OuterVolumeSpecName: "config-data") pod "7db4ab0b-c8b4-4809-8ff5-31f5175de78d" (UID: "7db4ab0b-c8b4-4809-8ff5-31f5175de78d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.114936 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "7db4ab0b-c8b4-4809-8ff5-31f5175de78d" (UID: "7db4ab0b-c8b4-4809-8ff5-31f5175de78d"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156116 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156344 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs4qz\" (UniqueName: \"kubernetes.io/projected/9bbdf164-51e7-4faf-986b-fba5044fad2b-kube-api-access-hs4qz\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156473 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156646 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156723 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156791 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mthg\" (UniqueName: \"kubernetes.io/projected/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-kube-api-access-2mthg\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156857 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.156914 4885 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/7db4ab0b-c8b4-4809-8ff5-31f5175de78d-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.159298 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.159746 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.170170 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs4qz\" (UniqueName: \"kubernetes.io/projected/9bbdf164-51e7-4faf-986b-fba5044fad2b-kube-api-access-hs4qz\") pod 
\"nova-cell1-conductor-0\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.286061 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.386847 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d087e374-bcc9-4a44-8fbe-aee43a47115e" path="/var/lib/kubelet/pods/d087e374-bcc9-4a44-8fbe-aee43a47115e/volumes" Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.802389 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 19:54:49 crc kubenswrapper[4885]: I0308 19:54:49.998209 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9bbdf164-51e7-4faf-986b-fba5044fad2b","Type":"ContainerStarted","Data":"c280fe9fe13ec4f9da8c09591afb05298340012fc23ed25819dbc3970dce1bc0"} Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.000601 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7db4ab0b-c8b4-4809-8ff5-31f5175de78d","Type":"ContainerDied","Data":"44b459aaa1e4b5454e9eaf60660b8259d3e8c050ced5809387d49579fc3941c7"} Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.000610 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.000857 4885 scope.go:117] "RemoveContainer" containerID="524c8a726c09b6bf03417223c87a06d2d37402e32a2b0f4addd79d17ce8ac52c" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.001257 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="699e9586-4fcb-4a93-b479-44d269162645" containerName="nova-scheduler-scheduler" containerID="cri-o://eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" gracePeriod=30 Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.038959 4885 scope.go:117] "RemoveContainer" containerID="0ea0d064913557c827d5cc82c34118e0a93078364e86541377cb2d10a87cbb38" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.040131 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.072684 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.093039 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:50 crc kubenswrapper[4885]: E0308 19:54:50.093784 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-metadata" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.093808 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-metadata" Mar 08 19:54:50 crc kubenswrapper[4885]: E0308 19:54:50.093845 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-log" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.093855 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-log" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.094264 4885 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-log" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.094300 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" containerName="nova-metadata-metadata" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.095737 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.099723 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.099765 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.107711 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.278809 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a54375d5-43ad-493f-87b8-f10b9d6f68f9-logs\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.278882 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-config-data\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.279028 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws8s4\" (UniqueName: \"kubernetes.io/projected/a54375d5-43ad-493f-87b8-f10b9d6f68f9-kube-api-access-ws8s4\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.279065 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.279184 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.381393 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws8s4\" (UniqueName: \"kubernetes.io/projected/a54375d5-43ad-493f-87b8-f10b9d6f68f9-kube-api-access-ws8s4\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.381454 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.381575 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.381755 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a54375d5-43ad-493f-87b8-f10b9d6f68f9-logs\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.381799 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-config-data\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.382798 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a54375d5-43ad-493f-87b8-f10b9d6f68f9-logs\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.388521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-config-data\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.402138 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.405004 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.418128 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws8s4\" (UniqueName: \"kubernetes.io/projected/a54375d5-43ad-493f-87b8-f10b9d6f68f9-kube-api-access-ws8s4\") pod \"nova-metadata-0\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " pod="openstack/nova-metadata-0" Mar 08 19:54:50 crc kubenswrapper[4885]: I0308 19:54:50.717276 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:54:51 crc kubenswrapper[4885]: I0308 19:54:51.007239 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:54:51 crc kubenswrapper[4885]: I0308 19:54:51.014540 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9bbdf164-51e7-4faf-986b-fba5044fad2b","Type":"ContainerStarted","Data":"bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6"} Mar 08 19:54:51 crc kubenswrapper[4885]: I0308 19:54:51.014719 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:51 crc kubenswrapper[4885]: W0308 19:54:51.016124 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda54375d5_43ad_493f_87b8_f10b9d6f68f9.slice/crio-72d850d89769888e1770839347c05fb2c096d5b48c6987176e4df5105daaa18d WatchSource:0}: Error finding container 72d850d89769888e1770839347c05fb2c096d5b48c6987176e4df5105daaa18d: Status 404 returned error can't find the container with id 72d850d89769888e1770839347c05fb2c096d5b48c6987176e4df5105daaa18d Mar 08 19:54:51 crc kubenswrapper[4885]: I0308 19:54:51.046184 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.04616695 podStartE2EDuration="3.04616695s" podCreationTimestamp="2026-03-08 19:54:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:51.033025107 +0000 UTC m=+1392.429079130" watchObservedRunningTime="2026-03-08 19:54:51.04616695 +0000 UTC m=+1392.442220973" Mar 08 19:54:51 crc kubenswrapper[4885]: I0308 19:54:51.389345 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7db4ab0b-c8b4-4809-8ff5-31f5175de78d" path="/var/lib/kubelet/pods/7db4ab0b-c8b4-4809-8ff5-31f5175de78d/volumes" Mar 08 19:54:51 crc kubenswrapper[4885]: E0308 19:54:51.943994 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 19:54:51 crc kubenswrapper[4885]: E0308 19:54:51.946243 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 19:54:51 crc kubenswrapper[4885]: E0308 19:54:51.947992 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 19:54:51 crc kubenswrapper[4885]: E0308 19:54:51.948076 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="699e9586-4fcb-4a93-b479-44d269162645" 
containerName="nova-scheduler-scheduler" Mar 08 19:54:52 crc kubenswrapper[4885]: I0308 19:54:52.029209 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a54375d5-43ad-493f-87b8-f10b9d6f68f9","Type":"ContainerStarted","Data":"bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552"} Mar 08 19:54:52 crc kubenswrapper[4885]: I0308 19:54:52.029288 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a54375d5-43ad-493f-87b8-f10b9d6f68f9","Type":"ContainerStarted","Data":"aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898"} Mar 08 19:54:52 crc kubenswrapper[4885]: I0308 19:54:52.029310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a54375d5-43ad-493f-87b8-f10b9d6f68f9","Type":"ContainerStarted","Data":"72d850d89769888e1770839347c05fb2c096d5b48c6987176e4df5105daaa18d"} Mar 08 19:54:52 crc kubenswrapper[4885]: I0308 19:54:52.054307 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.05428963 podStartE2EDuration="2.05428963s" podCreationTimestamp="2026-03-08 19:54:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:52.051826593 +0000 UTC m=+1393.447880616" watchObservedRunningTime="2026-03-08 19:54:52.05428963 +0000 UTC m=+1393.450343653" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.559577 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.664030 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-combined-ca-bundle\") pod \"699e9586-4fcb-4a93-b479-44d269162645\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.664253 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvzmh\" (UniqueName: \"kubernetes.io/projected/699e9586-4fcb-4a93-b479-44d269162645-kube-api-access-zvzmh\") pod \"699e9586-4fcb-4a93-b479-44d269162645\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.664360 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-config-data\") pod \"699e9586-4fcb-4a93-b479-44d269162645\" (UID: \"699e9586-4fcb-4a93-b479-44d269162645\") " Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.672284 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/699e9586-4fcb-4a93-b479-44d269162645-kube-api-access-zvzmh" (OuterVolumeSpecName: "kube-api-access-zvzmh") pod "699e9586-4fcb-4a93-b479-44d269162645" (UID: "699e9586-4fcb-4a93-b479-44d269162645"). InnerVolumeSpecName "kube-api-access-zvzmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.699646 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-config-data" (OuterVolumeSpecName: "config-data") pod "699e9586-4fcb-4a93-b479-44d269162645" (UID: "699e9586-4fcb-4a93-b479-44d269162645"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.711626 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "699e9586-4fcb-4a93-b479-44d269162645" (UID: "699e9586-4fcb-4a93-b479-44d269162645"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.766097 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvzmh\" (UniqueName: \"kubernetes.io/projected/699e9586-4fcb-4a93-b479-44d269162645-kube-api-access-zvzmh\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.766129 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.766140 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/699e9586-4fcb-4a93-b479-44d269162645-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.891287 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:54:53 crc kubenswrapper[4885]: I0308 19:54:53.901229 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.053073 4885 generic.go:334] "Generic (PLEG): container finished" podID="5f00be59-96a4-4f2f-b319-6435aa008932" containerID="52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d" exitCode=0 Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.053167 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5f00be59-96a4-4f2f-b319-6435aa008932","Type":"ContainerDied","Data":"52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d"} Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.053176 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.054098 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5f00be59-96a4-4f2f-b319-6435aa008932","Type":"ContainerDied","Data":"44a4da7dfb75724690fd74611aac2ae815bfdeff07d9af1d06d6b4bf6e5274e1"} Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.054132 4885 scope.go:117] "RemoveContainer" containerID="52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.056348 4885 generic.go:334] "Generic (PLEG): container finished" podID="699e9586-4fcb-4a93-b479-44d269162645" containerID="eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" exitCode=0 Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.056383 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"699e9586-4fcb-4a93-b479-44d269162645","Type":"ContainerDied","Data":"eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922"} Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.056408 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"699e9586-4fcb-4a93-b479-44d269162645","Type":"ContainerDied","Data":"28d9f2d30308ef7519a870cfda7992bd0f04e5904fdb1fac5d4c1b4a105e7de5"} Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.056435 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.073986 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-config-data\") pod \"5f00be59-96a4-4f2f-b319-6435aa008932\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.074139 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-combined-ca-bundle\") pod \"5f00be59-96a4-4f2f-b319-6435aa008932\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.074184 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9qpj\" (UniqueName: \"kubernetes.io/projected/5f00be59-96a4-4f2f-b319-6435aa008932-kube-api-access-z9qpj\") pod \"5f00be59-96a4-4f2f-b319-6435aa008932\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.075186 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f00be59-96a4-4f2f-b319-6435aa008932-logs\") pod \"5f00be59-96a4-4f2f-b319-6435aa008932\" (UID: \"5f00be59-96a4-4f2f-b319-6435aa008932\") " Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.075569 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f00be59-96a4-4f2f-b319-6435aa008932-logs" (OuterVolumeSpecName: "logs") pod "5f00be59-96a4-4f2f-b319-6435aa008932" (UID: "5f00be59-96a4-4f2f-b319-6435aa008932"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.076671 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f00be59-96a4-4f2f-b319-6435aa008932-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.076847 4885 scope.go:117] "RemoveContainer" containerID="1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.084620 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f00be59-96a4-4f2f-b319-6435aa008932-kube-api-access-z9qpj" (OuterVolumeSpecName: "kube-api-access-z9qpj") pod "5f00be59-96a4-4f2f-b319-6435aa008932" (UID: "5f00be59-96a4-4f2f-b319-6435aa008932"). InnerVolumeSpecName "kube-api-access-z9qpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.095286 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.107834 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f00be59-96a4-4f2f-b319-6435aa008932" (UID: "5f00be59-96a4-4f2f-b319-6435aa008932"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.115442 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.120482 4885 scope.go:117] "RemoveContainer" containerID="52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d" Mar 08 19:54:54 crc kubenswrapper[4885]: E0308 19:54:54.121032 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d\": container with ID starting with 52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d not found: ID does not exist" containerID="52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.121077 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d"} err="failed to get container status \"52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d\": rpc error: code = NotFound desc = could not find container \"52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d\": container with ID starting with 52cff204c42fbf2d17ed28185aac162d704b88f806a9d98f4c400d1c451c2b9d not found: ID does not exist" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.121109 4885 scope.go:117] "RemoveContainer" containerID="1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff" Mar 08 19:54:54 crc kubenswrapper[4885]: E0308 19:54:54.121358 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff\": container with ID starting with 1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff not found: ID does not exist" 
containerID="1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.121381 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff"} err="failed to get container status \"1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff\": rpc error: code = NotFound desc = could not find container \"1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff\": container with ID starting with 1cb1d3bf75dd69dece8dc771bcfa5a0e041547ed5df7c90a30fc705eff4f0eff not found: ID does not exist" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.121407 4885 scope.go:117] "RemoveContainer" containerID="eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.123818 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: E0308 19:54:54.124383 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-log" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.124402 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-log" Mar 08 19:54:54 crc kubenswrapper[4885]: E0308 19:54:54.124460 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699e9586-4fcb-4a93-b479-44d269162645" containerName="nova-scheduler-scheduler" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.124470 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="699e9586-4fcb-4a93-b479-44d269162645" containerName="nova-scheduler-scheduler" Mar 08 19:54:54 crc kubenswrapper[4885]: E0308 19:54:54.124487 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-api" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.124496 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-api" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.124723 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="699e9586-4fcb-4a93-b479-44d269162645" containerName="nova-scheduler-scheduler" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.124739 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-api" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.124758 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" containerName="nova-api-log" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.125598 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.127719 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.133407 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.152538 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-config-data" (OuterVolumeSpecName: "config-data") pod "5f00be59-96a4-4f2f-b319-6435aa008932" (UID: "5f00be59-96a4-4f2f-b319-6435aa008932"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.178197 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.178229 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9qpj\" (UniqueName: \"kubernetes.io/projected/5f00be59-96a4-4f2f-b319-6435aa008932-kube-api-access-z9qpj\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.178259 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f00be59-96a4-4f2f-b319-6435aa008932-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.226554 4885 scope.go:117] "RemoveContainer" containerID="eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" Mar 08 19:54:54 crc kubenswrapper[4885]: E0308 19:54:54.227791 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922\": container with ID starting with eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922 not found: ID does not exist" containerID="eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.227831 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922"} err="failed to get container status \"eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922\": rpc error: code = NotFound desc = could not find container \"eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922\": container with ID starting with eacf19025c5ffc88a5f9aa4d70582177f2bcd3c37a0ef78b75eb5e6d518db922 not found: ID does not exist" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.280122 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lzkz\" (UniqueName: \"kubernetes.io/projected/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-kube-api-access-9lzkz\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.280310 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-config-data\") pod \"nova-scheduler-0\" (UID: 
\"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.280504 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.381678 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lzkz\" (UniqueName: \"kubernetes.io/projected/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-kube-api-access-9lzkz\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.381814 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-config-data\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.382613 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.383784 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.385891 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-config-data\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.386240 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.397363 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.408308 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lzkz\" (UniqueName: \"kubernetes.io/projected/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-kube-api-access-9lzkz\") pod \"nova-scheduler-0\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.428526 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.429897 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.434586 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.446832 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.523611 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.586115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-config-data\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.586408 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8p8t\" (UniqueName: \"kubernetes.io/projected/273a661b-19fb-47d2-b1d6-05ddf548f212-kube-api-access-n8p8t\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.586504 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.586552 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/273a661b-19fb-47d2-b1d6-05ddf548f212-logs\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.687607 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8p8t\" (UniqueName: \"kubernetes.io/projected/273a661b-19fb-47d2-b1d6-05ddf548f212-kube-api-access-n8p8t\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.687858 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.687885 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/273a661b-19fb-47d2-b1d6-05ddf548f212-logs\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.687947 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-config-data\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.688435 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/273a661b-19fb-47d2-b1d6-05ddf548f212-logs\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.693196 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-config-data\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.696208 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.704314 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8p8t\" (UniqueName: \"kubernetes.io/projected/273a661b-19fb-47d2-b1d6-05ddf548f212-kube-api-access-n8p8t\") pod \"nova-api-0\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.812576 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:54:54 crc kubenswrapper[4885]: I0308 19:54:54.981892 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:54:54 crc kubenswrapper[4885]: W0308 19:54:54.984310 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0768dc6_7cf7_4bd7_a1de_6a68f604de14.slice/crio-0ed53f5b10de0d8d9229c24ba782e6c571620fd87d68a50336540413e9a25108 WatchSource:0}: Error finding container 0ed53f5b10de0d8d9229c24ba782e6c571620fd87d68a50336540413e9a25108: Status 404 returned error can't find the container with id 0ed53f5b10de0d8d9229c24ba782e6c571620fd87d68a50336540413e9a25108 Mar 08 19:54:55 crc kubenswrapper[4885]: I0308 19:54:55.072076 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0768dc6-7cf7-4bd7-a1de-6a68f604de14","Type":"ContainerStarted","Data":"0ed53f5b10de0d8d9229c24ba782e6c571620fd87d68a50336540413e9a25108"} Mar 08 19:54:55 crc kubenswrapper[4885]: I0308 19:54:55.075312 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:54:55 crc kubenswrapper[4885]: W0308 19:54:55.087593 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod273a661b_19fb_47d2_b1d6_05ddf548f212.slice/crio-8cdb2767b2eabddf4d25eac6833f5ae8cc579f3e077a9d8508fa5553f3b7e1b3 WatchSource:0}: Error finding container 8cdb2767b2eabddf4d25eac6833f5ae8cc579f3e077a9d8508fa5553f3b7e1b3: Status 404 returned error can't find the container with id 8cdb2767b2eabddf4d25eac6833f5ae8cc579f3e077a9d8508fa5553f3b7e1b3 Mar 08 19:54:55 crc kubenswrapper[4885]: I0308 19:54:55.381181 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f00be59-96a4-4f2f-b319-6435aa008932" path="/var/lib/kubelet/pods/5f00be59-96a4-4f2f-b319-6435aa008932/volumes" Mar 08 19:54:55 crc kubenswrapper[4885]: I0308 19:54:55.382179 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="699e9586-4fcb-4a93-b479-44d269162645" 
path="/var/lib/kubelet/pods/699e9586-4fcb-4a93-b479-44d269162645/volumes" Mar 08 19:54:55 crc kubenswrapper[4885]: I0308 19:54:55.717778 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 19:54:55 crc kubenswrapper[4885]: I0308 19:54:55.718075 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 19:54:56 crc kubenswrapper[4885]: I0308 19:54:56.020735 4885 scope.go:117] "RemoveContainer" containerID="027375be475663b68fa34275cf933a5f73118e3902051a04110bd2c7ec89a43e" Mar 08 19:54:56 crc kubenswrapper[4885]: I0308 19:54:56.085735 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0768dc6-7cf7-4bd7-a1de-6a68f604de14","Type":"ContainerStarted","Data":"9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35"} Mar 08 19:54:56 crc kubenswrapper[4885]: I0308 19:54:56.088766 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"273a661b-19fb-47d2-b1d6-05ddf548f212","Type":"ContainerStarted","Data":"a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7"} Mar 08 19:54:56 crc kubenswrapper[4885]: I0308 19:54:56.088808 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"273a661b-19fb-47d2-b1d6-05ddf548f212","Type":"ContainerStarted","Data":"7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582"} Mar 08 19:54:56 crc kubenswrapper[4885]: I0308 19:54:56.088825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"273a661b-19fb-47d2-b1d6-05ddf548f212","Type":"ContainerStarted","Data":"8cdb2767b2eabddf4d25eac6833f5ae8cc579f3e077a9d8508fa5553f3b7e1b3"} Mar 08 19:54:56 crc kubenswrapper[4885]: I0308 19:54:56.119704 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.11967893 podStartE2EDuration="2.11967893s" podCreationTimestamp="2026-03-08 19:54:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:56.108203363 +0000 UTC m=+1397.504257396" watchObservedRunningTime="2026-03-08 19:54:56.11967893 +0000 UTC m=+1397.515732993" Mar 08 19:54:56 crc kubenswrapper[4885]: I0308 19:54:56.139947 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.139929253 podStartE2EDuration="2.139929253s" podCreationTimestamp="2026-03-08 19:54:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:54:56.129327859 +0000 UTC m=+1397.525381882" watchObservedRunningTime="2026-03-08 19:54:56.139929253 +0000 UTC m=+1397.535983276" Mar 08 19:54:57 crc kubenswrapper[4885]: I0308 19:54:57.462394 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:54:57 crc kubenswrapper[4885]: I0308 19:54:57.462877 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="50f2f07f-efc4-4778-944c-d4819f0b0e30" containerName="kube-state-metrics" containerID="cri-o://fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce" gracePeriod=30 Mar 08 19:54:57 crc kubenswrapper[4885]: I0308 19:54:57.944240 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.046804 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxdn7\" (UniqueName: \"kubernetes.io/projected/50f2f07f-efc4-4778-944c-d4819f0b0e30-kube-api-access-cxdn7\") pod \"50f2f07f-efc4-4778-944c-d4819f0b0e30\" (UID: \"50f2f07f-efc4-4778-944c-d4819f0b0e30\") " Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.059116 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f2f07f-efc4-4778-944c-d4819f0b0e30-kube-api-access-cxdn7" (OuterVolumeSpecName: "kube-api-access-cxdn7") pod "50f2f07f-efc4-4778-944c-d4819f0b0e30" (UID: "50f2f07f-efc4-4778-944c-d4819f0b0e30"). InnerVolumeSpecName "kube-api-access-cxdn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.109052 4885 generic.go:334] "Generic (PLEG): container finished" podID="50f2f07f-efc4-4778-944c-d4819f0b0e30" containerID="fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce" exitCode=2 Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.109116 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"50f2f07f-efc4-4778-944c-d4819f0b0e30","Type":"ContainerDied","Data":"fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce"} Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.109153 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"50f2f07f-efc4-4778-944c-d4819f0b0e30","Type":"ContainerDied","Data":"3c7c7d0cb67bdd716e27a921f55c82cceb775a12654fa8ab0fa3866727e30e29"} Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.109180 4885 scope.go:117] "RemoveContainer" containerID="fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.109341 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.131575 4885 scope.go:117] "RemoveContainer" containerID="fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce" Mar 08 19:54:58 crc kubenswrapper[4885]: E0308 19:54:58.132052 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce\": container with ID starting with fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce not found: ID does not exist" containerID="fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.132088 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce"} err="failed to get container status \"fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce\": rpc error: code = NotFound desc = could not find container \"fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce\": container with ID starting with fefbe8c9ff79430210e767e7530d7b6bec7ae305b3486976f5217db76688f0ce not found: ID does not exist" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.152037 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxdn7\" (UniqueName: \"kubernetes.io/projected/50f2f07f-efc4-4778-944c-d4819f0b0e30-kube-api-access-cxdn7\") on node \"crc\" DevicePath \"\"" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.161025 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.180646 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.193392 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:54:58 crc kubenswrapper[4885]: E0308 19:54:58.193779 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50f2f07f-efc4-4778-944c-d4819f0b0e30" containerName="kube-state-metrics" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.193797 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f2f07f-efc4-4778-944c-d4819f0b0e30" containerName="kube-state-metrics" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.194009 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="50f2f07f-efc4-4778-944c-d4819f0b0e30" containerName="kube-state-metrics" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.194706 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.196702 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.197051 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.205334 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.355385 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqx6r\" (UniqueName: \"kubernetes.io/projected/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-api-access-bqx6r\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.355711 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.355731 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.355890 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.457597 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.457693 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.457783 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.457952 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqx6r\" 
(UniqueName: \"kubernetes.io/projected/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-api-access-bqx6r\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.464729 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.464830 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.467845 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.479399 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqx6r\" (UniqueName: \"kubernetes.io/projected/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-api-access-bqx6r\") pod \"kube-state-metrics-0\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " pod="openstack/kube-state-metrics-0" Mar 08 19:54:58 crc kubenswrapper[4885]: I0308 19:54:58.522141 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.032360 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.135770 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"63c3ea8e-9683-45b9-805b-d1049840b0da","Type":"ContainerStarted","Data":"c766b62f5cceadf0886905919f92953b9185d1630ad51bf23b383898036558fd"} Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.185953 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.186276 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-central-agent" containerID="cri-o://608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa" gracePeriod=30 Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.186435 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="proxy-httpd" containerID="cri-o://955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853" gracePeriod=30 Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.186511 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="sg-core" containerID="cri-o://9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1" gracePeriod=30 Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.186571 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-notification-agent" containerID="cri-o://640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055" gracePeriod=30 Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.332560 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.378839 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50f2f07f-efc4-4778-944c-d4819f0b0e30" path="/var/lib/kubelet/pods/50f2f07f-efc4-4778-944c-d4819f0b0e30/volumes" Mar 08 19:54:59 crc kubenswrapper[4885]: I0308 19:54:59.528049 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.147713 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"63c3ea8e-9683-45b9-805b-d1049840b0da","Type":"ContainerStarted","Data":"72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b"} Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.148114 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.151896 4885 generic.go:334] "Generic (PLEG): container finished" podID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerID="955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853" exitCode=0 Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.151979 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerID="9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1" exitCode=2 Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.151993 4885 generic.go:334] "Generic (PLEG): container finished" podID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerID="608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa" exitCode=0 Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.152012 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerDied","Data":"955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853"} Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.152086 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerDied","Data":"9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1"} Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.152105 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerDied","Data":"608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa"} Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.173492 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.792448626 podStartE2EDuration="2.173461071s" podCreationTimestamp="2026-03-08 19:54:58 +0000 UTC" firstStartedPulling="2026-03-08 19:54:59.034330412 +0000 UTC m=+1400.430384435" lastFinishedPulling="2026-03-08 19:54:59.415342847 +0000 UTC m=+1400.811396880" observedRunningTime="2026-03-08 19:55:00.166242018 +0000 UTC m=+1401.562296061" watchObservedRunningTime="2026-03-08 19:55:00.173461071 +0000 UTC m=+1401.569515104" Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.717951 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 19:55:00 crc kubenswrapper[4885]: I0308 19:55:00.718355 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.729411 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.738812 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.738951 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.834393 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-config-data\") pod \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.834587 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-log-httpd\") pod \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.834658 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-combined-ca-bundle\") pod \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.834730 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-sg-core-conf-yaml\") pod \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.835135 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d0b0d3fd-e485-4924-8a1b-6214b7840e52" (UID: "d0b0d3fd-e485-4924-8a1b-6214b7840e52"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.835472 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-scripts\") pod \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.835580 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2s99\" (UniqueName: \"kubernetes.io/projected/d0b0d3fd-e485-4924-8a1b-6214b7840e52-kube-api-access-l2s99\") pod \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.835655 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-run-httpd\") pod \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\" (UID: \"d0b0d3fd-e485-4924-8a1b-6214b7840e52\") " Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.836322 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.836678 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d0b0d3fd-e485-4924-8a1b-6214b7840e52" (UID: "d0b0d3fd-e485-4924-8a1b-6214b7840e52"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.843092 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-scripts" (OuterVolumeSpecName: "scripts") pod "d0b0d3fd-e485-4924-8a1b-6214b7840e52" (UID: "d0b0d3fd-e485-4924-8a1b-6214b7840e52"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.844902 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0b0d3fd-e485-4924-8a1b-6214b7840e52-kube-api-access-l2s99" (OuterVolumeSpecName: "kube-api-access-l2s99") pod "d0b0d3fd-e485-4924-8a1b-6214b7840e52" (UID: "d0b0d3fd-e485-4924-8a1b-6214b7840e52"). InnerVolumeSpecName "kube-api-access-l2s99". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.863541 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d0b0d3fd-e485-4924-8a1b-6214b7840e52" (UID: "d0b0d3fd-e485-4924-8a1b-6214b7840e52"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.938344 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.938368 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.938382 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2s99\" (UniqueName: \"kubernetes.io/projected/d0b0d3fd-e485-4924-8a1b-6214b7840e52-kube-api-access-l2s99\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.938392 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d0b0d3fd-e485-4924-8a1b-6214b7840e52-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:01 crc kubenswrapper[4885]: I0308 19:55:01.944364 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0b0d3fd-e485-4924-8a1b-6214b7840e52" (UID: "d0b0d3fd-e485-4924-8a1b-6214b7840e52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.005147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-config-data" (OuterVolumeSpecName: "config-data") pod "d0b0d3fd-e485-4924-8a1b-6214b7840e52" (UID: "d0b0d3fd-e485-4924-8a1b-6214b7840e52"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.040264 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.040300 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0b0d3fd-e485-4924-8a1b-6214b7840e52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.170487 4885 generic.go:334] "Generic (PLEG): container finished" podID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerID="640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055" exitCode=0 Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.170537 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerDied","Data":"640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055"} Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.170567 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d0b0d3fd-e485-4924-8a1b-6214b7840e52","Type":"ContainerDied","Data":"031cf273d2ce6f416df1db118df4e04e2598121142c98562d1d0691ec1ae6950"} Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.170585 4885 scope.go:117] "RemoveContainer" containerID="955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.170742 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.207681 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.211316 4885 scope.go:117] "RemoveContainer" containerID="9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.228073 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.243417 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.243878 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-notification-agent" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.243984 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-notification-agent" Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.244006 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="sg-core" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.244014 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="sg-core" Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.244067 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="proxy-httpd" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.244077 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="proxy-httpd" Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.244091 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-central-agent" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.244099 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-central-agent" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.244330 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-notification-agent" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.244359 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="ceilometer-central-agent" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.244369 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="sg-core" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.244388 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" containerName="proxy-httpd" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.247072 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.249571 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.249737 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.250107 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.252219 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.267507 4885 scope.go:117] "RemoveContainer" containerID="640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.303498 4885 scope.go:117] "RemoveContainer" containerID="608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.326114 4885 scope.go:117] "RemoveContainer" containerID="955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853" Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.327249 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853\": container with ID starting with 955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853 not found: ID does not exist" containerID="955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.327296 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853"} err="failed to get container status \"955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853\": rpc error: code = NotFound desc = could not find container 
\"955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853\": container with ID starting with 955690b6d3d5b2fc0ce7d78d7f24f273878ad204acd787cd7fb49ff90f582853 not found: ID does not exist" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.327329 4885 scope.go:117] "RemoveContainer" containerID="9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1" Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.328386 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1\": container with ID starting with 9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1 not found: ID does not exist" containerID="9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.328483 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1"} err="failed to get container status \"9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1\": rpc error: code = NotFound desc = could not find container \"9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1\": container with ID starting with 9fd2fa4dbae44ea996581b5b93222e5f213779772f8ae9977bd610c5067453c1 not found: ID does not exist" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.328503 4885 scope.go:117] "RemoveContainer" containerID="640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055" Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.329808 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055\": container with ID starting with 640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055 not found: ID does not exist" containerID="640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.329862 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055"} err="failed to get container status \"640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055\": rpc error: code = NotFound desc = could not find container \"640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055\": container with ID starting with 640d73222433e130bb6eab648c771b9973e617050d1dc8f705c434aeb775e055 not found: ID does not exist" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.329893 4885 scope.go:117] "RemoveContainer" containerID="608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa" Mar 08 19:55:02 crc kubenswrapper[4885]: E0308 19:55:02.330572 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa\": container with ID starting with 608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa not found: ID does not exist" containerID="608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.330609 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa"} 
err="failed to get container status \"608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa\": rpc error: code = NotFound desc = could not find container \"608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa\": container with ID starting with 608e43aed791d889fcf61f65ff91fdfd5bab94d9925188b30e1096baaa7e6caa not found: ID does not exist" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.345139 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-scripts\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.345235 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-log-httpd\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.345295 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpp4k\" (UniqueName: \"kubernetes.io/projected/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-kube-api-access-jpp4k\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.345311 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-config-data\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.345346 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.345367 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.345380 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-run-httpd\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.345402 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.446685 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-scripts\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.447063 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-log-httpd\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.447138 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpp4k\" (UniqueName: \"kubernetes.io/projected/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-kube-api-access-jpp4k\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.447162 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-config-data\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.447213 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.447251 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.447270 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-run-httpd\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.447312 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.447884 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-log-httpd\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.450078 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-run-httpd\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.452682 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.454459 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-scripts\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.454797 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.456733 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.463201 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-config-data\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.467519 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpp4k\" (UniqueName: \"kubernetes.io/projected/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-kube-api-access-jpp4k\") pod \"ceilometer-0\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " pod="openstack/ceilometer-0" Mar 08 19:55:02 crc kubenswrapper[4885]: I0308 19:55:02.567018 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:03 crc kubenswrapper[4885]: I0308 19:55:03.386409 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0b0d3fd-e485-4924-8a1b-6214b7840e52" path="/var/lib/kubelet/pods/d0b0d3fd-e485-4924-8a1b-6214b7840e52/volumes" Mar 08 19:55:03 crc kubenswrapper[4885]: I0308 19:55:03.624567 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:04 crc kubenswrapper[4885]: I0308 19:55:04.193597 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerStarted","Data":"3d7681c02db53e0bc7cd05053b145730f9d4263a90b7db83dc6a7abc4e56e119"} Mar 08 19:55:04 crc kubenswrapper[4885]: I0308 19:55:04.524164 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 19:55:04 crc kubenswrapper[4885]: I0308 19:55:04.572299 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 19:55:04 crc kubenswrapper[4885]: I0308 19:55:04.814016 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 19:55:04 crc kubenswrapper[4885]: I0308 19:55:04.814089 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 19:55:05 crc kubenswrapper[4885]: I0308 19:55:05.211325 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerStarted","Data":"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293"} Mar 08 19:55:05 crc kubenswrapper[4885]: I0308 19:55:05.211391 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerStarted","Data":"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9"} Mar 08 19:55:05 crc kubenswrapper[4885]: I0308 19:55:05.259231 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 19:55:05 crc kubenswrapper[4885]: I0308 19:55:05.895082 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:05 crc kubenswrapper[4885]: I0308 19:55:05.895130 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:06 crc kubenswrapper[4885]: I0308 19:55:06.219846 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerStarted","Data":"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf"} Mar 08 19:55:08 crc kubenswrapper[4885]: I0308 19:55:08.244884 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerStarted","Data":"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e"} Mar 08 19:55:08 crc kubenswrapper[4885]: 
I0308 19:55:08.247111 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 19:55:08 crc kubenswrapper[4885]: I0308 19:55:08.291031 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.331236207 podStartE2EDuration="6.291002149s" podCreationTimestamp="2026-03-08 19:55:02 +0000 UTC" firstStartedPulling="2026-03-08 19:55:03.63909636 +0000 UTC m=+1405.035150413" lastFinishedPulling="2026-03-08 19:55:07.598862302 +0000 UTC m=+1408.994916355" observedRunningTime="2026-03-08 19:55:08.279764088 +0000 UTC m=+1409.675818151" watchObservedRunningTime="2026-03-08 19:55:08.291002149 +0000 UTC m=+1409.687056212" Mar 08 19:55:08 crc kubenswrapper[4885]: I0308 19:55:08.539522 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 08 19:55:10 crc kubenswrapper[4885]: I0308 19:55:10.723000 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 19:55:10 crc kubenswrapper[4885]: I0308 19:55:10.727854 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 19:55:10 crc kubenswrapper[4885]: I0308 19:55:10.732106 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 19:55:11 crc kubenswrapper[4885]: I0308 19:55:11.291867 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.290125 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.330463 4885 generic.go:334] "Generic (PLEG): container finished" podID="515ba29a-53ae-41dc-a444-9ffe060dc61f" containerID="4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e" exitCode=137 Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.330522 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"515ba29a-53ae-41dc-a444-9ffe060dc61f","Type":"ContainerDied","Data":"4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e"} Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.330586 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"515ba29a-53ae-41dc-a444-9ffe060dc61f","Type":"ContainerDied","Data":"93b0b9013aeb93d202bb43a6cf260d8de9e26ddf4b972302a13aaf4bbc7119c7"} Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.330609 4885 scope.go:117] "RemoveContainer" containerID="4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.330538 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.374306 4885 scope.go:117] "RemoveContainer" containerID="4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e" Mar 08 19:55:13 crc kubenswrapper[4885]: E0308 19:55:13.375120 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e\": container with ID starting with 4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e not found: ID does not exist" containerID="4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.375173 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e"} err="failed to get container status \"4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e\": rpc error: code = NotFound desc = could not find container \"4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e\": container with ID starting with 4fb18a1813d17c833431ceacc5dc123519052fd003c515ffe1739264a61c520e not found: ID does not exist" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.387735 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-combined-ca-bundle\") pod \"515ba29a-53ae-41dc-a444-9ffe060dc61f\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.387889 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcznq\" (UniqueName: \"kubernetes.io/projected/515ba29a-53ae-41dc-a444-9ffe060dc61f-kube-api-access-zcznq\") pod \"515ba29a-53ae-41dc-a444-9ffe060dc61f\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.387944 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-config-data\") pod \"515ba29a-53ae-41dc-a444-9ffe060dc61f\" (UID: \"515ba29a-53ae-41dc-a444-9ffe060dc61f\") " Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.402002 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/515ba29a-53ae-41dc-a444-9ffe060dc61f-kube-api-access-zcznq" (OuterVolumeSpecName: "kube-api-access-zcznq") pod "515ba29a-53ae-41dc-a444-9ffe060dc61f" (UID: "515ba29a-53ae-41dc-a444-9ffe060dc61f"). InnerVolumeSpecName "kube-api-access-zcznq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.416657 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-config-data" (OuterVolumeSpecName: "config-data") pod "515ba29a-53ae-41dc-a444-9ffe060dc61f" (UID: "515ba29a-53ae-41dc-a444-9ffe060dc61f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.425058 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "515ba29a-53ae-41dc-a444-9ffe060dc61f" (UID: "515ba29a-53ae-41dc-a444-9ffe060dc61f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.490150 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.490201 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zcznq\" (UniqueName: \"kubernetes.io/projected/515ba29a-53ae-41dc-a444-9ffe060dc61f-kube-api-access-zcznq\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.490222 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/515ba29a-53ae-41dc-a444-9ffe060dc61f-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.677673 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.690528 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.706219 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:55:13 crc kubenswrapper[4885]: E0308 19:55:13.706674 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="515ba29a-53ae-41dc-a444-9ffe060dc61f" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.706695 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="515ba29a-53ae-41dc-a444-9ffe060dc61f" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.706941 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="515ba29a-53ae-41dc-a444-9ffe060dc61f" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.707843 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.711510 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.711880 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.712115 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.740539 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.798641 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.798864 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxpr8\" (UniqueName: \"kubernetes.io/projected/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-kube-api-access-qxpr8\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.799212 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.799475 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.799550 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.900944 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.900994 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 
19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.901081 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.901128 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxpr8\" (UniqueName: \"kubernetes.io/projected/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-kube-api-access-qxpr8\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.901169 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.905857 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.907281 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.907723 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.912392 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:13 crc kubenswrapper[4885]: I0308 19:55:13.919431 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxpr8\" (UniqueName: \"kubernetes.io/projected/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-kube-api-access-qxpr8\") pod \"nova-cell1-novncproxy-0\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:14 crc kubenswrapper[4885]: I0308 19:55:14.028947 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:14 crc kubenswrapper[4885]: I0308 19:55:14.613961 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:55:14 crc kubenswrapper[4885]: W0308 19:55:14.622630 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90fb4d53_4722_4f72_9f1a_99ee2b637f6e.slice/crio-36ec2d571999d46c471303241aed86c9b5d0e06a8eac136c91582cb662c8d63a WatchSource:0}: Error finding container 36ec2d571999d46c471303241aed86c9b5d0e06a8eac136c91582cb662c8d63a: Status 404 returned error can't find the container with id 36ec2d571999d46c471303241aed86c9b5d0e06a8eac136c91582cb662c8d63a Mar 08 19:55:14 crc kubenswrapper[4885]: I0308 19:55:14.817659 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 19:55:14 crc kubenswrapper[4885]: I0308 19:55:14.818497 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 19:55:14 crc kubenswrapper[4885]: I0308 19:55:14.820495 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 19:55:14 crc kubenswrapper[4885]: I0308 19:55:14.822828 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.358045 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"90fb4d53-4722-4f72-9f1a-99ee2b637f6e","Type":"ContainerStarted","Data":"32768cc82a61a862691a2497facf7d05127e1de2145b441cde8a059f45152b82"} Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.358082 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"90fb4d53-4722-4f72-9f1a-99ee2b637f6e","Type":"ContainerStarted","Data":"36ec2d571999d46c471303241aed86c9b5d0e06a8eac136c91582cb662c8d63a"} Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.358350 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.361405 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.381826 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="515ba29a-53ae-41dc-a444-9ffe060dc61f" path="/var/lib/kubelet/pods/515ba29a-53ae-41dc-a444-9ffe060dc61f/volumes" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.397720 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.397705003 podStartE2EDuration="2.397705003s" podCreationTimestamp="2026-03-08 19:55:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:55:15.388504656 +0000 UTC m=+1416.784558689" watchObservedRunningTime="2026-03-08 19:55:15.397705003 +0000 UTC m=+1416.793759026" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.548805 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7749c44969-7m4ps"] Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.550949 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.619771 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-7m4ps"] Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.648550 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.648620 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.648678 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-svc\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.648709 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.648757 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j727f\" (UniqueName: \"kubernetes.io/projected/8185583f-0ca5-46b1-a1ed-77c35b13a07b-kube-api-access-j727f\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.648775 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-config\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.750210 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j727f\" (UniqueName: \"kubernetes.io/projected/8185583f-0ca5-46b1-a1ed-77c35b13a07b-kube-api-access-j727f\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.750268 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-config\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.750351 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.750406 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.750482 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-svc\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.750537 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.751944 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-sb\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.752049 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-swift-storage-0\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.751981 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-svc\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.751991 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-config\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.751950 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-nb\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.769873 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j727f\" (UniqueName: 
\"kubernetes.io/projected/8185583f-0ca5-46b1-a1ed-77c35b13a07b-kube-api-access-j727f\") pod \"dnsmasq-dns-7749c44969-7m4ps\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:15 crc kubenswrapper[4885]: I0308 19:55:15.882888 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:16 crc kubenswrapper[4885]: I0308 19:55:16.380968 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-7m4ps"] Mar 08 19:55:16 crc kubenswrapper[4885]: W0308 19:55:16.384950 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8185583f_0ca5_46b1_a1ed_77c35b13a07b.slice/crio-faaf9045f9d157b601bfa8ca719045997cf05f3ca07f75878e38d95227719aa4 WatchSource:0}: Error finding container faaf9045f9d157b601bfa8ca719045997cf05f3ca07f75878e38d95227719aa4: Status 404 returned error can't find the container with id faaf9045f9d157b601bfa8ca719045997cf05f3ca07f75878e38d95227719aa4 Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.268718 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.269290 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-central-agent" containerID="cri-o://4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" gracePeriod=30 Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.269400 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-notification-agent" containerID="cri-o://5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" gracePeriod=30 Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.269396 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="sg-core" containerID="cri-o://d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" gracePeriod=30 Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.269556 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="proxy-httpd" containerID="cri-o://f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" gracePeriod=30 Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.280932 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.203:3000/\": read tcp 10.217.0.2:41260->10.217.0.203:3000: read: connection reset by peer" Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.375992 4885 generic.go:334] "Generic (PLEG): container finished" podID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerID="4a9e799e765066afa260a1cecf9172d38f7e49cda7c9f4bc8c9ce49bcef121a4" exitCode=0 Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.390206 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" 
event={"ID":"8185583f-0ca5-46b1-a1ed-77c35b13a07b","Type":"ContainerDied","Data":"4a9e799e765066afa260a1cecf9172d38f7e49cda7c9f4bc8c9ce49bcef121a4"} Mar 08 19:55:17 crc kubenswrapper[4885]: I0308 19:55:17.390273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" event={"ID":"8185583f-0ca5-46b1-a1ed-77c35b13a07b","Type":"ContainerStarted","Data":"faaf9045f9d157b601bfa8ca719045997cf05f3ca07f75878e38d95227719aa4"} Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.383903 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.393528 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.400492 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" event={"ID":"8185583f-0ca5-46b1-a1ed-77c35b13a07b","Type":"ContainerStarted","Data":"8f4e92cbc965cfb3ab7ad14008b7ceea724e345981ffb88a02993256da9e6dcb"} Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.401648 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.404276 4885 generic.go:334] "Generic (PLEG): container finished" podID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerID="f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" exitCode=0 Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.404310 4885 generic.go:334] "Generic (PLEG): container finished" podID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerID="d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" exitCode=2 Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.404322 4885 generic.go:334] "Generic (PLEG): container finished" podID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerID="5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" exitCode=0 Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.404332 4885 generic.go:334] "Generic (PLEG): container finished" podID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerID="4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" exitCode=0 Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.404510 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-log" containerID="cri-o://7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582" gracePeriod=30 Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.404813 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.405405 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerDied","Data":"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e"} Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.405443 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerDied","Data":"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf"} Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.405481 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerDied","Data":"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293"} Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.405497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerDied","Data":"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9"} Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.405510 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6","Type":"ContainerDied","Data":"3d7681c02db53e0bc7cd05053b145730f9d4263a90b7db83dc6a7abc4e56e119"} Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.405528 4885 scope.go:117] "RemoveContainer" containerID="f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.405721 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-api" containerID="cri-o://a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7" gracePeriod=30 Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.451332 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" podStartSLOduration=3.451316817 podStartE2EDuration="3.451316817s" podCreationTimestamp="2026-03-08 19:55:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:55:18.446802096 +0000 UTC m=+1419.842856129" watchObservedRunningTime="2026-03-08 19:55:18.451316817 +0000 UTC m=+1419.847370840" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.457785 4885 scope.go:117] "RemoveContainer" containerID="d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.496428 4885 scope.go:117] "RemoveContainer" containerID="5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.514956 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-scripts\") pod \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.515036 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-combined-ca-bundle\") pod \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.515133 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-sg-core-conf-yaml\") pod \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.515171 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-run-httpd\") pod \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.515192 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-config-data\") pod \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.515235 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpp4k\" (UniqueName: \"kubernetes.io/projected/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-kube-api-access-jpp4k\") pod \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.515302 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-log-httpd\") pod \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.515394 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-ceilometer-tls-certs\") pod \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\" (UID: \"eb4f8cd0-6d3d-456e-b735-036d5d70e3c6\") " Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.519117 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.519782 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.527021 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-scripts" (OuterVolumeSpecName: "scripts") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.528032 4885 scope.go:117] "RemoveContainer" containerID="4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.537071 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-kube-api-access-jpp4k" (OuterVolumeSpecName: "kube-api-access-jpp4k") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "kube-api-access-jpp4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.550062 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.570264 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.589317 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.620024 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpp4k\" (UniqueName: \"kubernetes.io/projected/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-kube-api-access-jpp4k\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.620269 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.620279 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.620289 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.620297 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.620305 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.620313 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.636489 4885 scope.go:117] "RemoveContainer" containerID="f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.638260 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": container with ID starting with f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e not found: ID does not exist" containerID="f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.638301 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e"} err="failed to get container status \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": rpc error: code = NotFound desc = could not find container \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": container with ID starting with f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.638325 4885 scope.go:117] "RemoveContainer" containerID="d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.639377 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": container with ID starting with d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf not found: ID does not exist" containerID="d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.639411 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf"} err="failed to get container status \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": rpc error: code = NotFound desc = could not find container \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": container with ID starting with d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.639427 4885 scope.go:117] "RemoveContainer" containerID="5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.639835 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": container with ID starting with 5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293 not found: ID does not exist" containerID="5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.639859 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293"} err="failed to get container status \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": rpc error: code = NotFound desc = could not find container \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": container with ID starting with 5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.639873 4885 scope.go:117] "RemoveContainer" containerID="4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.640143 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": container with ID starting with 4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9 not found: ID does not exist" containerID="4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.640187 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9"} err="failed to get container status \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": rpc error: code = NotFound desc = could not find container \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": container with ID starting with 4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.640215 4885 scope.go:117] "RemoveContainer" containerID="f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" Mar 08 19:55:18 crc 
kubenswrapper[4885]: I0308 19:55:18.640490 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e"} err="failed to get container status \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": rpc error: code = NotFound desc = could not find container \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": container with ID starting with f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.640514 4885 scope.go:117] "RemoveContainer" containerID="d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.640949 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf"} err="failed to get container status \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": rpc error: code = NotFound desc = could not find container \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": container with ID starting with d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.640968 4885 scope.go:117] "RemoveContainer" containerID="5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641163 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293"} err="failed to get container status \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": rpc error: code = NotFound desc = could not find container \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": container with ID starting with 5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641180 4885 scope.go:117] "RemoveContainer" containerID="4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641356 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9"} err="failed to get container status \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": rpc error: code = NotFound desc = could not find container \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": container with ID starting with 4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641373 4885 scope.go:117] "RemoveContainer" containerID="f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641550 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e"} err="failed to get container status \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": rpc error: code = NotFound desc = could not find container \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": container with ID 
starting with f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641568 4885 scope.go:117] "RemoveContainer" containerID="d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641768 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf"} err="failed to get container status \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": rpc error: code = NotFound desc = could not find container \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": container with ID starting with d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.641787 4885 scope.go:117] "RemoveContainer" containerID="5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642007 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293"} err="failed to get container status \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": rpc error: code = NotFound desc = could not find container \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": container with ID starting with 5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642023 4885 scope.go:117] "RemoveContainer" containerID="4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642232 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9"} err="failed to get container status \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": rpc error: code = NotFound desc = could not find container \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": container with ID starting with 4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642250 4885 scope.go:117] "RemoveContainer" containerID="f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642419 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e"} err="failed to get container status \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": rpc error: code = NotFound desc = could not find container \"f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e\": container with ID starting with f1bc7d35e965635274008a287a04b9e2fab2ddd6e33b8cb3816579c3ed12283e not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642435 4885 scope.go:117] "RemoveContainer" containerID="d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642649 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf"} err="failed to get container status \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": rpc error: code = NotFound desc = could not find container \"d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf\": container with ID starting with d96ab3acc0da38fbfdf37835d2a681d495f2494fdac6f4fbc5790e5e4a9507cf not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642666 4885 scope.go:117] "RemoveContainer" containerID="5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642826 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293"} err="failed to get container status \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": rpc error: code = NotFound desc = could not find container \"5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293\": container with ID starting with 5789f79516010b3e1b1a8c19e534a3a871ab974cea6e14ee722bd9a27803f293 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.642842 4885 scope.go:117] "RemoveContainer" containerID="4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.643142 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9"} err="failed to get container status \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": rpc error: code = NotFound desc = could not find container \"4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9\": container with ID starting with 4be3a3ed84eed256f7a0ed0aae1256d41fd3af2748c4981ff03376af1fae7df9 not found: ID does not exist" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.653062 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-config-data" (OuterVolumeSpecName: "config-data") pod "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" (UID: "eb4f8cd0-6d3d-456e-b735-036d5d70e3c6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.721633 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.743722 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.751792 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.760509 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.760898 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="sg-core" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.760937 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="sg-core" Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.760957 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-notification-agent" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.760965 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-notification-agent" Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.760982 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-central-agent" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.760988 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-central-agent" Mar 08 19:55:18 crc kubenswrapper[4885]: E0308 19:55:18.760999 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="proxy-httpd" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.761004 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="proxy-httpd" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.761160 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-notification-agent" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.761168 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="sg-core" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.761181 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="proxy-httpd" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.761191 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" containerName="ceilometer-central-agent" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.763248 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.768144 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.768319 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.769023 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.776978 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823299 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823351 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823368 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-log-httpd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823414 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-scripts\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823579 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hqdd\" (UniqueName: \"kubernetes.io/projected/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-kube-api-access-4hqdd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823714 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823859 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-config-data\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.823902 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-run-httpd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.925734 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-run-httpd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.926385 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.926823 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.926896 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-log-httpd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.927014 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-scripts\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.927141 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hqdd\" (UniqueName: \"kubernetes.io/projected/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-kube-api-access-4hqdd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.927222 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.927305 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-config-data\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.927845 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-log-httpd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.926307 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-run-httpd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.933001 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.934143 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.938655 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-scripts\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.939890 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-config-data\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.940095 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:18 crc kubenswrapper[4885]: I0308 19:55:18.944545 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hqdd\" (UniqueName: \"kubernetes.io/projected/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-kube-api-access-4hqdd\") pod \"ceilometer-0\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " pod="openstack/ceilometer-0" Mar 08 19:55:19 crc kubenswrapper[4885]: I0308 19:55:19.029259 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:19 crc kubenswrapper[4885]: I0308 19:55:19.081615 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:19 crc kubenswrapper[4885]: I0308 19:55:19.377086 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb4f8cd0-6d3d-456e-b735-036d5d70e3c6" path="/var/lib/kubelet/pods/eb4f8cd0-6d3d-456e-b735-036d5d70e3c6/volumes" Mar 08 19:55:19 crc kubenswrapper[4885]: I0308 19:55:19.565703 4885 generic.go:334] "Generic (PLEG): container finished" podID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerID="7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582" exitCode=143 Mar 08 19:55:19 crc kubenswrapper[4885]: I0308 19:55:19.565811 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"273a661b-19fb-47d2-b1d6-05ddf548f212","Type":"ContainerDied","Data":"7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582"} Mar 08 19:55:19 crc kubenswrapper[4885]: I0308 19:55:19.624517 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:20 crc kubenswrapper[4885]: I0308 19:55:20.576364 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerStarted","Data":"0b1f8ae1cf897e68887902e9c908567ae03c21713be181a14ae5dfb412f11087"} Mar 08 19:55:20 crc kubenswrapper[4885]: I0308 19:55:20.576973 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerStarted","Data":"cc7fd9651b0daebb9ca4a98f8eaeb673920ad2e9d2e71ad709ac41621e099e5b"} Mar 08 19:55:21 crc kubenswrapper[4885]: I0308 19:55:21.014557 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:21 crc kubenswrapper[4885]: I0308 19:55:21.594556 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerStarted","Data":"28cf832d6e33ae08ba17636d041e0bb1069736d882a89f4775ec27440fe30839"} Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.368086 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.518066 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-config-data\") pod \"273a661b-19fb-47d2-b1d6-05ddf548f212\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.518154 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-combined-ca-bundle\") pod \"273a661b-19fb-47d2-b1d6-05ddf548f212\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.518232 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8p8t\" (UniqueName: \"kubernetes.io/projected/273a661b-19fb-47d2-b1d6-05ddf548f212-kube-api-access-n8p8t\") pod \"273a661b-19fb-47d2-b1d6-05ddf548f212\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.518344 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/273a661b-19fb-47d2-b1d6-05ddf548f212-logs\") pod \"273a661b-19fb-47d2-b1d6-05ddf548f212\" (UID: \"273a661b-19fb-47d2-b1d6-05ddf548f212\") " Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.523263 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/273a661b-19fb-47d2-b1d6-05ddf548f212-logs" (OuterVolumeSpecName: "logs") pod "273a661b-19fb-47d2-b1d6-05ddf548f212" (UID: "273a661b-19fb-47d2-b1d6-05ddf548f212"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.526910 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/273a661b-19fb-47d2-b1d6-05ddf548f212-kube-api-access-n8p8t" (OuterVolumeSpecName: "kube-api-access-n8p8t") pod "273a661b-19fb-47d2-b1d6-05ddf548f212" (UID: "273a661b-19fb-47d2-b1d6-05ddf548f212"). InnerVolumeSpecName "kube-api-access-n8p8t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.545814 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-config-data" (OuterVolumeSpecName: "config-data") pod "273a661b-19fb-47d2-b1d6-05ddf548f212" (UID: "273a661b-19fb-47d2-b1d6-05ddf548f212"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.548399 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "273a661b-19fb-47d2-b1d6-05ddf548f212" (UID: "273a661b-19fb-47d2-b1d6-05ddf548f212"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.610627 4885 generic.go:334] "Generic (PLEG): container finished" podID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerID="a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7" exitCode=0 Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.610706 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.610733 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"273a661b-19fb-47d2-b1d6-05ddf548f212","Type":"ContainerDied","Data":"a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7"} Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.610761 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"273a661b-19fb-47d2-b1d6-05ddf548f212","Type":"ContainerDied","Data":"8cdb2767b2eabddf4d25eac6833f5ae8cc579f3e077a9d8508fa5553f3b7e1b3"} Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.610776 4885 scope.go:117] "RemoveContainer" containerID="a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.619165 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerStarted","Data":"8f7f8df970a5fdc517bdf1a2dd73795a6937f04ee672227be9fa57b2dac077d8"} Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.620629 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/273a661b-19fb-47d2-b1d6-05ddf548f212-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.620650 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.620659 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273a661b-19fb-47d2-b1d6-05ddf548f212-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.620779 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8p8t\" (UniqueName: \"kubernetes.io/projected/273a661b-19fb-47d2-b1d6-05ddf548f212-kube-api-access-n8p8t\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.652193 4885 scope.go:117] "RemoveContainer" containerID="7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.655509 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.676721 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.692509 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:22 crc kubenswrapper[4885]: E0308 19:55:22.692880 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-log" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.692897 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-log" Mar 08 19:55:22 crc kubenswrapper[4885]: E0308 19:55:22.692931 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-api" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.692938 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-api" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.693115 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-api" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.693132 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" containerName="nova-api-log" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.693996 4885 scope.go:117] "RemoveContainer" containerID="a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.694079 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: E0308 19:55:22.695502 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7\": container with ID starting with a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7 not found: ID does not exist" containerID="a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.695538 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7"} err="failed to get container status \"a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7\": rpc error: code = NotFound desc = could not find container \"a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7\": container with ID starting with a82fadb1e5410289c042bfb676c889cae8bcad59905ba1bc2106344f13aee3a7 not found: ID does not exist" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.695563 4885 scope.go:117] "RemoveContainer" containerID="7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.699197 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 08 19:55:22 crc kubenswrapper[4885]: E0308 19:55:22.699194 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582\": container with ID starting with 7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582 not found: ID does not exist" containerID="7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.699364 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582"} err="failed to get container status \"7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582\": rpc error: code = NotFound desc = could not find container \"7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582\": 
container with ID starting with 7e909f49e131ef71fb86e3b5abd64685384dfa2c69b2cd879353c24cf9b6e582 not found: ID does not exist" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.699426 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.699943 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.710825 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.823829 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p2mj\" (UniqueName: \"kubernetes.io/projected/006cf27f-ef31-4316-a5de-833141664964-kube-api-access-5p2mj\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.824189 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-internal-tls-certs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.824230 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.824264 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-config-data\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.824391 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-public-tls-certs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.824418 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/006cf27f-ef31-4316-a5de-833141664964-logs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.926131 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-public-tls-certs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.926180 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/006cf27f-ef31-4316-a5de-833141664964-logs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc 
kubenswrapper[4885]: I0308 19:55:22.926216 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p2mj\" (UniqueName: \"kubernetes.io/projected/006cf27f-ef31-4316-a5de-833141664964-kube-api-access-5p2mj\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.926237 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-internal-tls-certs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.926266 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.926294 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-config-data\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.927571 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/006cf27f-ef31-4316-a5de-833141664964-logs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.931449 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-config-data\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.938391 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-internal-tls-certs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.940500 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.942129 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-public-tls-certs\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:22 crc kubenswrapper[4885]: I0308 19:55:22.945313 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p2mj\" (UniqueName: \"kubernetes.io/projected/006cf27f-ef31-4316-a5de-833141664964-kube-api-access-5p2mj\") pod \"nova-api-0\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " pod="openstack/nova-api-0" Mar 08 19:55:23 crc kubenswrapper[4885]: I0308 19:55:23.016171 4885 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:23 crc kubenswrapper[4885]: I0308 19:55:23.380933 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="273a661b-19fb-47d2-b1d6-05ddf548f212" path="/var/lib/kubelet/pods/273a661b-19fb-47d2-b1d6-05ddf548f212/volumes" Mar 08 19:55:23 crc kubenswrapper[4885]: I0308 19:55:23.498973 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:23 crc kubenswrapper[4885]: I0308 19:55:23.639825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"006cf27f-ef31-4316-a5de-833141664964","Type":"ContainerStarted","Data":"30344ac6a0d4ba0d54022fa56e76ff23a3c0282a327f06f069842f7bf31a4249"} Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.029374 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.050976 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.654867 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"006cf27f-ef31-4316-a5de-833141664964","Type":"ContainerStarted","Data":"e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359"} Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.655468 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"006cf27f-ef31-4316-a5de-833141664964","Type":"ContainerStarted","Data":"442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8"} Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.658105 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerStarted","Data":"4339ad679bd1880ac414ad5b365774f68d747ea0a83d951d90e3f68c160e4b31"} Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.658588 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-central-agent" containerID="cri-o://0b1f8ae1cf897e68887902e9c908567ae03c21713be181a14ae5dfb412f11087" gracePeriod=30 Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.658721 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-notification-agent" containerID="cri-o://28cf832d6e33ae08ba17636d041e0bb1069736d882a89f4775ec27440fe30839" gracePeriod=30 Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.658729 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="sg-core" containerID="cri-o://8f7f8df970a5fdc517bdf1a2dd73795a6937f04ee672227be9fa57b2dac077d8" gracePeriod=30 Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.658755 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="proxy-httpd" containerID="cri-o://4339ad679bd1880ac414ad5b365774f68d747ea0a83d951d90e3f68c160e4b31" gracePeriod=30 Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.687403 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.696942 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.696905269 podStartE2EDuration="2.696905269s" podCreationTimestamp="2026-03-08 19:55:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:55:24.681106066 +0000 UTC m=+1426.077160149" watchObservedRunningTime="2026-03-08 19:55:24.696905269 +0000 UTC m=+1426.092959302" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.720146 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.6344976879999997 podStartE2EDuration="6.720130191s" podCreationTimestamp="2026-03-08 19:55:18 +0000 UTC" firstStartedPulling="2026-03-08 19:55:19.640970199 +0000 UTC m=+1421.037024222" lastFinishedPulling="2026-03-08 19:55:23.726602692 +0000 UTC m=+1425.122656725" observedRunningTime="2026-03-08 19:55:24.71748985 +0000 UTC m=+1426.113543883" watchObservedRunningTime="2026-03-08 19:55:24.720130191 +0000 UTC m=+1426.116184224" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.910753 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-l6hg9"] Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.911995 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.913903 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.914221 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.919293 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-l6hg9"] Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.991538 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt7vh\" (UniqueName: \"kubernetes.io/projected/53478e7f-ae6d-4540-a51d-2fd03f142027-kube-api-access-rt7vh\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.991631 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.991717 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-config-data\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:24 crc kubenswrapper[4885]: I0308 19:55:24.991961 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-scripts\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.093785 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rt7vh\" (UniqueName: \"kubernetes.io/projected/53478e7f-ae6d-4540-a51d-2fd03f142027-kube-api-access-rt7vh\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.093833 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.093854 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-config-data\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.093998 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-scripts\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.100123 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-scripts\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.101953 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.102414 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-config-data\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.117480 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rt7vh\" (UniqueName: \"kubernetes.io/projected/53478e7f-ae6d-4540-a51d-2fd03f142027-kube-api-access-rt7vh\") pod \"nova-cell1-cell-mapping-l6hg9\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.228577 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.720993 4885 generic.go:334] "Generic (PLEG): container finished" podID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerID="4339ad679bd1880ac414ad5b365774f68d747ea0a83d951d90e3f68c160e4b31" exitCode=0 Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.721055 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerDied","Data":"4339ad679bd1880ac414ad5b365774f68d747ea0a83d951d90e3f68c160e4b31"} Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.722014 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerDied","Data":"8f7f8df970a5fdc517bdf1a2dd73795a6937f04ee672227be9fa57b2dac077d8"} Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.721959 4885 generic.go:334] "Generic (PLEG): container finished" podID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerID="8f7f8df970a5fdc517bdf1a2dd73795a6937f04ee672227be9fa57b2dac077d8" exitCode=2 Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.723534 4885 generic.go:334] "Generic (PLEG): container finished" podID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerID="28cf832d6e33ae08ba17636d041e0bb1069736d882a89f4775ec27440fe30839" exitCode=0 Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.723602 4885 generic.go:334] "Generic (PLEG): container finished" podID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerID="0b1f8ae1cf897e68887902e9c908567ae03c21713be181a14ae5dfb412f11087" exitCode=0 Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.723907 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerDied","Data":"28cf832d6e33ae08ba17636d041e0bb1069736d882a89f4775ec27440fe30839"} Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.723954 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerDied","Data":"0b1f8ae1cf897e68887902e9c908567ae03c21713be181a14ae5dfb412f11087"} Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.753933 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-l6hg9"] Mar 08 19:55:25 crc kubenswrapper[4885]: W0308 19:55:25.757712 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53478e7f_ae6d_4540_a51d_2fd03f142027.slice/crio-4c3ffab2d2b866b89801a11076cc2cdcac79305b778a66c018ea9d5b7d4245b6 WatchSource:0}: Error finding container 4c3ffab2d2b866b89801a11076cc2cdcac79305b778a66c018ea9d5b7d4245b6: Status 404 returned error can't find the container with id 4c3ffab2d2b866b89801a11076cc2cdcac79305b778a66c018ea9d5b7d4245b6 Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.885032 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.951463 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-8dnjm"] Mar 08 19:55:25 crc kubenswrapper[4885]: I0308 19:55:25.951693 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" 
podUID="0274624f-a49d-425f-b025-753e4e174477" containerName="dnsmasq-dns" containerID="cri-o://65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db" gracePeriod=10 Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.031208 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120208 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-scripts\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120277 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-config-data\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120327 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hqdd\" (UniqueName: \"kubernetes.io/projected/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-kube-api-access-4hqdd\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120385 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-run-httpd\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120421 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-combined-ca-bundle\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120475 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-ceilometer-tls-certs\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120533 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-sg-core-conf-yaml\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120566 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-log-httpd\") pod \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\" (UID: \"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.120805 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.121161 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.121247 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.121266 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.125879 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-kube-api-access-4hqdd" (OuterVolumeSpecName: "kube-api-access-4hqdd") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "kube-api-access-4hqdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.133149 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-scripts" (OuterVolumeSpecName: "scripts") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.177225 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.189427 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.206862 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.225737 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.225761 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.225771 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.225782 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.225790 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hqdd\" (UniqueName: \"kubernetes.io/projected/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-kube-api-access-4hqdd\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.257005 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-config-data" (OuterVolumeSpecName: "config-data") pod "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" (UID: "3851acf4-e600-4f5b-86a8-2cb1c01f2d6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.327885 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.499798 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.632955 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-config\") pod \"0274624f-a49d-425f-b025-753e4e174477\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.633013 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-sb\") pod \"0274624f-a49d-425f-b025-753e4e174477\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.633058 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-swift-storage-0\") pod \"0274624f-a49d-425f-b025-753e4e174477\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.633148 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-nb\") pod \"0274624f-a49d-425f-b025-753e4e174477\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.633191 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frh8b\" (UniqueName: \"kubernetes.io/projected/0274624f-a49d-425f-b025-753e4e174477-kube-api-access-frh8b\") pod \"0274624f-a49d-425f-b025-753e4e174477\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.633225 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-svc\") pod \"0274624f-a49d-425f-b025-753e4e174477\" (UID: \"0274624f-a49d-425f-b025-753e4e174477\") " Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.639254 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0274624f-a49d-425f-b025-753e4e174477-kube-api-access-frh8b" (OuterVolumeSpecName: "kube-api-access-frh8b") pod "0274624f-a49d-425f-b025-753e4e174477" (UID: "0274624f-a49d-425f-b025-753e4e174477"). InnerVolumeSpecName "kube-api-access-frh8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.686529 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0274624f-a49d-425f-b025-753e4e174477" (UID: "0274624f-a49d-425f-b025-753e4e174477"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.695294 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0274624f-a49d-425f-b025-753e4e174477" (UID: "0274624f-a49d-425f-b025-753e4e174477"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.695636 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0274624f-a49d-425f-b025-753e4e174477" (UID: "0274624f-a49d-425f-b025-753e4e174477"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.700722 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0274624f-a49d-425f-b025-753e4e174477" (UID: "0274624f-a49d-425f-b025-753e4e174477"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.710694 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-config" (OuterVolumeSpecName: "config") pod "0274624f-a49d-425f-b025-753e4e174477" (UID: "0274624f-a49d-425f-b025-753e4e174477"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.736682 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.736718 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frh8b\" (UniqueName: \"kubernetes.io/projected/0274624f-a49d-425f-b025-753e4e174477-kube-api-access-frh8b\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.736732 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.736743 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.736751 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.736759 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0274624f-a49d-425f-b025-753e4e174477-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.743234 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-l6hg9" event={"ID":"53478e7f-ae6d-4540-a51d-2fd03f142027","Type":"ContainerStarted","Data":"52341d43266cc07b81a420fcef2575f373be02c72919fe4d7ea1b1bbdd8174c2"} Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.743282 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-l6hg9" 
event={"ID":"53478e7f-ae6d-4540-a51d-2fd03f142027","Type":"ContainerStarted","Data":"4c3ffab2d2b866b89801a11076cc2cdcac79305b778a66c018ea9d5b7d4245b6"} Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.746988 4885 generic.go:334] "Generic (PLEG): container finished" podID="0274624f-a49d-425f-b025-753e4e174477" containerID="65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db" exitCode=0 Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.747008 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.747037 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" event={"ID":"0274624f-a49d-425f-b025-753e4e174477","Type":"ContainerDied","Data":"65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db"} Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.747218 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bd5679c8c-8dnjm" event={"ID":"0274624f-a49d-425f-b025-753e4e174477","Type":"ContainerDied","Data":"e0f14b4c223b387a85388f249c4bcd0b49cb89c791c76c20d6484eb655e195a3"} Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.747246 4885 scope.go:117] "RemoveContainer" containerID="65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.751465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3851acf4-e600-4f5b-86a8-2cb1c01f2d6e","Type":"ContainerDied","Data":"cc7fd9651b0daebb9ca4a98f8eaeb673920ad2e9d2e71ad709ac41621e099e5b"} Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.751538 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.777889 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-l6hg9" podStartSLOduration=2.777871513 podStartE2EDuration="2.777871513s" podCreationTimestamp="2026-03-08 19:55:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:55:26.759006387 +0000 UTC m=+1428.155060430" watchObservedRunningTime="2026-03-08 19:55:26.777871513 +0000 UTC m=+1428.173925536" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.801631 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-8dnjm"] Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.804260 4885 scope.go:117] "RemoveContainer" containerID="bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.813188 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bd5679c8c-8dnjm"] Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.834910 4885 scope.go:117] "RemoveContainer" containerID="65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db" Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.835384 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db\": container with ID starting with 65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db not found: ID does not exist" containerID="65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.835486 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db"} err="failed to get container status \"65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db\": rpc error: code = NotFound desc = could not find container \"65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db\": container with ID starting with 65a6b5aac176afee004de2a2cecbd502d65e8a81bffef1a4bd53b672cbb083db not found: ID does not exist" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.835565 4885 scope.go:117] "RemoveContainer" containerID="bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7" Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.835893 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7\": container with ID starting with bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7 not found: ID does not exist" containerID="bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.836077 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7"} err="failed to get container status \"bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7\": rpc error: code = NotFound desc = could not find container \"bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7\": container with ID starting with 
bdeb020ae9d86e63615918e36d9b431e13b33bba34989f4410a1917cf2f161d7 not found: ID does not exist" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.836143 4885 scope.go:117] "RemoveContainer" containerID="4339ad679bd1880ac414ad5b365774f68d747ea0a83d951d90e3f68c160e4b31" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.842056 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.856860 4885 scope.go:117] "RemoveContainer" containerID="8f7f8df970a5fdc517bdf1a2dd73795a6937f04ee672227be9fa57b2dac077d8" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.866376 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877349 4885 scope.go:117] "RemoveContainer" containerID="28cf832d6e33ae08ba17636d041e0bb1069736d882a89f4775ec27440fe30839" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877454 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.877799 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-notification-agent" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877810 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-notification-agent" Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.877830 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="sg-core" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877836 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="sg-core" Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.877848 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-central-agent" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877854 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-central-agent" Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.877865 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="proxy-httpd" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877870 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="proxy-httpd" Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.877887 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0274624f-a49d-425f-b025-753e4e174477" containerName="init" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877892 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0274624f-a49d-425f-b025-753e4e174477" containerName="init" Mar 08 19:55:26 crc kubenswrapper[4885]: E0308 19:55:26.877904 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0274624f-a49d-425f-b025-753e4e174477" containerName="dnsmasq-dns" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.877909 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0274624f-a49d-425f-b025-753e4e174477" containerName="dnsmasq-dns" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.878118 4885 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="0274624f-a49d-425f-b025-753e4e174477" containerName="dnsmasq-dns" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.878135 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-central-agent" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.878143 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="ceilometer-notification-agent" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.878153 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="proxy-httpd" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.878161 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" containerName="sg-core" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.879679 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.882428 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.882571 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.883030 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.885616 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:26 crc kubenswrapper[4885]: I0308 19:55:26.914203 4885 scope.go:117] "RemoveContainer" containerID="0b1f8ae1cf897e68887902e9c908567ae03c21713be181a14ae5dfb412f11087" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.044845 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-run-httpd\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.044963 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-scripts\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.045001 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.045068 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-config-data\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.045115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-log-httpd\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.045199 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.045390 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.045451 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n89rb\" (UniqueName: \"kubernetes.io/projected/6a1f465c-123b-455f-8bd8-720d3f8a4bef-kube-api-access-n89rb\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.148022 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-scripts\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.148092 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.148158 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-config-data\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.148203 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-log-httpd\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.148287 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.148441 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 
19:55:27.148498 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n89rb\" (UniqueName: \"kubernetes.io/projected/6a1f465c-123b-455f-8bd8-720d3f8a4bef-kube-api-access-n89rb\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.148560 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-run-httpd\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.149494 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-log-httpd\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.150035 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-run-httpd\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.155059 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.155564 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.156499 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-scripts\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.158120 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-config-data\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.159553 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.166863 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n89rb\" (UniqueName: \"kubernetes.io/projected/6a1f465c-123b-455f-8bd8-720d3f8a4bef-kube-api-access-n89rb\") pod \"ceilometer-0\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.206297 4885 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.395250 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0274624f-a49d-425f-b025-753e4e174477" path="/var/lib/kubelet/pods/0274624f-a49d-425f-b025-753e4e174477/volumes" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.397000 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3851acf4-e600-4f5b-86a8-2cb1c01f2d6e" path="/var/lib/kubelet/pods/3851acf4-e600-4f5b-86a8-2cb1c01f2d6e/volumes" Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.705371 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:55:27 crc kubenswrapper[4885]: I0308 19:55:27.763366 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerStarted","Data":"a1d4f26c989a88dcb6b8292fefbf9776a8ae2f04c4981fe1a6564019613a69ba"} Mar 08 19:55:28 crc kubenswrapper[4885]: I0308 19:55:28.780047 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerStarted","Data":"a5a0e7af89f0943433efc0423974aa4157ace5b596adabc6170e4373acc330a7"} Mar 08 19:55:29 crc kubenswrapper[4885]: I0308 19:55:29.790162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerStarted","Data":"7e93f87815197e303bd6f0ad768ac092887798b010a49cf9460b37861d1fc6db"} Mar 08 19:55:30 crc kubenswrapper[4885]: I0308 19:55:30.804572 4885 generic.go:334] "Generic (PLEG): container finished" podID="53478e7f-ae6d-4540-a51d-2fd03f142027" containerID="52341d43266cc07b81a420fcef2575f373be02c72919fe4d7ea1b1bbdd8174c2" exitCode=0 Mar 08 19:55:30 crc kubenswrapper[4885]: I0308 19:55:30.804642 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-l6hg9" event={"ID":"53478e7f-ae6d-4540-a51d-2fd03f142027","Type":"ContainerDied","Data":"52341d43266cc07b81a420fcef2575f373be02c72919fe4d7ea1b1bbdd8174c2"} Mar 08 19:55:30 crc kubenswrapper[4885]: I0308 19:55:30.809799 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerStarted","Data":"c26148668f63c2c808f3994e48705725bbf52e07fae581041f2a8517c972eb19"} Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.298271 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.467948 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-scripts\") pod \"53478e7f-ae6d-4540-a51d-2fd03f142027\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.468128 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rt7vh\" (UniqueName: \"kubernetes.io/projected/53478e7f-ae6d-4540-a51d-2fd03f142027-kube-api-access-rt7vh\") pod \"53478e7f-ae6d-4540-a51d-2fd03f142027\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.468170 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-combined-ca-bundle\") pod \"53478e7f-ae6d-4540-a51d-2fd03f142027\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.468347 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-config-data\") pod \"53478e7f-ae6d-4540-a51d-2fd03f142027\" (UID: \"53478e7f-ae6d-4540-a51d-2fd03f142027\") " Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.475877 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-scripts" (OuterVolumeSpecName: "scripts") pod "53478e7f-ae6d-4540-a51d-2fd03f142027" (UID: "53478e7f-ae6d-4540-a51d-2fd03f142027"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.476009 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53478e7f-ae6d-4540-a51d-2fd03f142027-kube-api-access-rt7vh" (OuterVolumeSpecName: "kube-api-access-rt7vh") pod "53478e7f-ae6d-4540-a51d-2fd03f142027" (UID: "53478e7f-ae6d-4540-a51d-2fd03f142027"). InnerVolumeSpecName "kube-api-access-rt7vh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.503492 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-config-data" (OuterVolumeSpecName: "config-data") pod "53478e7f-ae6d-4540-a51d-2fd03f142027" (UID: "53478e7f-ae6d-4540-a51d-2fd03f142027"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.504884 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53478e7f-ae6d-4540-a51d-2fd03f142027" (UID: "53478e7f-ae6d-4540-a51d-2fd03f142027"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.571557 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rt7vh\" (UniqueName: \"kubernetes.io/projected/53478e7f-ae6d-4540-a51d-2fd03f142027-kube-api-access-rt7vh\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.571627 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.571659 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.571686 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53478e7f-ae6d-4540-a51d-2fd03f142027-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.851453 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerStarted","Data":"46513b3771e23d8ed82d3b5bc73c4d07608c21fefaf4830b5055b5a4a5d6d688"} Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.851566 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.854380 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-l6hg9" event={"ID":"53478e7f-ae6d-4540-a51d-2fd03f142027","Type":"ContainerDied","Data":"4c3ffab2d2b866b89801a11076cc2cdcac79305b778a66c018ea9d5b7d4245b6"} Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.854434 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c3ffab2d2b866b89801a11076cc2cdcac79305b778a66c018ea9d5b7d4245b6" Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.854458 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-l6hg9" Mar 08 19:55:32 crc kubenswrapper[4885]: I0308 19:55:32.905869 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.843370281 podStartE2EDuration="6.905846405s" podCreationTimestamp="2026-03-08 19:55:26 +0000 UTC" firstStartedPulling="2026-03-08 19:55:27.713510281 +0000 UTC m=+1429.109564304" lastFinishedPulling="2026-03-08 19:55:31.775986385 +0000 UTC m=+1433.172040428" observedRunningTime="2026-03-08 19:55:32.893829493 +0000 UTC m=+1434.289883536" watchObservedRunningTime="2026-03-08 19:55:32.905846405 +0000 UTC m=+1434.301900448" Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.016858 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.016913 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.024119 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.051448 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.051724 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d0768dc6-7cf7-4bd7-a1de-6a68f604de14" containerName="nova-scheduler-scheduler" containerID="cri-o://9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" gracePeriod=30 Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.106461 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.106715 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-log" containerID="cri-o://aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898" gracePeriod=30 Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.107202 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-metadata" containerID="cri-o://bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552" gracePeriod=30 Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.866248 4885 generic.go:334] "Generic (PLEG): container finished" podID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerID="aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898" exitCode=143 Mar 08 19:55:33 crc kubenswrapper[4885]: I0308 19:55:33.866336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a54375d5-43ad-493f-87b8-f10b9d6f68f9","Type":"ContainerDied","Data":"aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898"} Mar 08 19:55:34 crc kubenswrapper[4885]: I0308 19:55:34.030116 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:34 crc kubenswrapper[4885]: I0308 19:55:34.030160 4885 prober.go:107] "Probe failed" 
probeType="Startup" pod="openstack/nova-api-0" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:34 crc kubenswrapper[4885]: E0308 19:55:34.526654 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 19:55:34 crc kubenswrapper[4885]: E0308 19:55:34.528763 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 19:55:34 crc kubenswrapper[4885]: E0308 19:55:34.530122 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 19:55:34 crc kubenswrapper[4885]: E0308 19:55:34.530195 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="d0768dc6-7cf7-4bd7-a1de-6a68f604de14" containerName="nova-scheduler-scheduler" Mar 08 19:55:34 crc kubenswrapper[4885]: I0308 19:55:34.879445 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-log" containerID="cri-o://442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8" gracePeriod=30 Mar 08 19:55:34 crc kubenswrapper[4885]: I0308 19:55:34.879583 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-api" containerID="cri-o://e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359" gracePeriod=30 Mar 08 19:55:35 crc kubenswrapper[4885]: I0308 19:55:35.896784 4885 generic.go:334] "Generic (PLEG): container finished" podID="006cf27f-ef31-4316-a5de-833141664964" containerID="442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8" exitCode=143 Mar 08 19:55:35 crc kubenswrapper[4885]: I0308 19:55:35.897041 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"006cf27f-ef31-4316-a5de-833141664964","Type":"ContainerDied","Data":"442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8"} Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.263895 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:59308->10.217.0.199:8775: read: connection reset by peer" Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.263974 4885 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.199:8775/\": read tcp 10.217.0.2:59300->10.217.0.199:8775: read: connection reset by peer" Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.807390 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.911575 4885 generic.go:334] "Generic (PLEG): container finished" podID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerID="bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552" exitCode=0 Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.911628 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a54375d5-43ad-493f-87b8-f10b9d6f68f9","Type":"ContainerDied","Data":"bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552"} Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.911660 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a54375d5-43ad-493f-87b8-f10b9d6f68f9","Type":"ContainerDied","Data":"72d850d89769888e1770839347c05fb2c096d5b48c6987176e4df5105daaa18d"} Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.911679 4885 scope.go:117] "RemoveContainer" containerID="bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552" Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.911700 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.939737 4885 scope.go:117] "RemoveContainer" containerID="aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898" Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.960577 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws8s4\" (UniqueName: \"kubernetes.io/projected/a54375d5-43ad-493f-87b8-f10b9d6f68f9-kube-api-access-ws8s4\") pod \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.960659 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-nova-metadata-tls-certs\") pod \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.960742 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-config-data\") pod \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.960779 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a54375d5-43ad-493f-87b8-f10b9d6f68f9-logs\") pod \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.960817 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-combined-ca-bundle\") pod 
\"a54375d5-43ad-493f-87b8-f10b9d6f68f9\" (UID: \"a54375d5-43ad-493f-87b8-f10b9d6f68f9\") " Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.961383 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a54375d5-43ad-493f-87b8-f10b9d6f68f9-logs" (OuterVolumeSpecName: "logs") pod "a54375d5-43ad-493f-87b8-f10b9d6f68f9" (UID: "a54375d5-43ad-493f-87b8-f10b9d6f68f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.961822 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a54375d5-43ad-493f-87b8-f10b9d6f68f9-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.967301 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a54375d5-43ad-493f-87b8-f10b9d6f68f9-kube-api-access-ws8s4" (OuterVolumeSpecName: "kube-api-access-ws8s4") pod "a54375d5-43ad-493f-87b8-f10b9d6f68f9" (UID: "a54375d5-43ad-493f-87b8-f10b9d6f68f9"). InnerVolumeSpecName "kube-api-access-ws8s4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.977180 4885 scope.go:117] "RemoveContainer" containerID="bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552" Mar 08 19:55:36 crc kubenswrapper[4885]: E0308 19:55:36.977830 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552\": container with ID starting with bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552 not found: ID does not exist" containerID="bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552" Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.977907 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552"} err="failed to get container status \"bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552\": rpc error: code = NotFound desc = could not find container \"bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552\": container with ID starting with bd35fddb19c3982e88d06c7438f00e7ca365f8be0075aba824e1aab719cef552 not found: ID does not exist" Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.977973 4885 scope.go:117] "RemoveContainer" containerID="aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898" Mar 08 19:55:36 crc kubenswrapper[4885]: E0308 19:55:36.978670 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898\": container with ID starting with aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898 not found: ID does not exist" containerID="aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898" Mar 08 19:55:36 crc kubenswrapper[4885]: I0308 19:55:36.978709 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898"} err="failed to get container status \"aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898\": rpc error: code = NotFound desc = could not find container 
\"aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898\": container with ID starting with aea792e65be3f92dd2aa6f4b8f94c0dc6e9aaf0c05b7680c537f6ce76fc4b898 not found: ID does not exist" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.006677 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-config-data" (OuterVolumeSpecName: "config-data") pod "a54375d5-43ad-493f-87b8-f10b9d6f68f9" (UID: "a54375d5-43ad-493f-87b8-f10b9d6f68f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.016578 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a54375d5-43ad-493f-87b8-f10b9d6f68f9" (UID: "a54375d5-43ad-493f-87b8-f10b9d6f68f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.063068 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a54375d5-43ad-493f-87b8-f10b9d6f68f9" (UID: "a54375d5-43ad-493f-87b8-f10b9d6f68f9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.063506 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws8s4\" (UniqueName: \"kubernetes.io/projected/a54375d5-43ad-493f-87b8-f10b9d6f68f9-kube-api-access-ws8s4\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.063528 4885 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.063538 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.063550 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a54375d5-43ad-493f-87b8-f10b9d6f68f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.258600 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.273876 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.285675 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:55:37 crc kubenswrapper[4885]: E0308 19:55:37.286153 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53478e7f-ae6d-4540-a51d-2fd03f142027" containerName="nova-manage" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.286170 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="53478e7f-ae6d-4540-a51d-2fd03f142027" containerName="nova-manage" Mar 08 19:55:37 crc kubenswrapper[4885]: E0308 19:55:37.286185 4885 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-metadata" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.286191 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-metadata" Mar 08 19:55:37 crc kubenswrapper[4885]: E0308 19:55:37.286217 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-log" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.286224 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-log" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.286385 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="53478e7f-ae6d-4540-a51d-2fd03f142027" containerName="nova-manage" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.286395 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-log" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.286418 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" containerName="nova-metadata-metadata" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.287383 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.292159 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.292329 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.316558 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.377697 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a54375d5-43ad-493f-87b8-f10b9d6f68f9" path="/var/lib/kubelet/pods/a54375d5-43ad-493f-87b8-f10b9d6f68f9/volumes" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.470000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdd926a8-442c-4f63-bb36-3e6a425436c2-logs\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.470045 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxxc2\" (UniqueName: \"kubernetes.io/projected/cdd926a8-442c-4f63-bb36-3e6a425436c2-kube-api-access-xxxc2\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.470105 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.470147 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.470176 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-config-data\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.571639 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-config-data\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.571805 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdd926a8-442c-4f63-bb36-3e6a425436c2-logs\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.571839 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxxc2\" (UniqueName: \"kubernetes.io/projected/cdd926a8-442c-4f63-bb36-3e6a425436c2-kube-api-access-xxxc2\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.571904 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.572037 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.572434 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdd926a8-442c-4f63-bb36-3e6a425436c2-logs\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.575015 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.576738 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-config-data\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 
crc kubenswrapper[4885]: I0308 19:55:37.581867 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.613211 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxxc2\" (UniqueName: \"kubernetes.io/projected/cdd926a8-442c-4f63-bb36-3e6a425436c2-kube-api-access-xxxc2\") pod \"nova-metadata-0\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " pod="openstack/nova-metadata-0" Mar 08 19:55:37 crc kubenswrapper[4885]: I0308 19:55:37.620941 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:55:38 crc kubenswrapper[4885]: W0308 19:55:38.158399 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdd926a8_442c_4f63_bb36_3e6a425436c2.slice/crio-94340869c04149bbf12f6aa9cc506894a0395e25d240510a2463c412aa09d8dc WatchSource:0}: Error finding container 94340869c04149bbf12f6aa9cc506894a0395e25d240510a2463c412aa09d8dc: Status 404 returned error can't find the container with id 94340869c04149bbf12f6aa9cc506894a0395e25d240510a2463c412aa09d8dc Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.164053 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.529806 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.696169 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-combined-ca-bundle\") pod \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.696424 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lzkz\" (UniqueName: \"kubernetes.io/projected/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-kube-api-access-9lzkz\") pod \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.696504 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-config-data\") pod \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\" (UID: \"d0768dc6-7cf7-4bd7-a1de-6a68f604de14\") " Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.703023 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-kube-api-access-9lzkz" (OuterVolumeSpecName: "kube-api-access-9lzkz") pod "d0768dc6-7cf7-4bd7-a1de-6a68f604de14" (UID: "d0768dc6-7cf7-4bd7-a1de-6a68f604de14"). InnerVolumeSpecName "kube-api-access-9lzkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.742006 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-config-data" (OuterVolumeSpecName: "config-data") pod "d0768dc6-7cf7-4bd7-a1de-6a68f604de14" (UID: "d0768dc6-7cf7-4bd7-a1de-6a68f604de14"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.744609 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0768dc6-7cf7-4bd7-a1de-6a68f604de14" (UID: "d0768dc6-7cf7-4bd7-a1de-6a68f604de14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.799388 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lzkz\" (UniqueName: \"kubernetes.io/projected/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-kube-api-access-9lzkz\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.799454 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.799483 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0768dc6-7cf7-4bd7-a1de-6a68f604de14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.955463 4885 generic.go:334] "Generic (PLEG): container finished" podID="d0768dc6-7cf7-4bd7-a1de-6a68f604de14" containerID="9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" exitCode=0 Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.955534 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0768dc6-7cf7-4bd7-a1de-6a68f604de14","Type":"ContainerDied","Data":"9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35"} Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.955565 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.955614 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0768dc6-7cf7-4bd7-a1de-6a68f604de14","Type":"ContainerDied","Data":"0ed53f5b10de0d8d9229c24ba782e6c571620fd87d68a50336540413e9a25108"} Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.955699 4885 scope.go:117] "RemoveContainer" containerID="9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.958698 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cdd926a8-442c-4f63-bb36-3e6a425436c2","Type":"ContainerStarted","Data":"2d8347d05b060d48223bdd22690e18395a24601df91142452e20c85edd93a56a"} Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.958750 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cdd926a8-442c-4f63-bb36-3e6a425436c2","Type":"ContainerStarted","Data":"8eea35d6899ebbfa973a2d7d5bbaae841e42e4044906216a557300884b93a37e"} Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.958764 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cdd926a8-442c-4f63-bb36-3e6a425436c2","Type":"ContainerStarted","Data":"94340869c04149bbf12f6aa9cc506894a0395e25d240510a2463c412aa09d8dc"} Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.988039 4885 scope.go:117] "RemoveContainer" containerID="9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" Mar 08 19:55:38 crc kubenswrapper[4885]: E0308 19:55:38.991017 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35\": container with ID starting with 9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35 not found: ID does not exist" containerID="9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.991083 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35"} err="failed to get container status \"9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35\": rpc error: code = NotFound desc = could not find container \"9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35\": container with ID starting with 9b794b5dcfa5e223589b3422cb66557a49c9769378bb77ef22effbafbde36b35 not found: ID does not exist" Mar 08 19:55:38 crc kubenswrapper[4885]: I0308 19:55:38.992583 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.9925451920000001 podStartE2EDuration="1.992545192s" podCreationTimestamp="2026-03-08 19:55:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:55:38.982953065 +0000 UTC m=+1440.379007088" watchObservedRunningTime="2026-03-08 19:55:38.992545192 +0000 UTC m=+1440.388599215" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.016873 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.034625 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 
19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.063291 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:55:39 crc kubenswrapper[4885]: E0308 19:55:39.064212 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0768dc6-7cf7-4bd7-a1de-6a68f604de14" containerName="nova-scheduler-scheduler" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.064237 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0768dc6-7cf7-4bd7-a1de-6a68f604de14" containerName="nova-scheduler-scheduler" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.064521 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0768dc6-7cf7-4bd7-a1de-6a68f604de14" containerName="nova-scheduler-scheduler" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.065363 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.068755 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.077221 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.212359 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-config-data\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.212469 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.212767 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkbq5\" (UniqueName: \"kubernetes.io/projected/945717bc-405f-4628-934c-66e4500f56f0-kube-api-access-nkbq5\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.315622 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-config-data\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.315885 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.316162 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkbq5\" (UniqueName: \"kubernetes.io/projected/945717bc-405f-4628-934c-66e4500f56f0-kube-api-access-nkbq5\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc 
kubenswrapper[4885]: I0308 19:55:39.323777 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-config-data\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.323886 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.338003 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkbq5\" (UniqueName: \"kubernetes.io/projected/945717bc-405f-4628-934c-66e4500f56f0-kube-api-access-nkbq5\") pod \"nova-scheduler-0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.385333 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.404407 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0768dc6-7cf7-4bd7-a1de-6a68f604de14" path="/var/lib/kubelet/pods/d0768dc6-7cf7-4bd7-a1de-6a68f604de14/volumes" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.782384 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:39 crc kubenswrapper[4885]: W0308 19:55:39.911992 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod945717bc_405f_4628_934c_66e4500f56f0.slice/crio-4a4309458de44ee77bdd68329eb6288f0e4d16b95e6497926e889647bf594a28 WatchSource:0}: Error finding container 4a4309458de44ee77bdd68329eb6288f0e4d16b95e6497926e889647bf594a28: Status 404 returned error can't find the container with id 4a4309458de44ee77bdd68329eb6288f0e4d16b95e6497926e889647bf594a28 Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.912873 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.933262 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-config-data\") pod \"006cf27f-ef31-4316-a5de-833141664964\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.933299 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-combined-ca-bundle\") pod \"006cf27f-ef31-4316-a5de-833141664964\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.933322 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-public-tls-certs\") pod \"006cf27f-ef31-4316-a5de-833141664964\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.933361 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-internal-tls-certs\") pod \"006cf27f-ef31-4316-a5de-833141664964\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.933406 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/006cf27f-ef31-4316-a5de-833141664964-logs\") pod \"006cf27f-ef31-4316-a5de-833141664964\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.933463 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5p2mj\" (UniqueName: \"kubernetes.io/projected/006cf27f-ef31-4316-a5de-833141664964-kube-api-access-5p2mj\") pod \"006cf27f-ef31-4316-a5de-833141664964\" (UID: \"006cf27f-ef31-4316-a5de-833141664964\") " Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.934625 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/006cf27f-ef31-4316-a5de-833141664964-logs" (OuterVolumeSpecName: "logs") pod "006cf27f-ef31-4316-a5de-833141664964" (UID: "006cf27f-ef31-4316-a5de-833141664964"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.937692 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/006cf27f-ef31-4316-a5de-833141664964-kube-api-access-5p2mj" (OuterVolumeSpecName: "kube-api-access-5p2mj") pod "006cf27f-ef31-4316-a5de-833141664964" (UID: "006cf27f-ef31-4316-a5de-833141664964"). InnerVolumeSpecName "kube-api-access-5p2mj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.979596 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-config-data" (OuterVolumeSpecName: "config-data") pod "006cf27f-ef31-4316-a5de-833141664964" (UID: "006cf27f-ef31-4316-a5de-833141664964"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.980518 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"945717bc-405f-4628-934c-66e4500f56f0","Type":"ContainerStarted","Data":"4a4309458de44ee77bdd68329eb6288f0e4d16b95e6497926e889647bf594a28"} Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.986262 4885 generic.go:334] "Generic (PLEG): container finished" podID="006cf27f-ef31-4316-a5de-833141664964" containerID="e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359" exitCode=0 Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.986376 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"006cf27f-ef31-4316-a5de-833141664964","Type":"ContainerDied","Data":"e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359"} Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.986440 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.986459 4885 scope.go:117] "RemoveContainer" containerID="e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359" Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.986445 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"006cf27f-ef31-4316-a5de-833141664964","Type":"ContainerDied","Data":"30344ac6a0d4ba0d54022fa56e76ff23a3c0282a327f06f069842f7bf31a4249"} Mar 08 19:55:39 crc kubenswrapper[4885]: I0308 19:55:39.991206 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "006cf27f-ef31-4316-a5de-833141664964" (UID: "006cf27f-ef31-4316-a5de-833141664964"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.014260 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "006cf27f-ef31-4316-a5de-833141664964" (UID: "006cf27f-ef31-4316-a5de-833141664964"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.019824 4885 scope.go:117] "RemoveContainer" containerID="442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.035649 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.035674 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.035683 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.035691 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/006cf27f-ef31-4316-a5de-833141664964-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.035701 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p2mj\" (UniqueName: \"kubernetes.io/projected/006cf27f-ef31-4316-a5de-833141664964-kube-api-access-5p2mj\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.039453 4885 scope.go:117] "RemoveContainer" containerID="e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359" Mar 08 19:55:40 crc kubenswrapper[4885]: E0308 19:55:40.039993 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359\": container with ID starting with e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359 not found: ID does not exist" 
containerID="e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.040049 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359"} err="failed to get container status \"e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359\": rpc error: code = NotFound desc = could not find container \"e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359\": container with ID starting with e2b09d3795be6dc660ec539dd9f0373b372c9537f42ee5b6e7dc97bdc1aa8359 not found: ID does not exist" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.040082 4885 scope.go:117] "RemoveContainer" containerID="442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8" Mar 08 19:55:40 crc kubenswrapper[4885]: E0308 19:55:40.040438 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8\": container with ID starting with 442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8 not found: ID does not exist" containerID="442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.040494 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8"} err="failed to get container status \"442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8\": rpc error: code = NotFound desc = could not find container \"442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8\": container with ID starting with 442dd8eaa5f04259e523b1bac46ef55df30b3f9b02f73526dc20f3cf885af0a8 not found: ID does not exist" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.045473 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "006cf27f-ef31-4316-a5de-833141664964" (UID: "006cf27f-ef31-4316-a5de-833141664964"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.138303 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/006cf27f-ef31-4316-a5de-833141664964-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.346844 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.372415 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.385694 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:40 crc kubenswrapper[4885]: E0308 19:55:40.386221 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-log" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.386243 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-log" Mar 08 19:55:40 crc kubenswrapper[4885]: E0308 19:55:40.386275 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-api" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.386283 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-api" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.386493 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-api" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.386519 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="006cf27f-ef31-4316-a5de-833141664964" containerName="nova-api-log" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.387741 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.389151 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.390950 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.391180 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.396436 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.549256 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.549319 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w82z5\" (UniqueName: \"kubernetes.io/projected/a083a431-5afc-4289-a5cf-625bc619465e-kube-api-access-w82z5\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.549358 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-config-data\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.549374 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a083a431-5afc-4289-a5cf-625bc619465e-logs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.549400 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.549494 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-public-tls-certs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.651200 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-config-data\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.651241 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a083a431-5afc-4289-a5cf-625bc619465e-logs\") pod \"nova-api-0\" (UID: 
\"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.651276 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.651344 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-public-tls-certs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.651391 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.651434 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w82z5\" (UniqueName: \"kubernetes.io/projected/a083a431-5afc-4289-a5cf-625bc619465e-kube-api-access-w82z5\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.652954 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a083a431-5afc-4289-a5cf-625bc619465e-logs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.656071 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-config-data\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.658419 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.658506 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.669557 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-public-tls-certs\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.671320 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w82z5\" (UniqueName: \"kubernetes.io/projected/a083a431-5afc-4289-a5cf-625bc619465e-kube-api-access-w82z5\") pod \"nova-api-0\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " 
pod="openstack/nova-api-0" Mar 08 19:55:40 crc kubenswrapper[4885]: I0308 19:55:40.777709 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:55:41 crc kubenswrapper[4885]: I0308 19:55:41.004049 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"945717bc-405f-4628-934c-66e4500f56f0","Type":"ContainerStarted","Data":"aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0"} Mar 08 19:55:41 crc kubenswrapper[4885]: I0308 19:55:41.021986 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.021969945 podStartE2EDuration="2.021969945s" podCreationTimestamp="2026-03-08 19:55:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:55:41.017547427 +0000 UTC m=+1442.413601470" watchObservedRunningTime="2026-03-08 19:55:41.021969945 +0000 UTC m=+1442.418023968" Mar 08 19:55:41 crc kubenswrapper[4885]: I0308 19:55:41.260444 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:55:41 crc kubenswrapper[4885]: W0308 19:55:41.264881 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda083a431_5afc_4289_a5cf_625bc619465e.slice/crio-5cda51b7f06b2df67ff66ec569970b5a1ee1ed8790ed67119aa132ee93bae076 WatchSource:0}: Error finding container 5cda51b7f06b2df67ff66ec569970b5a1ee1ed8790ed67119aa132ee93bae076: Status 404 returned error can't find the container with id 5cda51b7f06b2df67ff66ec569970b5a1ee1ed8790ed67119aa132ee93bae076 Mar 08 19:55:41 crc kubenswrapper[4885]: I0308 19:55:41.385909 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="006cf27f-ef31-4316-a5de-833141664964" path="/var/lib/kubelet/pods/006cf27f-ef31-4316-a5de-833141664964/volumes" Mar 08 19:55:42 crc kubenswrapper[4885]: I0308 19:55:42.015363 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a083a431-5afc-4289-a5cf-625bc619465e","Type":"ContainerStarted","Data":"4621296620fb47f88fa5eba2c20d8a6e4bdfb8042020e5c5c63fda5c073094aa"} Mar 08 19:55:42 crc kubenswrapper[4885]: I0308 19:55:42.015740 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a083a431-5afc-4289-a5cf-625bc619465e","Type":"ContainerStarted","Data":"701b0f4ea9f8a0e00b422b3381b168dbb04b81fe4ea92e28f34d36f84301009f"} Mar 08 19:55:42 crc kubenswrapper[4885]: I0308 19:55:42.015749 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a083a431-5afc-4289-a5cf-625bc619465e","Type":"ContainerStarted","Data":"5cda51b7f06b2df67ff66ec569970b5a1ee1ed8790ed67119aa132ee93bae076"} Mar 08 19:55:42 crc kubenswrapper[4885]: I0308 19:55:42.044219 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.044198473 podStartE2EDuration="2.044198473s" podCreationTimestamp="2026-03-08 19:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 19:55:42.038188252 +0000 UTC m=+1443.434242295" watchObservedRunningTime="2026-03-08 19:55:42.044198473 +0000 UTC m=+1443.440252506" Mar 08 19:55:42 crc kubenswrapper[4885]: I0308 19:55:42.621452 4885 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 19:55:42 crc kubenswrapper[4885]: I0308 19:55:42.621559 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 19:55:44 crc kubenswrapper[4885]: I0308 19:55:44.385513 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 19:55:47 crc kubenswrapper[4885]: I0308 19:55:47.621842 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 19:55:47 crc kubenswrapper[4885]: I0308 19:55:47.623189 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 19:55:48 crc kubenswrapper[4885]: I0308 19:55:48.641220 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:48 crc kubenswrapper[4885]: I0308 19:55:48.641257 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:49 crc kubenswrapper[4885]: I0308 19:55:49.387738 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 19:55:49 crc kubenswrapper[4885]: I0308 19:55:49.434339 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 19:55:50 crc kubenswrapper[4885]: I0308 19:55:50.165844 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 19:55:50 crc kubenswrapper[4885]: I0308 19:55:50.778485 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 19:55:50 crc kubenswrapper[4885]: I0308 19:55:50.778526 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 19:55:51 crc kubenswrapper[4885]: I0308 19:55:51.790175 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:51 crc kubenswrapper[4885]: I0308 19:55:51.790216 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 08 19:55:57 crc kubenswrapper[4885]: I0308 19:55:57.223138 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 08 19:55:57 crc kubenswrapper[4885]: I0308 19:55:57.636286 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 19:55:57 crc kubenswrapper[4885]: I0308 19:55:57.636756 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-metadata-0" Mar 08 19:55:57 crc kubenswrapper[4885]: I0308 19:55:57.645410 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 19:55:57 crc kubenswrapper[4885]: I0308 19:55:57.647555 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.163329 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549996-rr28z"] Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.171775 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549996-rr28z" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.175573 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.176079 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.179948 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.193307 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549996-rr28z"] Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.299087 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnkc6\" (UniqueName: \"kubernetes.io/projected/5d9eec97-1b10-4d6a-9787-e137d3c37dec-kube-api-access-bnkc6\") pod \"auto-csr-approver-29549996-rr28z\" (UID: \"5d9eec97-1b10-4d6a-9787-e137d3c37dec\") " pod="openshift-infra/auto-csr-approver-29549996-rr28z" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.401447 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnkc6\" (UniqueName: \"kubernetes.io/projected/5d9eec97-1b10-4d6a-9787-e137d3c37dec-kube-api-access-bnkc6\") pod \"auto-csr-approver-29549996-rr28z\" (UID: \"5d9eec97-1b10-4d6a-9787-e137d3c37dec\") " pod="openshift-infra/auto-csr-approver-29549996-rr28z" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.425548 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnkc6\" (UniqueName: \"kubernetes.io/projected/5d9eec97-1b10-4d6a-9787-e137d3c37dec-kube-api-access-bnkc6\") pod \"auto-csr-approver-29549996-rr28z\" (UID: \"5d9eec97-1b10-4d6a-9787-e137d3c37dec\") " pod="openshift-infra/auto-csr-approver-29549996-rr28z" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.511550 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549996-rr28z" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.812022 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.813046 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.813093 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.825332 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 19:56:00 crc kubenswrapper[4885]: I0308 19:56:00.966238 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549996-rr28z"] Mar 08 19:56:00 crc kubenswrapper[4885]: W0308 19:56:00.967078 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d9eec97_1b10_4d6a_9787_e137d3c37dec.slice/crio-45fccff8d73ed4e7f6c9cf8004ceb4c7c2a34e4d1eac6b393faaffdcbc018f8f WatchSource:0}: Error finding container 45fccff8d73ed4e7f6c9cf8004ceb4c7c2a34e4d1eac6b393faaffdcbc018f8f: Status 404 returned error can't find the container with id 45fccff8d73ed4e7f6c9cf8004ceb4c7c2a34e4d1eac6b393faaffdcbc018f8f Mar 08 19:56:01 crc kubenswrapper[4885]: I0308 19:56:01.240983 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549996-rr28z" event={"ID":"5d9eec97-1b10-4d6a-9787-e137d3c37dec","Type":"ContainerStarted","Data":"45fccff8d73ed4e7f6c9cf8004ceb4c7c2a34e4d1eac6b393faaffdcbc018f8f"} Mar 08 19:56:01 crc kubenswrapper[4885]: I0308 19:56:01.241592 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 19:56:01 crc kubenswrapper[4885]: I0308 19:56:01.254340 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 19:56:03 crc kubenswrapper[4885]: I0308 19:56:03.269642 4885 generic.go:334] "Generic (PLEG): container finished" podID="5d9eec97-1b10-4d6a-9787-e137d3c37dec" containerID="5576d8a075a0f44218f0cea569a1265805ed7bccbb93726d2adc83621dd67e49" exitCode=0 Mar 08 19:56:03 crc kubenswrapper[4885]: I0308 19:56:03.269724 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549996-rr28z" event={"ID":"5d9eec97-1b10-4d6a-9787-e137d3c37dec","Type":"ContainerDied","Data":"5576d8a075a0f44218f0cea569a1265805ed7bccbb93726d2adc83621dd67e49"} Mar 08 19:56:04 crc kubenswrapper[4885]: I0308 19:56:04.623512 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549996-rr28z" Mar 08 19:56:04 crc kubenswrapper[4885]: I0308 19:56:04.786406 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnkc6\" (UniqueName: \"kubernetes.io/projected/5d9eec97-1b10-4d6a-9787-e137d3c37dec-kube-api-access-bnkc6\") pod \"5d9eec97-1b10-4d6a-9787-e137d3c37dec\" (UID: \"5d9eec97-1b10-4d6a-9787-e137d3c37dec\") " Mar 08 19:56:04 crc kubenswrapper[4885]: I0308 19:56:04.795086 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d9eec97-1b10-4d6a-9787-e137d3c37dec-kube-api-access-bnkc6" (OuterVolumeSpecName: "kube-api-access-bnkc6") pod "5d9eec97-1b10-4d6a-9787-e137d3c37dec" (UID: "5d9eec97-1b10-4d6a-9787-e137d3c37dec"). InnerVolumeSpecName "kube-api-access-bnkc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:04 crc kubenswrapper[4885]: I0308 19:56:04.889816 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnkc6\" (UniqueName: \"kubernetes.io/projected/5d9eec97-1b10-4d6a-9787-e137d3c37dec-kube-api-access-bnkc6\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:05 crc kubenswrapper[4885]: I0308 19:56:05.293894 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549996-rr28z" event={"ID":"5d9eec97-1b10-4d6a-9787-e137d3c37dec","Type":"ContainerDied","Data":"45fccff8d73ed4e7f6c9cf8004ceb4c7c2a34e4d1eac6b393faaffdcbc018f8f"} Mar 08 19:56:05 crc kubenswrapper[4885]: I0308 19:56:05.293967 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45fccff8d73ed4e7f6c9cf8004ceb4c7c2a34e4d1eac6b393faaffdcbc018f8f" Mar 08 19:56:05 crc kubenswrapper[4885]: I0308 19:56:05.294004 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549996-rr28z" Mar 08 19:56:05 crc kubenswrapper[4885]: I0308 19:56:05.712523 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549990-8n4qs"] Mar 08 19:56:05 crc kubenswrapper[4885]: I0308 19:56:05.729600 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549990-8n4qs"] Mar 08 19:56:07 crc kubenswrapper[4885]: I0308 19:56:07.388363 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8bff80c-e537-4de5-8a05-85ee81004c30" path="/var/lib/kubelet/pods/d8bff80c-e537-4de5-8a05-85ee81004c30/volumes" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.084781 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.085414 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" containerName="openstackclient" containerID="cri-o://6670e6817995526cd80a6c1b2064f3af999a3d367e59a87b40d4c34b2c61c6e3" gracePeriod=2 Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.105420 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.173077 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-ccxvz"] Mar 08 19:56:23 crc kubenswrapper[4885]: E0308 19:56:23.173428 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9eec97-1b10-4d6a-9787-e137d3c37dec" containerName="oc" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.173445 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9eec97-1b10-4d6a-9787-e137d3c37dec" containerName="oc" Mar 08 19:56:23 crc kubenswrapper[4885]: E0308 19:56:23.173485 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" containerName="openstackclient" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.173491 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" containerName="openstackclient" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.173649 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d9eec97-1b10-4d6a-9787-e137d3c37dec" containerName="oc" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.173672 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" containerName="openstackclient" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.174440 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.186839 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.215741 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-ccxvz"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.248418 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-4khmp"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.261050 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2b5kk"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.266866 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-4khmp"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.273433 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2b5kk"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.293470 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts\") pod \"nova-cell1-cd3f-account-create-update-ccxvz\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.293614 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zhx2\" (UniqueName: \"kubernetes.io/projected/a3e6330b-4e2d-44ca-b9be-d36b2f613571-kube-api-access-7zhx2\") pod \"nova-cell1-cd3f-account-create-update-ccxvz\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.301785 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-bq8dk"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.303045 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.315968 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-m2lkt"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.317467 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.321458 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.321600 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.333398 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-qx2kg"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.350997 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-qx2kg"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.395566 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts\") pod \"nova-cell1-cd3f-account-create-update-ccxvz\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.395666 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zhx2\" (UniqueName: \"kubernetes.io/projected/a3e6330b-4e2d-44ca-b9be-d36b2f613571-kube-api-access-7zhx2\") pod \"nova-cell1-cd3f-account-create-update-ccxvz\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.396619 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts\") pod \"nova-cell1-cd3f-account-create-update-ccxvz\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.405443 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11e75774-c86c-459a-9c66-eaf3c43addac" path="/var/lib/kubelet/pods/11e75774-c86c-459a-9c66-eaf3c43addac/volumes" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.419097 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b6d8115-d92a-4305-a2d2-8d9874a81390" path="/var/lib/kubelet/pods/3b6d8115-d92a-4305-a2d2-8d9874a81390/volumes" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.419892 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ed43ce9-6f70-49b2-aa6e-50917ea9ca41" path="/var/lib/kubelet/pods/8ed43ce9-6f70-49b2-aa6e-50917ea9ca41/volumes" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.420448 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.420473 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-bq8dk"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.434387 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-86ea-account-create-update-tdtcq"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.452238 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-86ea-account-create-update-tdtcq"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.473494 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zhx2\" (UniqueName: \"kubernetes.io/projected/a3e6330b-4e2d-44ca-b9be-d36b2f613571-kube-api-access-7zhx2\") pod \"nova-cell1-cd3f-account-create-update-ccxvz\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.488211 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-m2lkt"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.506270 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.520165 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd8sz\" (UniqueName: \"kubernetes.io/projected/09db13b9-d564-49c9-b383-5fbfe0e43c9b-kube-api-access-gd8sz\") pod \"root-account-create-update-bq8dk\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.520262 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2242ad5f-8a7e-4017-8441-6d05b2c94930-operator-scripts\") pod \"nova-cell0-1b03-account-create-update-m2lkt\" (UID: \"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.520414 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hl27\" (UniqueName: \"kubernetes.io/projected/2242ad5f-8a7e-4017-8441-6d05b2c94930-kube-api-access-9hl27\") pod \"nova-cell0-1b03-account-create-update-m2lkt\" (UID: \"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.520455 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts\") pod \"root-account-create-update-bq8dk\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.641029 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd8sz\" (UniqueName: \"kubernetes.io/projected/09db13b9-d564-49c9-b383-5fbfe0e43c9b-kube-api-access-gd8sz\") pod \"root-account-create-update-bq8dk\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.641086 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2242ad5f-8a7e-4017-8441-6d05b2c94930-operator-scripts\") pod \"nova-cell0-1b03-account-create-update-m2lkt\" (UID: \"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.641151 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hl27\" (UniqueName: 
\"kubernetes.io/projected/2242ad5f-8a7e-4017-8441-6d05b2c94930-kube-api-access-9hl27\") pod \"nova-cell0-1b03-account-create-update-m2lkt\" (UID: \"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.641175 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts\") pod \"root-account-create-update-bq8dk\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: E0308 19:56:23.641301 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 08 19:56:23 crc kubenswrapper[4885]: E0308 19:56:23.641347 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data podName:01dc1fd5-4e2f-4129-9452-ed50fa1d182b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:24.141331407 +0000 UTC m=+1485.537385430 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data") pod "rabbitmq-server-0" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b") : configmap "rabbitmq-config-data" not found Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.642267 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts\") pod \"root-account-create-update-bq8dk\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.642844 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2242ad5f-8a7e-4017-8441-6d05b2c94930-operator-scripts\") pod \"nova-cell0-1b03-account-create-update-m2lkt\" (UID: \"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.681376 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-86ea-account-create-update-zb2lr"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.690151 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.697438 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.710365 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hl27\" (UniqueName: \"kubernetes.io/projected/2242ad5f-8a7e-4017-8441-6d05b2c94930-kube-api-access-9hl27\") pod \"nova-cell0-1b03-account-create-update-m2lkt\" (UID: \"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.718460 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3284-account-create-update-qht6h"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.732196 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-86ea-account-create-update-zb2lr"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.783545 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3284-account-create-update-qht6h"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.786670 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd8sz\" (UniqueName: \"kubernetes.io/projected/09db13b9-d564-49c9-b383-5fbfe0e43c9b-kube-api-access-gd8sz\") pod \"root-account-create-update-bq8dk\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.808484 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-f33b-account-create-update-vbmkj"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.809963 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.827721 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.833584 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-f33b-account-create-update-vbmkj"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.852561 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/619a568c-d0c3-408b-96c1-39a3a769d1ad-operator-scripts\") pod \"barbican-86ea-account-create-update-zb2lr\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.852619 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqv7f\" (UniqueName: \"kubernetes.io/projected/619a568c-d0c3-408b-96c1-39a3a769d1ad-kube-api-access-hqv7f\") pod \"barbican-86ea-account-create-update-zb2lr\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.852778 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-031a-account-create-update-ss6cl"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.854158 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.858412 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.886844 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-031a-account-create-update-ss6cl"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.930278 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.936054 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.941868 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-pp4rs"] Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.958312 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3508e5-7126-47b0-a598-da6515457cb7-operator-scripts\") pod \"cinder-f33b-account-create-update-vbmkj\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") " pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.958402 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36678078-1658-4edc-a256-3f0bb8d23ed8-operator-scripts\") pod \"neutron-031a-account-create-update-ss6cl\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.958467 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/619a568c-d0c3-408b-96c1-39a3a769d1ad-operator-scripts\") pod \"barbican-86ea-account-create-update-zb2lr\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.958487 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfcbl\" (UniqueName: \"kubernetes.io/projected/36678078-1658-4edc-a256-3f0bb8d23ed8-kube-api-access-rfcbl\") pod \"neutron-031a-account-create-update-ss6cl\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.958507 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqv7f\" (UniqueName: \"kubernetes.io/projected/619a568c-d0c3-408b-96c1-39a3a769d1ad-kube-api-access-hqv7f\") pod \"barbican-86ea-account-create-update-zb2lr\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.958554 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkg9d\" (UniqueName: \"kubernetes.io/projected/ce3508e5-7126-47b0-a598-da6515457cb7-kube-api-access-dkg9d\") pod \"cinder-f33b-account-create-update-vbmkj\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") " 
pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:23 crc kubenswrapper[4885]: I0308 19:56:23.964148 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/619a568c-d0c3-408b-96c1-39a3a769d1ad-operator-scripts\") pod \"barbican-86ea-account-create-update-zb2lr\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.004736 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqv7f\" (UniqueName: \"kubernetes.io/projected/619a568c-d0c3-408b-96c1-39a3a769d1ad-kube-api-access-hqv7f\") pod \"barbican-86ea-account-create-update-zb2lr\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.004800 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-5qsh8"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.005004 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-5qsh8" podUID="474d55a2-f4f0-4e46-809c-367a3110c33d" containerName="openstack-network-exporter" containerID="cri-o://e8623499ee05972629def49d746cd17bbc01095c41d8c7e431c723d1dc4187ee" gracePeriod=30 Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.027949 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mn4lz"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.062945 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3508e5-7126-47b0-a598-da6515457cb7-operator-scripts\") pod \"cinder-f33b-account-create-update-vbmkj\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") " pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.063211 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36678078-1658-4edc-a256-3f0bb8d23ed8-operator-scripts\") pod \"neutron-031a-account-create-update-ss6cl\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.063273 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfcbl\" (UniqueName: \"kubernetes.io/projected/36678078-1658-4edc-a256-3f0bb8d23ed8-kube-api-access-rfcbl\") pod \"neutron-031a-account-create-update-ss6cl\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.063310 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkg9d\" (UniqueName: \"kubernetes.io/projected/ce3508e5-7126-47b0-a598-da6515457cb7-kube-api-access-dkg9d\") pod \"cinder-f33b-account-create-update-vbmkj\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") " pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.067640 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36678078-1658-4edc-a256-3f0bb8d23ed8-operator-scripts\") pod 
\"neutron-031a-account-create-update-ss6cl\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.068172 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3508e5-7126-47b0-a598-da6515457cb7-operator-scripts\") pod \"cinder-f33b-account-create-update-vbmkj\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") " pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.098564 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f33b-account-create-update-hcg7j"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.119834 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkg9d\" (UniqueName: \"kubernetes.io/projected/ce3508e5-7126-47b0-a598-da6515457cb7-kube-api-access-dkg9d\") pod \"cinder-f33b-account-create-update-vbmkj\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") " pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.121494 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfcbl\" (UniqueName: \"kubernetes.io/projected/36678078-1658-4edc-a256-3f0bb8d23ed8-kube-api-access-rfcbl\") pod \"neutron-031a-account-create-update-ss6cl\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.129363 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f33b-account-create-update-hcg7j"] Mar 08 19:56:24 crc kubenswrapper[4885]: E0308 19:56:24.165027 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 08 19:56:24 crc kubenswrapper[4885]: E0308 19:56:24.165092 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data podName:01dc1fd5-4e2f-4129-9452-ed50fa1d182b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:25.165078034 +0000 UTC m=+1486.561132057 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data") pod "rabbitmq-server-0" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b") : configmap "rabbitmq-config-data" not found Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.181955 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.186671 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-a047-account-create-update-qrjwk"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.188170 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.193566 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.206060 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-a047-account-create-update-qrjwk"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.226076 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-031a-account-create-update-qvdpr"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.232813 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.254891 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.255420 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="ovn-northd" containerID="cri-o://e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f" gracePeriod=30 Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.255671 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="openstack-network-exporter" containerID="cri-o://619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0" gracePeriod=30 Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.272775 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.272899 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-031a-account-create-update-qvdpr"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.287572 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.307426 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-zz9c5"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.317849 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-zz9c5"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.326110 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-mn5x8"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.348033 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-mn5x8"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.363790 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-7m4ps"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.364026 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" podUID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerName="dnsmasq-dns" containerID="cri-o://8f4e92cbc965cfb3ab7ad14008b7ceea724e345981ffb88a02993256da9e6dcb" gracePeriod=10 Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.371225 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhdsh\" (UniqueName: 
\"kubernetes.io/projected/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-kube-api-access-bhdsh\") pod \"glance-a047-account-create-update-qrjwk\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.371292 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-operator-scripts\") pod \"glance-a047-account-create-update-qrjwk\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.388088 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a047-account-create-update-64r5q"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.429141 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a047-account-create-update-64r5q"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.447050 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmgdm"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.489194 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhdsh\" (UniqueName: \"kubernetes.io/projected/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-kube-api-access-bhdsh\") pod \"glance-a047-account-create-update-qrjwk\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.489366 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-operator-scripts\") pod \"glance-a047-account-create-update-qrjwk\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: E0308 19:56:24.490492 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:24 crc kubenswrapper[4885]: E0308 19:56:24.490538 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data podName:96257eac-42ec-44cf-80be-9be68c0ebb1b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:24.990523914 +0000 UTC m=+1486.386577937 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b") : configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.490707 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-operator-scripts\") pod \"glance-a047-account-create-update-qrjwk\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.524822 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-pmgdm"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.560513 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhdsh\" (UniqueName: \"kubernetes.io/projected/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-kube-api-access-bhdsh\") pod \"glance-a047-account-create-update-qrjwk\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.690523 4885 generic.go:334] "Generic (PLEG): container finished" podID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerID="8f4e92cbc965cfb3ab7ad14008b7ceea724e345981ffb88a02993256da9e6dcb" exitCode=0 Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.690885 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" event={"ID":"8185583f-0ca5-46b1-a1ed-77c35b13a07b","Type":"ContainerDied","Data":"8f4e92cbc965cfb3ab7ad14008b7ceea724e345981ffb88a02993256da9e6dcb"} Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.700178 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-l6hg9"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.794130 4885 generic.go:334] "Generic (PLEG): container finished" podID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerID="619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0" exitCode=2 Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.794168 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f1f46cb2-c95d-40f5-9acc-720e094b91bc","Type":"ContainerDied","Data":"619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0"} Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.826875 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.855806 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-l6hg9"] Mar 08 19:56:24 crc kubenswrapper[4885]: E0308 19:56:24.865516 4885 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-mn4lz" message=< Mar 08 19:56:24 crc kubenswrapper[4885]: Exiting ovn-controller (1) [ OK ] Mar 08 19:56:24 crc kubenswrapper[4885]: > Mar 08 19:56:24 crc kubenswrapper[4885]: E0308 19:56:24.865556 4885 kuberuntime_container.go:691] "PreStop hook failed" err="command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: " pod="openstack/ovn-controller-mn4lz" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerName="ovn-controller" containerID="cri-o://40b3ba3ccd4cd0fabe1a8de0a1537908216c0198be6d0bf26dce86c9b6a32605" Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.865588 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-mn4lz" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerName="ovn-controller" containerID="cri-o://40b3ba3ccd4cd0fabe1a8de0a1537908216c0198be6d0bf26dce86c9b6a32605" gracePeriod=30 Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.916189 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-fwjsc"] Mar 08 19:56:24 crc kubenswrapper[4885]: I0308 19:56:24.997548 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-fwjsc"] Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.038974 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:25 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: if [ -n "nova_cell1" ]; then Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="nova_cell1" Mar 08 19:56:25 crc kubenswrapper[4885]: else Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:25 crc kubenswrapper[4885]: fi Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:25 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:25 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:25 crc kubenswrapper[4885]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:25 crc kubenswrapper[4885]: # support updates Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.040364 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" podUID="a3e6330b-4e2d-44ca-b9be-d36b2f613571" Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.047748 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.047806 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data podName:96257eac-42ec-44cf-80be-9be68c0ebb1b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:26.047791953 +0000 UTC m=+1487.443845976 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b") : configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.054027 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-mp8c9"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.123506 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-mp8c9"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.191253 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-kpvl2"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.215066 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-kpvl2"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.231821 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-pq8mq"] Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.250627 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.250694 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data podName:01dc1fd5-4e2f-4129-9452-ed50fa1d182b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:27.250678631 +0000 UTC m=+1488.646732654 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data") pod "rabbitmq-server-0" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b") : configmap "rabbitmq-config-data" not found Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.259963 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-pq8mq"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.270790 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271240 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-server" containerID="cri-o://803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271337 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-reaper" containerID="cri-o://3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271307 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-server" containerID="cri-o://2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271359 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-server" containerID="cri-o://a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271397 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-replicator" containerID="cri-o://5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271492 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-auditor" containerID="cri-o://6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271530 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-updater" containerID="cri-o://e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271553 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-replicator" containerID="cri-o://25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271388 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" 
podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-auditor" containerID="cri-o://904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271618 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-expirer" containerID="cri-o://7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271650 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="swift-recon-cron" containerID="cri-o://9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271652 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-auditor" containerID="cri-o://37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271677 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="rsync" containerID="cri-o://64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271705 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-updater" containerID="cri-o://6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.271777 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-replicator" containerID="cri-o://d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.322167 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.322815 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="openstack-network-exporter" containerID="cri-o://e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5" gracePeriod=300 Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.374489 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:25 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc 
kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: if [ -n "nova_cell0" ]; then Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="nova_cell0" Mar 08 19:56:25 crc kubenswrapper[4885]: else Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:25 crc kubenswrapper[4885]: fi Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:25 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:25 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:25 crc kubenswrapper[4885]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:25 crc kubenswrapper[4885]: # support updates Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.375727 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" podUID="2242ad5f-8a7e-4017-8441-6d05b2c94930" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.435588 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f704685-800d-4386-a47d-8c60b0885aca" path="/var/lib/kubelet/pods/0f704685-800d-4386-a47d-8c60b0885aca/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.436326 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2efc22fd-a92b-422c-876d-7b80f06928b2" path="/var/lib/kubelet/pods/2efc22fd-a92b-422c-876d-7b80f06928b2/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.450838 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dcb0d23-1927-4f70-ac45-bcc01f9a081a" path="/var/lib/kubelet/pods/3dcb0d23-1927-4f70-ac45-bcc01f9a081a/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.451378 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46290bd2-6ad7-46f4-86f4-48aa73bc304a" path="/var/lib/kubelet/pods/46290bd2-6ad7-46f4-86f4-48aa73bc304a/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.451869 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53478e7f-ae6d-4540-a51d-2fd03f142027" path="/var/lib/kubelet/pods/53478e7f-ae6d-4540-a51d-2fd03f142027/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.456602 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57032abe-6c4f-4711-9f48-5733d6a29ec3" path="/var/lib/kubelet/pods/57032abe-6c4f-4711-9f48-5733d6a29ec3/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.457141 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.457277 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="618b5189-8b29-473f-b59c-e911fca71041" path="/var/lib/kubelet/pods/618b5189-8b29-473f-b59c-e911fca71041/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.458139 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fff4a7a-1b14-4e29-8c84-d7fc55de879c" path="/var/lib/kubelet/pods/6fff4a7a-1b14-4e29-8c84-d7fc55de879c/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.459314 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72fa5124-24e9-47b1-8522-815cfef2a86b" path="/var/lib/kubelet/pods/72fa5124-24e9-47b1-8522-815cfef2a86b/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.460063 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85d25daf-f279-4be1-be4a-75e05e47923c" path="/var/lib/kubelet/pods/85d25daf-f279-4be1-be4a-75e05e47923c/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.460747 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="915cd482-d3dc-42c1-96cc-0fcc18bbaff2" path="/var/lib/kubelet/pods/915cd482-d3dc-42c1-96cc-0fcc18bbaff2/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.465245 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b04611e7-17b5-48ae-8169-534f684a101b" path="/var/lib/kubelet/pods/b04611e7-17b5-48ae-8169-534f684a101b/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.479278 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4353f36-d8f9-41ff-8062-f874bd53ef12" path="/var/lib/kubelet/pods/e4353f36-d8f9-41ff-8062-f874bd53ef12/volumes" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.479867 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-ccxvz"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.540508 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-58c657b6d6-r4tf7"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.540761 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-58c657b6d6-r4tf7" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-log" containerID="cri-o://2ea3e6e51d477fb7795967def43fe0be522063fbf11d9053ce41fa22a8bf42b3" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.541138 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-58c657b6d6-r4tf7" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-api" containerID="cri-o://1b8b8e4856a24e16b23d4c15ef261857dfcc94531017a1b02728028102e1d5ce" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.545162 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-25qrp"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.567663 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-25qrp"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.576322 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.579239 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" 
podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="openstack-network-exporter" containerID="cri-o://f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8" gracePeriod=300 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.582859 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-sb\") pod \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.582907 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-nb\") pod \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.582941 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j727f\" (UniqueName: \"kubernetes.io/projected/8185583f-0ca5-46b1-a1ed-77c35b13a07b-kube-api-access-j727f\") pod \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.583015 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-config\") pod \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.583037 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-svc\") pod \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.583065 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-swift-storage-0\") pod \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\" (UID: \"8185583f-0ca5-46b1-a1ed-77c35b13a07b\") " Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.611523 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8185583f-0ca5-46b1-a1ed-77c35b13a07b-kube-api-access-j727f" (OuterVolumeSpecName: "kube-api-access-j727f") pod "8185583f-0ca5-46b1-a1ed-77c35b13a07b" (UID: "8185583f-0ca5-46b1-a1ed-77c35b13a07b"). InnerVolumeSpecName "kube-api-access-j727f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.622746 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.628750 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-log" containerID="cri-o://701b0f4ea9f8a0e00b422b3381b168dbb04b81fe4ea92e28f34d36f84301009f" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.630315 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-api" containerID="cri-o://4621296620fb47f88fa5eba2c20d8a6e4bdfb8042020e5c5c63fda5c073094aa" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.633226 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.709317 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j727f\" (UniqueName: \"kubernetes.io/projected/8185583f-0ca5-46b1-a1ed-77c35b13a07b-kube-api-access-j727f\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.731862 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-5fh2h"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.735838 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="ovsdbserver-nb" containerID="cri-o://4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c" gracePeriod=300 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.770200 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="ovsdbserver-sb" containerID="cri-o://1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" gracePeriod=300 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.796098 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-5fh2h"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.803335 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-m2lkt"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.816997 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-hjcwx"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.819448 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5qsh8_474d55a2-f4f0-4e46-809c-367a3110c33d/openstack-network-exporter/0.log" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.819490 4885 generic.go:334] "Generic (PLEG): container finished" podID="474d55a2-f4f0-4e46-809c-367a3110c33d" containerID="e8623499ee05972629def49d746cd17bbc01095c41d8c7e431c723d1dc4187ee" exitCode=2 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.819537 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5qsh8" event={"ID":"474d55a2-f4f0-4e46-809c-367a3110c33d","Type":"ContainerDied","Data":"e8623499ee05972629def49d746cd17bbc01095c41d8c7e431c723d1dc4187ee"} Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 
19:56:25.823586 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" event={"ID":"a3e6330b-4e2d-44ca-b9be-d36b2f613571","Type":"ContainerStarted","Data":"17603ca27d7ecde56e3c1a978d3f1cf7aefc5c4500d55699231533afe9deacd5"} Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.834665 4885 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" secret="" err="secret \"galera-openstack-cell1-dockercfg-5kqpk\" not found" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.837242 4885 generic.go:334] "Generic (PLEG): container finished" podID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerID="2ea3e6e51d477fb7795967def43fe0be522063fbf11d9053ce41fa22a8bf42b3" exitCode=143 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.837302 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58c657b6d6-r4tf7" event={"ID":"719b68df-d1ac-49e5-ac34-dfa3ba33c97f","Type":"ContainerDied","Data":"2ea3e6e51d477fb7795967def43fe0be522063fbf11d9053ce41fa22a8bf42b3"} Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.841882 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:25 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: if [ -n "nova_cell1" ]; then Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="nova_cell1" Mar 08 19:56:25 crc kubenswrapper[4885]: else Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:25 crc kubenswrapper[4885]: fi Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:25 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:25 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:25 crc kubenswrapper[4885]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:25 crc kubenswrapper[4885]: # support updates Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.842785 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5qsh8_474d55a2-f4f0-4e46-809c-367a3110c33d/openstack-network-exporter/0.log" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.842831 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.842970 4885 generic.go:334] "Generic (PLEG): container finished" podID="a083a431-5afc-4289-a5cf-625bc619465e" containerID="701b0f4ea9f8a0e00b422b3381b168dbb04b81fe4ea92e28f34d36f84301009f" exitCode=143 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.843033 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a083a431-5afc-4289-a5cf-625bc619465e","Type":"ContainerDied","Data":"701b0f4ea9f8a0e00b422b3381b168dbb04b81fe4ea92e28f34d36f84301009f"} Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.843072 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" podUID="a3e6330b-4e2d-44ca-b9be-d36b2f613571" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.854258 4885 generic.go:334] "Generic (PLEG): container finished" podID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" containerID="6670e6817995526cd80a6c1b2064f3af999a3d367e59a87b40d4c34b2c61c6e3" exitCode=137 Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.869908 4885 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Mar 08 19:56:25 crc kubenswrapper[4885]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 08 19:56:25 crc kubenswrapper[4885]: + source /usr/local/bin/container-scripts/functions Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNBridge=br-int Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNRemote=tcp:localhost:6642 Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNEncapType=geneve Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNAvailabilityZones= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ EnableChassisAsGateway=true Mar 08 19:56:25 crc kubenswrapper[4885]: ++ PhysicalNetworks= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNHostName= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 08 19:56:25 crc kubenswrapper[4885]: ++ ovs_dir=/var/lib/openvswitch Mar 08 19:56:25 crc kubenswrapper[4885]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 08 19:56:25 crc kubenswrapper[4885]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 08 19:56:25 crc kubenswrapper[4885]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + cleanup_ovsdb_server_semaphore Mar 08 19:56:25 crc kubenswrapper[4885]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 08 19:56:25 crc kubenswrapper[4885]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 08 19:56:25 crc kubenswrapper[4885]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-pp4rs" message=< Mar 08 19:56:25 crc kubenswrapper[4885]: Exiting ovsdb-server (5) ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 08 19:56:25 crc kubenswrapper[4885]: + source /usr/local/bin/container-scripts/functions Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNBridge=br-int Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNRemote=tcp:localhost:6642 Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNEncapType=geneve Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNAvailabilityZones= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ EnableChassisAsGateway=true Mar 08 19:56:25 crc kubenswrapper[4885]: ++ PhysicalNetworks= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNHostName= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 08 19:56:25 crc kubenswrapper[4885]: ++ ovs_dir=/var/lib/openvswitch Mar 08 19:56:25 crc kubenswrapper[4885]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 08 19:56:25 crc kubenswrapper[4885]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 08 19:56:25 crc kubenswrapper[4885]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + cleanup_ovsdb_server_semaphore Mar 08 19:56:25 crc kubenswrapper[4885]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 08 19:56:25 crc kubenswrapper[4885]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 08 19:56:25 crc kubenswrapper[4885]: > Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.871192 4885 kuberuntime_container.go:691] "PreStop hook failed" err=< Mar 08 19:56:25 crc kubenswrapper[4885]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Mar 08 19:56:25 crc kubenswrapper[4885]: + source /usr/local/bin/container-scripts/functions Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNBridge=br-int Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNRemote=tcp:localhost:6642 Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNEncapType=geneve Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNAvailabilityZones= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ EnableChassisAsGateway=true Mar 08 19:56:25 crc kubenswrapper[4885]: ++ PhysicalNetworks= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ OVNHostName= Mar 08 19:56:25 crc kubenswrapper[4885]: ++ DB_FILE=/etc/openvswitch/conf.db Mar 08 19:56:25 crc kubenswrapper[4885]: ++ ovs_dir=/var/lib/openvswitch Mar 08 19:56:25 crc kubenswrapper[4885]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Mar 08 19:56:25 crc kubenswrapper[4885]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Mar 08 19:56:25 crc kubenswrapper[4885]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + sleep 0.5 Mar 08 19:56:25 crc kubenswrapper[4885]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Mar 08 19:56:25 crc kubenswrapper[4885]: + cleanup_ovsdb_server_semaphore Mar 08 19:56:25 crc kubenswrapper[4885]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Mar 08 19:56:25 crc kubenswrapper[4885]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Mar 08 19:56:25 crc kubenswrapper[4885]: > pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" containerID="cri-o://a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.871822 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" containerID="cri-o://a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" gracePeriod=29 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.871233 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-hjcwx"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.871404 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8185583f-0ca5-46b1-a1ed-77c35b13a07b" (UID: "8185583f-0ca5-46b1-a1ed-77c35b13a07b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.872477 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8185583f-0ca5-46b1-a1ed-77c35b13a07b" (UID: "8185583f-0ca5-46b1-a1ed-77c35b13a07b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.872672 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.873344 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7749c44969-7m4ps" event={"ID":"8185583f-0ca5-46b1-a1ed-77c35b13a07b","Type":"ContainerDied","Data":"faaf9045f9d157b601bfa8ca719045997cf05f3ca07f75878e38d95227719aa4"} Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.873413 4885 scope.go:117] "RemoveContainer" containerID="8f4e92cbc965cfb3ab7ad14008b7ceea724e345981ffb88a02993256da9e6dcb" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.884887 4885 generic.go:334] "Generic (PLEG): container finished" podID="b8dd6448-dd16-4487-bc90-f835712effc1" containerID="f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8" exitCode=2 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.884969 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b8dd6448-dd16-4487-bc90-f835712effc1","Type":"ContainerDied","Data":"f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8"} Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.889069 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" containerID="cri-o://e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" gracePeriod=29 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.890098 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8185583f-0ca5-46b1-a1ed-77c35b13a07b" (UID: "8185583f-0ca5-46b1-a1ed-77c35b13a07b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.895147 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7 is running failed: container process not found" containerID="1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.900305 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.900567 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="cinder-scheduler" containerID="cri-o://fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.900692 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="probe" containerID="cri-o://ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.902218 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7 is running failed: container process not found" containerID="1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.903408 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7 is running failed: container process not found" containerID="1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" cmd=["/usr/bin/pidof","ovsdb-server"] Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.903437 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="ovsdbserver-sb" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.904610 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d768ed9e-b089-4308-befc-e3bd6aa68683/ovsdbserver-nb/0.log" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.904642 4885 generic.go:334] "Generic (PLEG): container finished" podID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerID="e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5" exitCode=2 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.904699 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d768ed9e-b089-4308-befc-e3bd6aa68683","Type":"ContainerDied","Data":"e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5"} Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.910097 4885 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" event={"ID":"2242ad5f-8a7e-4017-8441-6d05b2c94930","Type":"ContainerStarted","Data":"82162e312894c7024d5a8960ccd0eb1055e87a8eb4f79ab3a9a1cc489e5a6576"} Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.918566 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8185583f-0ca5-46b1-a1ed-77c35b13a07b" (UID: "8185583f-0ca5-46b1-a1ed-77c35b13a07b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.919191 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-config" (OuterVolumeSpecName: "config") pod "8185583f-0ca5-46b1-a1ed-77c35b13a07b" (UID: "8185583f-0ca5-46b1-a1ed-77c35b13a07b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.922605 4885 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.922666 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts podName:a3e6330b-4e2d-44ca-b9be-d36b2f613571 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:26.422651823 +0000 UTC m=+1487.818705846 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts") pod "nova-cell1-cd3f-account-create-update-ccxvz" (UID: "a3e6330b-4e2d-44ca-b9be-d36b2f613571") : configmap "openstack-cell1-scripts" not found Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.922931 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.923347 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.923360 4885 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.923370 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.923384 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8185583f-0ca5-46b1-a1ed-77c35b13a07b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.936037 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-m2lkt"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.942534 4885 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.943461 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.950413 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:25 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: if [ -n "nova_cell0" ]; then Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="nova_cell0" Mar 08 19:56:25 crc kubenswrapper[4885]: else Mar 08 19:56:25 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:25 crc kubenswrapper[4885]: fi Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:25 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:25 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:25 crc kubenswrapper[4885]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:25 crc kubenswrapper[4885]: # support updates Mar 08 19:56:25 crc kubenswrapper[4885]: Mar 08 19:56:25 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:25 crc kubenswrapper[4885]: E0308 19:56:25.953154 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell0-db-secret\\\" not found\"" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" podUID="2242ad5f-8a7e-4017-8441-6d05b2c94930" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.973946 4885 scope.go:117] "RemoveContainer" containerID="4a9e799e765066afa260a1cecf9172d38f7e49cda7c9f4bc8c9ce49bcef121a4" Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.975826 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-744484b5fc-g6mjz"] Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.990582 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-744484b5fc-g6mjz" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-httpd" containerID="cri-o://67f1959dc61ea688f5069874c4fb20e1a6cd0f9f33725553c532e43624ce5124" gracePeriod=30 Mar 08 19:56:25 crc kubenswrapper[4885]: I0308 19:56:25.990846 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-744484b5fc-g6mjz" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-server" containerID="cri-o://52b3a63b99137a90e03814ec54cafbe70d9ceac7f36dc9e81e362bf716f0276b" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.007971 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.008252 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api-log" containerID="cri-o://48e1f046f7d97f16af118173fbff33a7753d9f3ef98b111d3153850bfbfdaf65" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.008414 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api" containerID="cri-o://7305f11ea3e6d044101cca24b88547af01f2e1506724f8e566c2a9df42c34dc6" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.014285 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.014558 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-log" containerID="cri-o://8eea35d6899ebbfa973a2d7d5bbaae841e42e4044906216a557300884b93a37e" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.015102 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-metadata" containerID="cri-o://2d8347d05b060d48223bdd22690e18395a24601df91142452e20c85edd93a56a" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.024035 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5bb5b9c587-nd8hp"] Mar 08 
19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.024224 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5bb5b9c587-nd8hp" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-api" containerID="cri-o://17e37a1234fac68b042cb982b6be421ba7a3bd54c84d93b8bbb1842a9f1fa332" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025182 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-combined-ca-bundle\") pod \"1c223ffe-b12c-4c78-920a-66e6feb9178f\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025229 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9cs\" (UniqueName: \"kubernetes.io/projected/1c223ffe-b12c-4c78-920a-66e6feb9178f-kube-api-access-lz9cs\") pod \"1c223ffe-b12c-4c78-920a-66e6feb9178f\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025275 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run\") pod \"1c223ffe-b12c-4c78-920a-66e6feb9178f\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025305 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-log-ovn\") pod \"1c223ffe-b12c-4c78-920a-66e6feb9178f\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025320 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovn-rundir\") pod \"474d55a2-f4f0-4e46-809c-367a3110c33d\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025395 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4tlt\" (UniqueName: \"kubernetes.io/projected/474d55a2-f4f0-4e46-809c-367a3110c33d-kube-api-access-l4tlt\") pod \"474d55a2-f4f0-4e46-809c-367a3110c33d\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025416 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-metrics-certs-tls-certs\") pod \"474d55a2-f4f0-4e46-809c-367a3110c33d\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025489 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-combined-ca-bundle\") pod \"474d55a2-f4f0-4e46-809c-367a3110c33d\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025515 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474d55a2-f4f0-4e46-809c-367a3110c33d-config\") pod \"474d55a2-f4f0-4e46-809c-367a3110c33d\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " Mar 08 
19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025545 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovs-rundir\") pod \"474d55a2-f4f0-4e46-809c-367a3110c33d\" (UID: \"474d55a2-f4f0-4e46-809c-367a3110c33d\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025575 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-ovn-controller-tls-certs\") pod \"1c223ffe-b12c-4c78-920a-66e6feb9178f\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025602 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run-ovn\") pod \"1c223ffe-b12c-4c78-920a-66e6feb9178f\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.025619 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c223ffe-b12c-4c78-920a-66e6feb9178f-scripts\") pod \"1c223ffe-b12c-4c78-920a-66e6feb9178f\" (UID: \"1c223ffe-b12c-4c78-920a-66e6feb9178f\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.026284 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5bb5b9c587-nd8hp" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-httpd" containerID="cri-o://46919954f7a8695f89f60ffd2c95fd19f9f50cf97e2bbb06931bbceff7c47a47" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.026489 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run" (OuterVolumeSpecName: "var-run") pod "1c223ffe-b12c-4c78-920a-66e6feb9178f" (UID: "1c223ffe-b12c-4c78-920a-66e6feb9178f"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.026801 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "474d55a2-f4f0-4e46-809c-367a3110c33d" (UID: "474d55a2-f4f0-4e46-809c-367a3110c33d"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.026830 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1c223ffe-b12c-4c78-920a-66e6feb9178f" (UID: "1c223ffe-b12c-4c78-920a-66e6feb9178f"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.026851 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "474d55a2-f4f0-4e46-809c-367a3110c33d" (UID: "474d55a2-f4f0-4e46-809c-367a3110c33d"). InnerVolumeSpecName "ovs-rundir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.027824 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/474d55a2-f4f0-4e46-809c-367a3110c33d-config" (OuterVolumeSpecName: "config") pod "474d55a2-f4f0-4e46-809c-367a3110c33d" (UID: "474d55a2-f4f0-4e46-809c-367a3110c33d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.028514 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1c223ffe-b12c-4c78-920a-66e6feb9178f" (UID: "1c223ffe-b12c-4c78-920a-66e6feb9178f"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.046810 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/474d55a2-f4f0-4e46-809c-367a3110c33d-kube-api-access-l4tlt" (OuterVolumeSpecName: "kube-api-access-l4tlt") pod "474d55a2-f4f0-4e46-809c-367a3110c33d" (UID: "474d55a2-f4f0-4e46-809c-367a3110c33d"). InnerVolumeSpecName "kube-api-access-l4tlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060090 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060123 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060132 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060138 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060145 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060152 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060159 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060165 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060171 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060177 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060242 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060267 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060277 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060286 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060295 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060302 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060318 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060327 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.060335 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82"} Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.061150 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:26 crc kubenswrapper[4885]: container 
&Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: if [ -n "" ]; then Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="" Mar 08 19:56:26 crc kubenswrapper[4885]: else Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:26 crc kubenswrapper[4885]: fi Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:26 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:26 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:26 crc kubenswrapper[4885]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:26 crc kubenswrapper[4885]: # support updates Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.062470 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-bq8dk" podUID="09db13b9-d564-49c9-b383-5fbfe0e43c9b" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.064615 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c223ffe-b12c-4c78-920a-66e6feb9178f-scripts" (OuterVolumeSpecName: "scripts") pod "1c223ffe-b12c-4c78-920a-66e6feb9178f" (UID: "1c223ffe-b12c-4c78-920a-66e6feb9178f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.065730 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-ccxvz"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.066391 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c223ffe-b12c-4c78-920a-66e6feb9178f-kube-api-access-lz9cs" (OuterVolumeSpecName: "kube-api-access-lz9cs") pod "1c223ffe-b12c-4c78-920a-66e6feb9178f" (UID: "1c223ffe-b12c-4c78-920a-66e6feb9178f"). InnerVolumeSpecName "kube-api-access-lz9cs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.067521 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerName="galera" containerID="cri-o://db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.071231 4885 generic.go:334] "Generic (PLEG): container finished" podID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerID="40b3ba3ccd4cd0fabe1a8de0a1537908216c0198be6d0bf26dce86c9b6a32605" exitCode=0 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.071282 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-mn4lz" event={"ID":"1c223ffe-b12c-4c78-920a-66e6feb9178f","Type":"ContainerDied","Data":"40b3ba3ccd4cd0fabe1a8de0a1537908216c0198be6d0bf26dce86c9b6a32605"} Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.071312 4885 scope.go:117] "RemoveContainer" containerID="40b3ba3ccd4cd0fabe1a8de0a1537908216c0198be6d0bf26dce86c9b6a32605" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.071460 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-mn4lz" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.073201 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-b5mql"] Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.080434 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:26 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: if [ -n "barbican" ]; then Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="barbican" Mar 08 19:56:26 crc kubenswrapper[4885]: else Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:26 crc kubenswrapper[4885]: fi Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:26 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:26 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:26 crc kubenswrapper[4885]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:26 crc kubenswrapper[4885]: # support updates Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.080694 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-b5mql"] Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.083694 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-86ea-account-create-update-zb2lr" podUID="619a568c-d0c3-408b-96c1-39a3a769d1ad" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.085798 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.086043 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-log" containerID="cri-o://4e4c7d9e4404bd1a5433f1787f2f7abf1d5d2e0fd51aebb6079e0aa7c48cd16e" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.086270 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-httpd" containerID="cri-o://41d8583c3d498141cc2e38d1ed8623082609a86faa3f087e124f2692ac0c8871" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.091039 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-fdab-account-create-update-vs9sz"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.098070 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-fdab-account-create-update-vs9sz"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.109343 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-8wdm8"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.112046 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c223ffe-b12c-4c78-920a-66e6feb9178f" (UID: "1c223ffe-b12c-4c78-920a-66e6feb9178f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.130451 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config-secret\") pod \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.130730 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phmx9\" (UniqueName: \"kubernetes.io/projected/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-kube-api-access-phmx9\") pod \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.130775 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-combined-ca-bundle\") pod \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.130800 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config\") pod \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\" (UID: \"ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131207 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4tlt\" (UniqueName: \"kubernetes.io/projected/474d55a2-f4f0-4e46-809c-367a3110c33d-kube-api-access-l4tlt\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131220 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/474d55a2-f4f0-4e46-809c-367a3110c33d-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131229 4885 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovs-rundir\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131239 4885 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131247 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c223ffe-b12c-4c78-920a-66e6feb9178f-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131254 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131264 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9cs\" (UniqueName: \"kubernetes.io/projected/1c223ffe-b12c-4c78-920a-66e6feb9178f-kube-api-access-lz9cs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131272 4885 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-run\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131280 4885 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1c223ffe-b12c-4c78-920a-66e6feb9178f-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.131289 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/474d55a2-f4f0-4e46-809c-367a3110c33d-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.131340 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.131381 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data podName:96257eac-42ec-44cf-80be-9be68c0ebb1b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:28.131367856 +0000 UTC m=+1489.527421879 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b") : configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.133599 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-kube-api-access-phmx9" (OuterVolumeSpecName: "kube-api-access-phmx9") pod "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" (UID: "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84"). InnerVolumeSpecName "kube-api-access-phmx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.135912 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "474d55a2-f4f0-4e46-809c-367a3110c33d" (UID: "474d55a2-f4f0-4e46-809c-367a3110c33d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.147700 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-8wdm8"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.162161 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.162389 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-log" containerID="cri-o://2d562399fb223e806d5f3ddff2425b5e427d18a16330c90ac41a561625d41719" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.162781 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-httpd" containerID="cri-o://f5e5f2790360729d9c4394c0e85bb4e8ea8164ab35be9023623e79b3a117f852" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.164560 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" (UID: "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.198080 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" (UID: "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.213022 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" (UID: "ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.214411 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "474d55a2-f4f0-4e46-809c-367a3110c33d" (UID: "474d55a2-f4f0-4e46-809c-367a3110c33d"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.220223 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f33b-account-create-update-vbmkj"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.241903 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.242125 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.242135 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phmx9\" (UniqueName: \"kubernetes.io/projected/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-kube-api-access-phmx9\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.242144 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.242254 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.242264 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/474d55a2-f4f0-4e46-809c-367a3110c33d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.246052 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "1c223ffe-b12c-4c78-920a-66e6feb9178f" (UID: "1c223ffe-b12c-4c78-920a-66e6feb9178f"). InnerVolumeSpecName "ovn-controller-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.259695 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.277628 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-031a-account-create-update-ss6cl"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.286361 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-wphzx"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.295010 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a047-account-create-update-qrjwk"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.311869 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerName="rabbitmq" containerID="cri-o://ab00747eae0e5726409cc3faafb18065815833a98680950d3e1962529cb0f73d" gracePeriod=604800 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.315508 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-wphzx"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.324344 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-9jddz"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.338217 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d768ed9e-b089-4308-befc-e3bd6aa68683/ovsdbserver-nb/0.log" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.338277 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.343806 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/1c223ffe-b12c-4c78-920a-66e6feb9178f-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.349370 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-9jddz"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.355450 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-td7dc"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.362672 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-td7dc"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.370469 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-7jc2k"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.376943 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-86ea-account-create-update-zb2lr"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.390240 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.390453 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="9bbdf164-51e7-4faf-986b-fba5044fad2b" containerName="nova-cell1-conductor-conductor" containerID="cri-o://bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.397590 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-7jc2k"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.436680 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.436983 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="90fb4d53-4722-4f72-9f1a-99ee2b637f6e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://32768cc82a61a862691a2497facf7d05127e1de2145b441cde8a059f45152b82" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.445609 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdb-rundir\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.445732 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-config\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.445751 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.445861 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chcvg\" (UniqueName: \"kubernetes.io/projected/d768ed9e-b089-4308-befc-e3bd6aa68683-kube-api-access-chcvg\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.445884 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-combined-ca-bundle\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.445945 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-metrics-certs-tls-certs\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.445965 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdbserver-nb-tls-certs\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.446047 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-scripts\") pod \"d768ed9e-b089-4308-befc-e3bd6aa68683\" (UID: \"d768ed9e-b089-4308-befc-e3bd6aa68683\") " Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.446465 4885 configmap.go:193] Couldn't get configMap 
openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.446521 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts podName:a3e6330b-4e2d-44ca-b9be-d36b2f613571 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:27.446508432 +0000 UTC m=+1488.842562455 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts") pod "nova-cell1-cd3f-account-create-update-ccxvz" (UID: "a3e6330b-4e2d-44ca-b9be-d36b2f613571") : configmap "openstack-cell1-scripts" not found Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.447416 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.447900 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-config" (OuterVolumeSpecName: "config") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.449472 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-scripts" (OuterVolumeSpecName: "scripts") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.449710 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5b88496c9d-2g95h"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.449984 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener-log" containerID="cri-o://3a20cf21bbfb4da8c71131e4075d64b83bae96d5c5020bc3cfadcf8d7226f8bc" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.450731 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener" containerID="cri-o://6e7be97046549290741b9a7850306bb8d9be298e24617283ccb5d04dda12497f" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.474786 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.481580 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7dfc6b7fcc-dpq7t"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.482512 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker-log" containerID="cri-o://d5cd5c3527dc17515d5a33bed3c5118e0fcbd6d15187bcfb409883f29afc80a6" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.482846 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker" containerID="cri-o://b8d53aa1399bba98dc12433735d0a8b3cb69b3036f3c8fb648dbc900fdb658b2" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.498122 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.499062 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b8dd6448-dd16-4487-bc90-f835712effc1/ovsdbserver-sb/0.log" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.499152 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.504658 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d768ed9e-b089-4308-befc-e3bd6aa68683-kube-api-access-chcvg" (OuterVolumeSpecName: "kube-api-access-chcvg") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "kube-api-access-chcvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.505170 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.505292 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.509087 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.509154 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerName="galera" Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.544811 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:26 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: if [ -n "neutron" ]; then Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="neutron" Mar 08 19:56:26 crc kubenswrapper[4885]: else Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:26 crc kubenswrapper[4885]: fi Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:26 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:26 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:26 crc kubenswrapper[4885]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:26 crc kubenswrapper[4885]: # support updates Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.546604 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:26 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: if [ -n "cinder" ]; then Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="cinder" Mar 08 19:56:26 crc kubenswrapper[4885]: else Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:26 crc kubenswrapper[4885]: fi Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:26 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:26 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:26 crc kubenswrapper[4885]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:26 crc kubenswrapper[4885]: # support updates Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.547077 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"neutron-db-secret\\\" not found\"" pod="openstack/neutron-031a-account-create-update-ss6cl" podUID="36678078-1658-4edc-a256-3f0bb8d23ed8" Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.548032 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"cinder-db-secret\\\" not found\"" pod="openstack/cinder-f33b-account-create-update-vbmkj" podUID="ce3508e5-7126-47b0-a598-da6515457cb7" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.549913 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.549960 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.549969 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d768ed9e-b089-4308-befc-e3bd6aa68683-config\") on node \"crc\" DevicePath \"\"" Mar 08 
19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.549987 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.549997 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chcvg\" (UniqueName: \"kubernetes.io/projected/d768ed9e-b089-4308-befc-e3bd6aa68683-kube-api-access-chcvg\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.550026 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.554765 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.555261 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" containerName="nova-cell0-conductor-conductor" containerID="cri-o://9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: W0308 19:56:26.561253 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5ba6c20_150e_48ca_ac4a_4a6a8ef1f525.slice/crio-34abe95868deb87094ecfc507d2e30a4f3b296f2e25ac6896a50280fdc7341bb WatchSource:0}: Error finding container 34abe95868deb87094ecfc507d2e30a4f3b296f2e25ac6896a50280fdc7341bb: Status 404 returned error can't find the container with id 34abe95868deb87094ecfc507d2e30a4f3b296f2e25ac6896a50280fdc7341bb Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.573914 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.582458 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.586772 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5pkh8"] Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.591709 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:26 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: if [ -n "glance" ]; then Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="glance" Mar 08 19:56:26 crc kubenswrapper[4885]: else Mar 08 19:56:26 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:26 crc kubenswrapper[4885]: fi Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:26 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:26 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:26 crc kubenswrapper[4885]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:26 crc kubenswrapper[4885]: # support updates Mar 08 19:56:26 crc kubenswrapper[4885]: Mar 08 19:56:26 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:26 crc kubenswrapper[4885]: E0308 19:56:26.593719 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"glance-db-secret\\\" not found\"" pod="openstack/glance-a047-account-create-update-qrjwk" podUID="d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.597859 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5pkh8"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.612529 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-796cf584f6-dfmcm"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.612764 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-796cf584f6-dfmcm" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api-log" containerID="cri-o://d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.613185 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-796cf584f6-dfmcm" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api" containerID="cri-o://786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.624249 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/root-account-create-update-bq8dk"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.629766 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.629969 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="945717bc-405f-4628-934c-66e4500f56f0" containerName="nova-scheduler-scheduler" containerID="cri-o://aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" gracePeriod=30 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.633067 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "d768ed9e-b089-4308-befc-e3bd6aa68683" (UID: "d768ed9e-b089-4308-befc-e3bd6aa68683"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.639570 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663174 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-config\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663349 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdbserver-sb-tls-certs\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663378 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-scripts\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663417 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-metrics-certs-tls-certs\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663461 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk872\" (UniqueName: \"kubernetes.io/projected/b8dd6448-dd16-4487-bc90-f835712effc1-kube-api-access-wk872\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663491 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdb-rundir\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663529 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage12-crc\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.663607 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-combined-ca-bundle\") pod \"b8dd6448-dd16-4487-bc90-f835712effc1\" (UID: \"b8dd6448-dd16-4487-bc90-f835712effc1\") " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.666117 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.666145 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.666172 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d768ed9e-b089-4308-befc-e3bd6aa68683-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.669637 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.675021 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-scripts" (OuterVolumeSpecName: "scripts") pod "b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.675074 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-config" (OuterVolumeSpecName: "config") pod "b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.677308 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8dd6448-dd16-4487-bc90-f835712effc1-kube-api-access-wk872" (OuterVolumeSpecName: "kube-api-access-wk872") pod "b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "kube-api-access-wk872". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.678033 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "local-storage12-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.688274 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerName="rabbitmq" containerID="cri-o://c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e" gracePeriod=604800 Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.729191 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.763899 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bq8dk"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.767547 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.767583 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b8dd6448-dd16-4487-bc90-f835712effc1-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.767593 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk872\" (UniqueName: \"kubernetes.io/projected/b8dd6448-dd16-4487-bc90-f835712effc1-kube-api-access-wk872\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.767603 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.767633 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.767642 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.775524 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-86ea-account-create-update-zb2lr"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.783972 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-7m4ps"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.787689 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7749c44969-7m4ps"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.813562 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-mn4lz"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.815968 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod 
"b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.818049 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "b8dd6448-dd16-4487-bc90-f835712effc1" (UID: "b8dd6448-dd16-4487-bc90-f835712effc1"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.824701 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-mn4lz"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.830969 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f33b-account-create-update-vbmkj"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.839754 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-031a-account-create-update-ss6cl"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.851545 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a047-account-create-update-qrjwk"] Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.864320 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.869032 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.869133 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:26 crc kubenswrapper[4885]: I0308 19:56:26.869231 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b8dd6448-dd16-4487-bc90-f835712effc1-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.089820 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-5qsh8_474d55a2-f4f0-4e46-809c-367a3110c33d/openstack-network-exporter/0.log" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.089898 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-5qsh8" event={"ID":"474d55a2-f4f0-4e46-809c-367a3110c33d","Type":"ContainerDied","Data":"35bbd23060f341a3429a5bab6384434330b103927103bf0acea28883cf67dc65"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.089971 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-5qsh8" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.090009 4885 scope.go:117] "RemoveContainer" containerID="e8623499ee05972629def49d746cd17bbc01095c41d8c7e431c723d1dc4187ee" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.094393 4885 generic.go:334] "Generic (PLEG): container finished" podID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerID="52b3a63b99137a90e03814ec54cafbe70d9ceac7f36dc9e81e362bf716f0276b" exitCode=0 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.094433 4885 generic.go:334] "Generic (PLEG): container finished" podID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerID="67f1959dc61ea688f5069874c4fb20e1a6cd0f9f33725553c532e43624ce5124" exitCode=0 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.094487 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-744484b5fc-g6mjz" event={"ID":"60f9821e-e554-4594-bfb2-9521cd3c171a","Type":"ContainerDied","Data":"52b3a63b99137a90e03814ec54cafbe70d9ceac7f36dc9e81e362bf716f0276b"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.094515 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-744484b5fc-g6mjz" event={"ID":"60f9821e-e554-4594-bfb2-9521cd3c171a","Type":"ContainerDied","Data":"67f1959dc61ea688f5069874c4fb20e1a6cd0f9f33725553c532e43624ce5124"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.100651 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a083cf5-4ca2-440c-840a-6b159151609f" containerID="d5cd5c3527dc17515d5a33bed3c5118e0fcbd6d15187bcfb409883f29afc80a6" exitCode=143 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.100696 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" event={"ID":"2a083cf5-4ca2-440c-840a-6b159151609f","Type":"ContainerDied","Data":"d5cd5c3527dc17515d5a33bed3c5118e0fcbd6d15187bcfb409883f29afc80a6"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.104136 4885 generic.go:334] "Generic (PLEG): container finished" podID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" exitCode=0 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.104258 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pp4rs" event={"ID":"88c2918a-548b-4b78-a34c-2aa2969ee2cd","Type":"ContainerDied","Data":"a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.107528 4885 generic.go:334] "Generic (PLEG): container finished" podID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerID="4e4c7d9e4404bd1a5433f1787f2f7abf1d5d2e0fd51aebb6079e0aa7c48cd16e" exitCode=143 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.107662 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4ca493a-f707-45c3-b457-1a1053c3dfe5","Type":"ContainerDied","Data":"4e4c7d9e4404bd1a5433f1787f2f7abf1d5d2e0fd51aebb6079e0aa7c48cd16e"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.109782 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bq8dk" event={"ID":"09db13b9-d564-49c9-b383-5fbfe0e43c9b","Type":"ContainerStarted","Data":"8eb8421d6a99d9718a9f12940cde328117f97ad0e5bd5e92cde85c15eeb1502b"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.110265 4885 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may 
not succeed." pod="openstack/root-account-create-update-bq8dk" secret="" err="secret \"galera-openstack-cell1-dockercfg-5kqpk\" not found" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.121705 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:27 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: if [ -n "" ]; then Mar 08 19:56:27 crc kubenswrapper[4885]: GRANT_DATABASE="" Mar 08 19:56:27 crc kubenswrapper[4885]: else Mar 08 19:56:27 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:27 crc kubenswrapper[4885]: fi Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:27 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:27 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:27 crc kubenswrapper[4885]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:27 crc kubenswrapper[4885]: # support updates Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.123264 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"openstack-cell1-mariadb-root-db-secret\\\" not found\"" pod="openstack/root-account-create-update-bq8dk" podUID="09db13b9-d564-49c9-b383-5fbfe0e43c9b" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.125000 4885 generic.go:334] "Generic (PLEG): container finished" podID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerID="2d562399fb223e806d5f3ddff2425b5e427d18a16330c90ac41a561625d41719" exitCode=143 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.125047 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50b429e9-fb10-48ba-b15c-ec25d57e707a","Type":"ContainerDied","Data":"2d562399fb223e806d5f3ddff2425b5e427d18a16330c90ac41a561625d41719"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.137204 4885 generic.go:334] "Generic (PLEG): container finished" podID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerID="db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d" exitCode=0 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.137273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"925797ff-e1b0-4df7-83db-2091264a4bb8","Type":"ContainerDied","Data":"db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.142771 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerID="ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5" exitCode=0 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.142849 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13df70e2-1a9e-4d81-b23b-c461291bce93","Type":"ContainerDied","Data":"ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.154108 4885 generic.go:334] "Generic (PLEG): container finished" podID="a7268474-e124-4139-bf24-6b3f605b9511" containerID="3a20cf21bbfb4da8c71131e4075d64b83bae96d5c5020bc3cfadcf8d7226f8bc" exitCode=143 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.154222 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" event={"ID":"a7268474-e124-4139-bf24-6b3f605b9511","Type":"ContainerDied","Data":"3a20cf21bbfb4da8c71131e4075d64b83bae96d5c5020bc3cfadcf8d7226f8bc"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.164182 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-5qsh8"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.166512 4885 generic.go:334] "Generic (PLEG): container finished" podID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerID="46919954f7a8695f89f60ffd2c95fd19f9f50cf97e2bbb06931bbceff7c47a47" exitCode=0 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.166579 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb5b9c587-nd8hp" event={"ID":"d1b91750-253e-46eb-9a1c-f7208dab2496","Type":"ContainerDied","Data":"46919954f7a8695f89f60ffd2c95fd19f9f50cf97e2bbb06931bbceff7c47a47"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.173596 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a047-account-create-update-qrjwk" event={"ID":"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525","Type":"ContainerStarted","Data":"34abe95868deb87094ecfc507d2e30a4f3b296f2e25ac6896a50280fdc7341bb"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.176391 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f33b-account-create-update-vbmkj" event={"ID":"ce3508e5-7126-47b0-a598-da6515457cb7","Type":"ContainerStarted","Data":"92b9ea85ebe6946aa9aeb6553980bbeaa2d164cb127da6a8cd1b41eb2565b1b7"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.180404 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-5qsh8"] Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.180955 4885 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.181874 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts podName:09db13b9-d564-49c9-b383-5fbfe0e43c9b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:27.681858399 +0000 UTC m=+1489.077912422 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts") pod "root-account-create-update-bq8dk" (UID: "09db13b9-d564-49c9-b383-5fbfe0e43c9b") : configmap "openstack-cell1-scripts" not found Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.186361 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86ea-account-create-update-zb2lr" event={"ID":"619a568c-d0c3-408b-96c1-39a3a769d1ad","Type":"ContainerStarted","Data":"ecad256f4e94f24cfab1fe144e2c009ebd76a6803d21f421f8c4cfc3079aa401"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.190876 4885 generic.go:334] "Generic (PLEG): container finished" podID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerID="8eea35d6899ebbfa973a2d7d5bbaae841e42e4044906216a557300884b93a37e" exitCode=143 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.190957 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cdd926a8-442c-4f63-bb36-3e6a425436c2","Type":"ContainerDied","Data":"8eea35d6899ebbfa973a2d7d5bbaae841e42e4044906216a557300884b93a37e"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.191086 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.221374 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_b8dd6448-dd16-4487-bc90-f835712effc1/ovsdbserver-sb/0.log" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.221424 4885 generic.go:334] "Generic (PLEG): container finished" podID="b8dd6448-dd16-4487-bc90-f835712effc1" containerID="1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" exitCode=143 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.221534 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.221964 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b8dd6448-dd16-4487-bc90-f835712effc1","Type":"ContainerDied","Data":"1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.221990 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"b8dd6448-dd16-4487-bc90-f835712effc1","Type":"ContainerDied","Data":"27094a5eecfea3bd81d2314594b8cfdb03f329abe60f33c847c8c969d4747a0d"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.254360 4885 generic.go:334] "Generic (PLEG): container finished" podID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerID="d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b" exitCode=143 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.254434 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-796cf584f6-dfmcm" event={"ID":"35e55887-f8af-4c57-820d-c46d0ee9cd9f","Type":"ContainerDied","Data":"d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.281511 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-operator-scripts\") pod \"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.281556 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jqgd\" (UniqueName: \"kubernetes.io/projected/925797ff-e1b0-4df7-83db-2091264a4bb8-kube-api-access-8jqgd\") pod \"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.281614 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-generated\") pod \"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.281647 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-galera-tls-certs\") pod \"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.281669 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-combined-ca-bundle\") pod \"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.281716 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-kolla-config\") pod \"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.281754 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.281826 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-default\") pod \"925797ff-e1b0-4df7-83db-2091264a4bb8\" (UID: \"925797ff-e1b0-4df7-83db-2091264a4bb8\") " Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.282369 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.282419 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data podName:01dc1fd5-4e2f-4129-9452-ed50fa1d182b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:31.282405515 +0000 UTC m=+1492.678459538 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data") pod "rabbitmq-server-0" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b") : configmap "rabbitmq-config-data" not found Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.283444 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.286025 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.286608 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-031a-account-create-update-ss6cl" event={"ID":"36678078-1658-4edc-a256-3f0bb8d23ed8","Type":"ContainerStarted","Data":"81ac023e6b35f76413c8e20a66666b09686bcf823a6a24f94f4a77215fb4b2b2"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.288069 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.288866 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.304794 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "mysql-db") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.336593 4885 generic.go:334] "Generic (PLEG): container finished" podID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerID="48e1f046f7d97f16af118173fbff33a7753d9f3ef98b111d3153850bfbfdaf65" exitCode=143 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.336686 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64baa35e-d1c2-48fe-a7a1-d0a4d1485908","Type":"ContainerDied","Data":"48e1f046f7d97f16af118173fbff33a7753d9f3ef98b111d3153850bfbfdaf65"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.336722 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925797ff-e1b0-4df7-83db-2091264a4bb8-kube-api-access-8jqgd" (OuterVolumeSpecName: "kube-api-access-8jqgd") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "kube-api-access-8jqgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.336733 4885 scope.go:117] "RemoveContainer" containerID="f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.337796 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.338165 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:27 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: if [ -n "barbican" ]; then Mar 08 19:56:27 crc kubenswrapper[4885]: GRANT_DATABASE="barbican" Mar 08 19:56:27 crc kubenswrapper[4885]: else Mar 08 19:56:27 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:27 crc kubenswrapper[4885]: fi Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:27 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:27 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:27 crc kubenswrapper[4885]: # 3. 
create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:27 crc kubenswrapper[4885]: # support updates Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.339808 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"barbican-db-secret\\\" not found\"" pod="openstack/barbican-86ea-account-create-update-zb2lr" podUID="619a568c-d0c3-408b-96c1-39a3a769d1ad" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.354732 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.364094 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.379239 4885 scope.go:117] "RemoveContainer" containerID="1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.382163 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.385378 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.386342 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jqgd\" (UniqueName: \"kubernetes.io/projected/925797ff-e1b0-4df7-83db-2091264a4bb8-kube-api-access-8jqgd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.386383 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.386395 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.386408 4885 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.386430 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.386455 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/925797ff-e1b0-4df7-83db-2091264a4bb8-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.389693 4885 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.393431 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.399836 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15d49c2d-56cf-46e9-b0e9-c5aac516fdf7" path="/var/lib/kubelet/pods/15d49c2d-56cf-46e9-b0e9-c5aac516fdf7/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.400400 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" path="/var/lib/kubelet/pods/1c223ffe-b12c-4c78-920a-66e6feb9178f/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.401055 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef" path="/var/lib/kubelet/pods/1ea65f9e-cbf1-47a6-8800-aa6b7fe9ffef/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.402020 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="474d55a2-f4f0-4e46-809c-367a3110c33d" path="/var/lib/kubelet/pods/474d55a2-f4f0-4e46-809c-367a3110c33d/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.402497 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a7dd20b-387a-4061-ab5a-a53ee6a240ef" path="/var/lib/kubelet/pods/4a7dd20b-387a-4061-ab5a-a53ee6a240ef/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.403002 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f0edc25-2cc1-4111-96e3-3807e6463d57" path="/var/lib/kubelet/pods/5f0edc25-2cc1-4111-96e3-3807e6463d57/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.403510 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="761f5c93-2ed3-43f0-acaf-ee92d0719ec3" path="/var/lib/kubelet/pods/761f5c93-2ed3-43f0-acaf-ee92d0719ec3/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.406359 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "925797ff-e1b0-4df7-83db-2091264a4bb8" (UID: "925797ff-e1b0-4df7-83db-2091264a4bb8"). InnerVolumeSpecName "galera-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.409450 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c" path="/var/lib/kubelet/pods/7f6b5122-d4c7-491c-9b00-dd4eef0e3a0c/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.409958 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" path="/var/lib/kubelet/pods/8185583f-0ca5-46b1-a1ed-77c35b13a07b/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.410480 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84470b78-5e74-473c-88d3-5343943c01fb" path="/var/lib/kubelet/pods/84470b78-5e74-473c-88d3-5343943c01fb/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.420062 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92191eaa-0c0a-4927-adf4-a4e386ed2552" path="/var/lib/kubelet/pods/92191eaa-0c0a-4927-adf4-a4e386ed2552/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.420659 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44" path="/var/lib/kubelet/pods/a9cdd234-0e3f-4bd4-9382-1f4ca59aeb44/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.422626 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" path="/var/lib/kubelet/pods/b8dd6448-dd16-4487-bc90-f835712effc1/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.423201 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbf04089-b7ff-4c4d-acad-f41d45ac6bfb" path="/var/lib/kubelet/pods/dbf04089-b7ff-4c4d-acad-f41d45ac6bfb/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.423709 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84" path="/var/lib/kubelet/pods/ddd62ab9-bb59-47ef-b639-fd0a0a4c4b84/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.429466 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e437f837-ac56-4b1a-b7ec-7a22cf98c8b3" path="/var/lib/kubelet/pods/e437f837-ac56-4b1a-b7ec-7a22cf98c8b3/volumes" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.442547 4885 scope.go:117] "RemoveContainer" containerID="f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.463653 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8\": container with ID starting with f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8 not found: ID does not exist" containerID="f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.463716 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8"} err="failed to get container status \"f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8\": rpc error: code = NotFound desc = could not find container \"f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8\": container with ID starting with f091f4bc961e2987b72ec45d7d195e9247086d3bb76b496ee88fa39c714f31e8 not found: ID does not exist" Mar 08 
19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.463745 4885 scope.go:117] "RemoveContainer" containerID="1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.466839 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7\": container with ID starting with 1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7 not found: ID does not exist" containerID="1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.466870 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7"} err="failed to get container status \"1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7\": rpc error: code = NotFound desc = could not find container \"1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7\": container with ID starting with 1a5202bbfaccb833bb098be6bb6c7c9b768ad5822523c39fcc8c3dd46431d3f7 not found: ID does not exist" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.466887 4885 scope.go:117] "RemoveContainer" containerID="6670e6817995526cd80a6c1b2064f3af999a3d367e59a87b40d4c34b2c61c6e3" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479188 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb" exitCode=0 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479241 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd" exitCode=0 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479248 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc" exitCode=0 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479255 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542" exitCode=0 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479305 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479328 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479340 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.479349 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.481107 4885 generic.go:334] "Generic (PLEG): container finished" podID="90fb4d53-4722-4f72-9f1a-99ee2b637f6e" containerID="32768cc82a61a862691a2497facf7d05127e1de2145b441cde8a059f45152b82" exitCode=0 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.481155 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"90fb4d53-4722-4f72-9f1a-99ee2b637f6e","Type":"ContainerDied","Data":"32768cc82a61a862691a2497facf7d05127e1de2145b441cde8a059f45152b82"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.481211 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.484691 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d768ed9e-b089-4308-befc-e3bd6aa68683/ovsdbserver-nb/0.log" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.484723 4885 generic.go:334] "Generic (PLEG): container finished" podID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerID="4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c" exitCode=143 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.485240 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.486336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d768ed9e-b089-4308-befc-e3bd6aa68683","Type":"ContainerDied","Data":"4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.486364 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d768ed9e-b089-4308-befc-e3bd6aa68683","Type":"ContainerDied","Data":"d04c91ea8c65ff23403733626ccc4e0944e79d09733c79a70fe91cced28380f8"} Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.486561 4885 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" secret="" err="secret \"galera-openstack-cell1-dockercfg-5kqpk\" not found" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.489449 4885 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 08 19:56:27 crc kubenswrapper[4885]: container &Container{Name:mariadb-account-create-update,Image:quay.io/podified-antelope-centos9/openstack-mariadb@sha256:763d1f1e8a1cf877c151c59609960fd2fa29e7e50001f8818122a2d51878befa,Command:[/bin/sh -c #!/bin/bash Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: MYSQL_REMOTE_HOST="" source /var/lib/operator-scripts/mysql_root_auth.sh Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: export DatabasePassword=${DatabasePassword:?"Please specify a DatabasePassword variable."} Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: MYSQL_CMD="mysql -h -u root -P 3306" Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: if [ -n "nova_cell1" ]; then Mar 08 19:56:27 crc kubenswrapper[4885]: GRANT_DATABASE="nova_cell1" Mar 08 19:56:27 crc kubenswrapper[4885]: else Mar 08 19:56:27 crc kubenswrapper[4885]: GRANT_DATABASE="*" Mar 08 19:56:27 crc kubenswrapper[4885]: fi Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: # going for maximum compatibility here: Mar 08 19:56:27 crc kubenswrapper[4885]: # 1. MySQL 8 no longer allows implicit create user when GRANT is used Mar 08 19:56:27 crc kubenswrapper[4885]: # 2. MariaDB has "CREATE OR REPLACE", but MySQL does not Mar 08 19:56:27 crc kubenswrapper[4885]: # 3. create user with CREATE but then do all password and TLS with ALTER to Mar 08 19:56:27 crc kubenswrapper[4885]: # support updates Mar 08 19:56:27 crc kubenswrapper[4885]: Mar 08 19:56:27 crc kubenswrapper[4885]: $MYSQL_CMD < logger="UnhandledError" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.490991 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-create-update\" with CreateContainerConfigError: \"secret \\\"nova-cell1-db-secret\\\" not found\"" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" podUID="a3e6330b-4e2d-44ca-b9be-d36b2f613571" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.495308 4885 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/925797ff-e1b0-4df7-83db-2091264a4bb8-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.496316 4885 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.496425 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts podName:a3e6330b-4e2d-44ca-b9be-d36b2f613571 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:29.49640977 +0000 UTC m=+1490.892463793 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts") pod "nova-cell1-cd3f-account-create-update-ccxvz" (UID: "a3e6330b-4e2d-44ca-b9be-d36b2f613571") : configmap "openstack-cell1-scripts" not found Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.524910 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.607806 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-public-tls-certs\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.610748 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-config-data\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.611972 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-etc-swift\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.612005 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-config-data\") pod \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.612046 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-run-httpd\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.612065 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-log-httpd\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.612086 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-vencrypt-tls-certs\") pod \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.612120 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-internal-tls-certs\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.614064 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-nova-novncproxy-tls-certs\") pod \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.614126 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-combined-ca-bundle\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.614163 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxpr8\" (UniqueName: \"kubernetes.io/projected/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-kube-api-access-qxpr8\") pod \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.614200 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsh5k\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-kube-api-access-jsh5k\") pod \"60f9821e-e554-4594-bfb2-9521cd3c171a\" (UID: \"60f9821e-e554-4594-bfb2-9521cd3c171a\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.614477 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-combined-ca-bundle\") pod \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\" (UID: \"90fb4d53-4722-4f72-9f1a-99ee2b637f6e\") " Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.616251 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.617350 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.617374 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.637588 4885 scope.go:117] "RemoveContainer" containerID="32768cc82a61a862691a2497facf7d05127e1de2145b441cde8a059f45152b82" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.637745 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.647218 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.647555 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-central-agent" containerID="cri-o://a5a0e7af89f0943433efc0423974aa4157ace5b596adabc6170e4373acc330a7" gracePeriod=30 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.650185 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-notification-agent" containerID="cri-o://7e93f87815197e303bd6f0ad768ac092887798b010a49cf9460b37861d1fc6db" gracePeriod=30 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.650368 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="sg-core" containerID="cri-o://c26148668f63c2c808f3994e48705725bbf52e07fae581041f2a8517c972eb19" gracePeriod=30 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.650954 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="proxy-httpd" containerID="cri-o://46513b3771e23d8ed82d3b5bc73c4d07608c21fefaf4830b5055b5a4a5d6d688" gracePeriod=30 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.680437 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.693253 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-kube-api-access-jsh5k" (OuterVolumeSpecName: "kube-api-access-jsh5k") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "kube-api-access-jsh5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.711839 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-kube-api-access-qxpr8" (OuterVolumeSpecName: "kube-api-access-qxpr8") pod "90fb4d53-4722-4f72-9f1a-99ee2b637f6e" (UID: "90fb4d53-4722-4f72-9f1a-99ee2b637f6e"). InnerVolumeSpecName "kube-api-access-qxpr8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.730249 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60f9821e-e554-4594-bfb2-9521cd3c171a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.730290 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxpr8\" (UniqueName: \"kubernetes.io/projected/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-kube-api-access-qxpr8\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.730300 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsh5k\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-kube-api-access-jsh5k\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.730372 4885 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.730423 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts podName:09db13b9-d564-49c9-b383-5fbfe0e43c9b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:28.730408286 +0000 UTC m=+1490.126462309 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts") pod "root-account-create-update-bq8dk" (UID: "09db13b9-d564-49c9-b383-5fbfe0e43c9b") : configmap "openstack-cell1-scripts" not found Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.791472 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.800821 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.817041 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.817276 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="63c3ea8e-9683-45b9-805b-d1049840b0da" containerName="kube-state-metrics" containerID="cri-o://72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b" gracePeriod=30 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.841406 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.841650 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="da1d62ba-4033-4906-87c1-d673c1ab8637" containerName="memcached" containerID="cri-o://f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb" gracePeriod=30 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.857399 4885 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/60f9821e-e554-4594-bfb2-9521cd3c171a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.861094 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3705-account-create-update-2brz9"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.867597 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3705-account-create-update-2brz9"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.875584 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-zfp8t"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.880519 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-zfp8t"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.889892 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-j97wh"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.899803 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3705-account-create-update-c72qw"] Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900266 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90fb4d53-4722-4f72-9f1a-99ee2b637f6e" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900288 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="90fb4d53-4722-4f72-9f1a-99ee2b637f6e" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900305 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerName="dnsmasq-dns" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900312 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerName="dnsmasq-dns" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900321 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerName="init" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900328 4885 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerName="init" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900337 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="openstack-network-exporter" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900343 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="openstack-network-exporter" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900353 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="474d55a2-f4f0-4e46-809c-367a3110c33d" containerName="openstack-network-exporter" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900359 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="474d55a2-f4f0-4e46-809c-367a3110c33d" containerName="openstack-network-exporter" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900367 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="ovsdbserver-nb" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900372 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="ovsdbserver-nb" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900381 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-httpd" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900387 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-httpd" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900400 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerName="mysql-bootstrap" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900407 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerName="mysql-bootstrap" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900416 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerName="galera" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900422 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerName="galera" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900442 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerName="ovn-controller" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900448 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerName="ovn-controller" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900458 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="ovsdbserver-sb" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900464 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="ovsdbserver-sb" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900474 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-server" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 
19:56:27.900479 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-server" Mar 08 19:56:27 crc kubenswrapper[4885]: E0308 19:56:27.900494 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="openstack-network-exporter" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900499 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="openstack-network-exporter" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900673 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="474d55a2-f4f0-4e46-809c-367a3110c33d" containerName="openstack-network-exporter" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900682 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8185583f-0ca5-46b1-a1ed-77c35b13a07b" containerName="dnsmasq-dns" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900690 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-server" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900700 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c223ffe-b12c-4c78-920a-66e6feb9178f" containerName="ovn-controller" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900708 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="ovsdbserver-sb" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900720 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="ovsdbserver-nb" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900734 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" containerName="proxy-httpd" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900744 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" containerName="galera" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900755 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" containerName="openstack-network-exporter" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900762 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8dd6448-dd16-4487-bc90-f835712effc1" containerName="openstack-network-exporter" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.900772 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="90fb4d53-4722-4f72-9f1a-99ee2b637f6e" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.902096 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.902715 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.910236 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-j97wh"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.912257 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90fb4d53-4722-4f72-9f1a-99ee2b637f6e" (UID: "90fb4d53-4722-4f72-9f1a-99ee2b637f6e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.915600 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.924836 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-574d5c476f-sq4hm"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.925059 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-574d5c476f-sq4hm" podUID="1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" containerName="keystone-api" containerID="cri-o://0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421" gracePeriod=30 Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.939754 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3705-account-create-update-c72qw"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.948653 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.959755 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.959781 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.962970 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-ll64z"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.968601 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3705-account-create-update-c72qw"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.976689 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.978282 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-ll64z"] Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.980656 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "90fb4d53-4722-4f72-9f1a-99ee2b637f6e" (UID: "90fb4d53-4722-4f72-9f1a-99ee2b637f6e"). 
InnerVolumeSpecName "vencrypt-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.986166 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-config-data" (OuterVolumeSpecName: "config-data") pod "90fb4d53-4722-4f72-9f1a-99ee2b637f6e" (UID: "90fb4d53-4722-4f72-9f1a-99ee2b637f6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.987515 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "90fb4d53-4722-4f72-9f1a-99ee2b637f6e" (UID: "90fb4d53-4722-4f72-9f1a-99ee2b637f6e"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:27 crc kubenswrapper[4885]: I0308 19:56:27.997275 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.003945 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.005473 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-config-data" (OuterVolumeSpecName: "config-data") pod "60f9821e-e554-4594-bfb2-9521cd3c171a" (UID: "60f9821e-e554-4594-bfb2-9521cd3c171a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.005706 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-k2xgr operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/keystone-3705-account-create-update-c72qw" podUID="9e649171-680c-445a-b418-734a5c7322e3" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.008669 4885 scope.go:117] "RemoveContainer" containerID="e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.009546 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.064601 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3508e5-7126-47b0-a598-da6515457cb7-operator-scripts\") pod \"ce3508e5-7126-47b0-a598-da6515457cb7\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.064885 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs4qz\" (UniqueName: \"kubernetes.io/projected/9bbdf164-51e7-4faf-986b-fba5044fad2b-kube-api-access-hs4qz\") pod \"9bbdf164-51e7-4faf-986b-fba5044fad2b\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.064935 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkg9d\" (UniqueName: \"kubernetes.io/projected/ce3508e5-7126-47b0-a598-da6515457cb7-kube-api-access-dkg9d\") pod \"ce3508e5-7126-47b0-a598-da6515457cb7\" (UID: \"ce3508e5-7126-47b0-a598-da6515457cb7\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065014 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-config-data\") pod \"9bbdf164-51e7-4faf-986b-fba5044fad2b\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065103 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2xgr\" (UniqueName: \"kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr\") pod \"keystone-3705-account-create-update-c72qw\" (UID: \"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065152 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts\") pod \"keystone-3705-account-create-update-c72qw\" (UID: \"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065345 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065362 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065371 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065379 4885 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065388 4885 
reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/90fb4d53-4722-4f72-9f1a-99ee2b637f6e-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065398 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60f9821e-e554-4594-bfb2-9521cd3c171a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.065787 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce3508e5-7126-47b0-a598-da6515457cb7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce3508e5-7126-47b0-a598-da6515457cb7" (UID: "ce3508e5-7126-47b0-a598-da6515457cb7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.073842 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bbdf164-51e7-4faf-986b-fba5044fad2b-kube-api-access-hs4qz" (OuterVolumeSpecName: "kube-api-access-hs4qz") pod "9bbdf164-51e7-4faf-986b-fba5044fad2b" (UID: "9bbdf164-51e7-4faf-986b-fba5044fad2b"). InnerVolumeSpecName "kube-api-access-hs4qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.104895 4885 scope.go:117] "RemoveContainer" containerID="4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.106834 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce3508e5-7126-47b0-a598-da6515457cb7-kube-api-access-dkg9d" (OuterVolumeSpecName: "kube-api-access-dkg9d") pod "ce3508e5-7126-47b0-a598-da6515457cb7" (UID: "ce3508e5-7126-47b0-a598-da6515457cb7"). InnerVolumeSpecName "kube-api-access-dkg9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.137124 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-config-data" (OuterVolumeSpecName: "config-data") pod "9bbdf164-51e7-4faf-986b-fba5044fad2b" (UID: "9bbdf164-51e7-4faf-986b-fba5044fad2b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.152686 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.155180 4885 scope.go:117] "RemoveContainer" containerID="e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.155602 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5\": container with ID starting with e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5 not found: ID does not exist" containerID="e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.155641 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5"} err="failed to get container status \"e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5\": rpc error: code = NotFound desc = could not find container \"e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5\": container with ID starting with e5809b1046bf2457d118e13d4918cf24d791ba3c263a168e5b9035a9c6c9b0c5 not found: ID does not exist" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.155669 4885 scope.go:117] "RemoveContainer" containerID="4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.155885 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c\": container with ID starting with 4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c not found: ID does not exist" containerID="4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.155904 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c"} err="failed to get container status \"4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c\": rpc error: code = NotFound desc = could not find container \"4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c\": container with ID starting with 4112f7dcbc148819c820fd2e2d3a386eba7065442ec73e2ea7880fb2f7e6560c not found: ID does not exist" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.159131 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.166892 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-combined-ca-bundle\") pod \"9bbdf164-51e7-4faf-986b-fba5044fad2b\" (UID: \"9bbdf164-51e7-4faf-986b-fba5044fad2b\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.167404 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2xgr\" (UniqueName: \"kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr\") pod \"keystone-3705-account-create-update-c72qw\" (UID: 
\"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.167529 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts\") pod \"keystone-3705-account-create-update-c72qw\" (UID: \"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.167724 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce3508e5-7126-47b0-a598-da6515457cb7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.167798 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs4qz\" (UniqueName: \"kubernetes.io/projected/9bbdf164-51e7-4faf-986b-fba5044fad2b-kube-api-access-hs4qz\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.167856 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkg9d\" (UniqueName: \"kubernetes.io/projected/ce3508e5-7126-47b0-a598-da6515457cb7-kube-api-access-dkg9d\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.167910 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.167935 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.168190 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data podName:96257eac-42ec-44cf-80be-9be68c0ebb1b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:32.168157645 +0000 UTC m=+1493.564211668 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b") : configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.168567 4885 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.168659 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts podName:9e649171-680c-445a-b418-734a5c7322e3 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:28.668648838 +0000 UTC m=+1490.064702861 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts") pod "keystone-3705-account-create-update-c72qw" (UID: "9e649171-680c-445a-b418-734a5c7322e3") : configmap "openstack-scripts" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.172693 4885 projected.go:194] Error preparing data for projected volume kube-api-access-k2xgr for pod openstack/keystone-3705-account-create-update-c72qw: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.172750 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr podName:9e649171-680c-445a-b418-734a5c7322e3 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:28.672733056 +0000 UTC m=+1490.068787079 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-k2xgr" (UniqueName: "kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr") pod "keystone-3705-account-create-update-c72qw" (UID: "9e649171-680c-445a-b418-734a5c7322e3") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.178737 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerName="galera" containerID="cri-o://88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67" gracePeriod=30 Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.204346 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.214823 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.217352 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1f465c_123b_455f_8bd8_720d3f8a4bef.slice/crio-conmon-46513b3771e23d8ed82d3b5bc73c4d07608c21fefaf4830b5055b5a4a5d6d688.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1f465c_123b_455f_8bd8_720d3f8a4bef.slice/crio-c26148668f63c2c808f3994e48705725bbf52e07fae581041f2a8517c972eb19.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90fb4d53_4722_4f72_9f1a_99ee2b637f6e.slice/crio-36ec2d571999d46c471303241aed86c9b5d0e06a8eac136c91582cb662c8d63a\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a1f465c_123b_455f_8bd8_720d3f8a4bef.slice/crio-conmon-a5a0e7af89f0943433efc0423974aa4157ace5b596adabc6170e4373acc330a7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63c3ea8e_9683_45b9_805b_d1049840b0da.slice/crio-72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63c3ea8e_9683_45b9_805b_d1049840b0da.slice/crio-conmon-72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b.scope\": RecentStats: unable to find data in memory cache]" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.253847 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.259154 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bbdf164-51e7-4faf-986b-fba5044fad2b" (UID: "9bbdf164-51e7-4faf-986b-fba5044fad2b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.269829 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-operator-scripts\") pod \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.269896 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2242ad5f-8a7e-4017-8441-6d05b2c94930-operator-scripts\") pod \"2242ad5f-8a7e-4017-8441-6d05b2c94930\" (UID: \"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.270422 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bbdf164-51e7-4faf-986b-fba5044fad2b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.270859 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525" (UID: "d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.270892 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2242ad5f-8a7e-4017-8441-6d05b2c94930-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2242ad5f-8a7e-4017-8441-6d05b2c94930" (UID: "2242ad5f-8a7e-4017-8441-6d05b2c94930"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.340667 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.371211 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfcbl\" (UniqueName: \"kubernetes.io/projected/36678078-1658-4edc-a256-3f0bb8d23ed8-kube-api-access-rfcbl\") pod \"36678078-1658-4edc-a256-3f0bb8d23ed8\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.371589 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhdsh\" (UniqueName: \"kubernetes.io/projected/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-kube-api-access-bhdsh\") pod \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\" (UID: \"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.371710 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36678078-1658-4edc-a256-3f0bb8d23ed8-operator-scripts\") pod \"36678078-1658-4edc-a256-3f0bb8d23ed8\" (UID: \"36678078-1658-4edc-a256-3f0bb8d23ed8\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.371880 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hl27\" (UniqueName: \"kubernetes.io/projected/2242ad5f-8a7e-4017-8441-6d05b2c94930-kube-api-access-9hl27\") pod \"2242ad5f-8a7e-4017-8441-6d05b2c94930\" (UID: \"2242ad5f-8a7e-4017-8441-6d05b2c94930\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.372217 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36678078-1658-4edc-a256-3f0bb8d23ed8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36678078-1658-4edc-a256-3f0bb8d23ed8" (UID: "36678078-1658-4edc-a256-3f0bb8d23ed8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.373109 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.373284 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36678078-1658-4edc-a256-3f0bb8d23ed8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.373366 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2242ad5f-8a7e-4017-8441-6d05b2c94930-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.374935 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-kube-api-access-bhdsh" (OuterVolumeSpecName: "kube-api-access-bhdsh") pod "d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525" (UID: "d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525"). InnerVolumeSpecName "kube-api-access-bhdsh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.375499 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2242ad5f-8a7e-4017-8441-6d05b2c94930-kube-api-access-9hl27" (OuterVolumeSpecName: "kube-api-access-9hl27") pod "2242ad5f-8a7e-4017-8441-6d05b2c94930" (UID: "2242ad5f-8a7e-4017-8441-6d05b2c94930"). InnerVolumeSpecName "kube-api-access-9hl27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.376316 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36678078-1658-4edc-a256-3f0bb8d23ed8-kube-api-access-rfcbl" (OuterVolumeSpecName: "kube-api-access-rfcbl") pod "36678078-1658-4edc-a256-3f0bb8d23ed8" (UID: "36678078-1658-4edc-a256-3f0bb8d23ed8"). InnerVolumeSpecName "kube-api-access-rfcbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.423881 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.424453 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.425005 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.425072 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.425449 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.426986 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.429243 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.429314 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.475031 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-config\") pod \"63c3ea8e-9683-45b9-805b-d1049840b0da\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.475677 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-combined-ca-bundle\") pod \"63c3ea8e-9683-45b9-805b-d1049840b0da\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.475769 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqx6r\" (UniqueName: \"kubernetes.io/projected/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-api-access-bqx6r\") pod \"63c3ea8e-9683-45b9-805b-d1049840b0da\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.476051 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-certs\") pod \"63c3ea8e-9683-45b9-805b-d1049840b0da\" (UID: \"63c3ea8e-9683-45b9-805b-d1049840b0da\") " Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.476706 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhdsh\" (UniqueName: \"kubernetes.io/projected/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525-kube-api-access-bhdsh\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.476726 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hl27\" (UniqueName: \"kubernetes.io/projected/2242ad5f-8a7e-4017-8441-6d05b2c94930-kube-api-access-9hl27\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.476740 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfcbl\" (UniqueName: \"kubernetes.io/projected/36678078-1658-4edc-a256-3f0bb8d23ed8-kube-api-access-rfcbl\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.487824 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-api-access-bqx6r" (OuterVolumeSpecName: "kube-api-access-bqx6r") pod 
"63c3ea8e-9683-45b9-805b-d1049840b0da" (UID: "63c3ea8e-9683-45b9-805b-d1049840b0da"). InnerVolumeSpecName "kube-api-access-bqx6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.507314 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" event={"ID":"2242ad5f-8a7e-4017-8441-6d05b2c94930","Type":"ContainerDied","Data":"82162e312894c7024d5a8960ccd0eb1055e87a8eb4f79ab3a9a1cc489e5a6576"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.507367 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1b03-account-create-update-m2lkt" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.511655 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-a047-account-create-update-qrjwk" event={"ID":"d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525","Type":"ContainerDied","Data":"34abe95868deb87094ecfc507d2e30a4f3b296f2e25ac6896a50280fdc7341bb"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.511778 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-a047-account-create-update-qrjwk" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.515714 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63c3ea8e-9683-45b9-805b-d1049840b0da" (UID: "63c3ea8e-9683-45b9-805b-d1049840b0da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.519302 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-031a-account-create-update-ss6cl" event={"ID":"36678078-1658-4edc-a256-3f0bb8d23ed8","Type":"ContainerDied","Data":"81ac023e6b35f76413c8e20a66666b09686bcf823a6a24f94f4a77215fb4b2b2"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.519314 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-031a-account-create-update-ss6cl" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.527601 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"925797ff-e1b0-4df7-83db-2091264a4bb8","Type":"ContainerDied","Data":"9491886528ee59a2997b30d600c2b1b7132f56a89a9aaa3a02c1e5325fbb4651"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.527649 4885 scope.go:117] "RemoveContainer" containerID="db89449f6f1c83b68b931621e7020697f142f99f61f956dab42082178b512a3d" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.527823 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.531610 4885 generic.go:334] "Generic (PLEG): container finished" podID="9bbdf164-51e7-4faf-986b-fba5044fad2b" containerID="bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6" exitCode=0 Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.531700 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9bbdf164-51e7-4faf-986b-fba5044fad2b","Type":"ContainerDied","Data":"bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.531717 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"9bbdf164-51e7-4faf-986b-fba5044fad2b","Type":"ContainerDied","Data":"c280fe9fe13ec4f9da8c09591afb05298340012fc23ed25819dbc3970dce1bc0"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.531737 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.534147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "63c3ea8e-9683-45b9-805b-d1049840b0da" (UID: "63c3ea8e-9683-45b9-805b-d1049840b0da"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.535288 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-744484b5fc-g6mjz" event={"ID":"60f9821e-e554-4594-bfb2-9521cd3c171a","Type":"ContainerDied","Data":"a9b17f7da8fbb915380a49441fc73dc30251abdd34aa8c926a52efea3b64bcdc"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.535421 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-744484b5fc-g6mjz" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.552523 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-f33b-account-create-update-vbmkj" event={"ID":"ce3508e5-7126-47b0-a598-da6515457cb7","Type":"ContainerDied","Data":"92b9ea85ebe6946aa9aeb6553980bbeaa2d164cb127da6a8cd1b41eb2565b1b7"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.552596 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-f33b-account-create-update-vbmkj" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.556103 4885 generic.go:334] "Generic (PLEG): container finished" podID="63c3ea8e-9683-45b9-805b-d1049840b0da" containerID="72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b" exitCode=2 Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.556229 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.556824 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"63c3ea8e-9683-45b9-805b-d1049840b0da","Type":"ContainerDied","Data":"72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.556860 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"63c3ea8e-9683-45b9-805b-d1049840b0da","Type":"ContainerDied","Data":"c766b62f5cceadf0886905919f92953b9185d1630ad51bf23b383898036558fd"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.561208 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerID="46513b3771e23d8ed82d3b5bc73c4d07608c21fefaf4830b5055b5a4a5d6d688" exitCode=0 Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.561241 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerID="c26148668f63c2c808f3994e48705725bbf52e07fae581041f2a8517c972eb19" exitCode=2 Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.561253 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerID="a5a0e7af89f0943433efc0423974aa4157ace5b596adabc6170e4373acc330a7" exitCode=0 Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.561867 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerDied","Data":"46513b3771e23d8ed82d3b5bc73c4d07608c21fefaf4830b5055b5a4a5d6d688"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.561904 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerDied","Data":"c26148668f63c2c808f3994e48705725bbf52e07fae581041f2a8517c972eb19"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.561959 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerDied","Data":"a5a0e7af89f0943433efc0423974aa4157ace5b596adabc6170e4373acc330a7"} Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.562255 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.563270 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "63c3ea8e-9683-45b9-805b-d1049840b0da" (UID: "63c3ea8e-9683-45b9-805b-d1049840b0da"). InnerVolumeSpecName "kube-state-metrics-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.578659 4885 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.578697 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.578714 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqx6r\" (UniqueName: \"kubernetes.io/projected/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-api-access-bqx6r\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.578741 4885 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c3ea8e-9683-45b9-805b-d1049840b0da-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.634511 4885 scope.go:117] "RemoveContainer" containerID="81b8543a909b03c12110951d0f4dfaca241eaccbf11cf8dd8e3aa4e40b790556" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.645098 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.684017 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2xgr\" (UniqueName: \"kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr\") pod \"keystone-3705-account-create-update-c72qw\" (UID: \"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.684093 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts\") pod \"keystone-3705-account-create-update-c72qw\" (UID: \"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.684447 4885 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.684514 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts podName:9e649171-680c-445a-b418-734a5c7322e3 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:29.684495924 +0000 UTC m=+1491.080549947 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts") pod "keystone-3705-account-create-update-c72qw" (UID: "9e649171-680c-445a-b418-734a5c7322e3") : configmap "openstack-scripts" not found Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.690180 4885 scope.go:117] "RemoveContainer" containerID="bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.690308 4885 projected.go:194] Error preparing data for projected volume kube-api-access-k2xgr for pod openstack/keystone-3705-account-create-update-c72qw: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.690370 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr podName:9e649171-680c-445a-b418-734a5c7322e3 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:29.69035167 +0000 UTC m=+1491.086405693 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-k2xgr" (UniqueName: "kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr") pod "keystone-3705-account-create-update-c72qw" (UID: "9e649171-680c-445a-b418-734a5c7322e3") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.702837 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-m2lkt"] Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.797994 4885 configmap.go:193] Couldn't get configMap openstack/openstack-cell1-scripts: configmap "openstack-cell1-scripts" not found Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.798060 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts podName:09db13b9-d564-49c9-b383-5fbfe0e43c9b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:30.798047845 +0000 UTC m=+1492.194101868 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts") pod "root-account-create-update-bq8dk" (UID: "09db13b9-d564-49c9-b383-5fbfe0e43c9b") : configmap "openstack-cell1-scripts" not found Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.816126 4885 scope.go:117] "RemoveContainer" containerID="bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.816674 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6\": container with ID starting with bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6 not found: ID does not exist" containerID="bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.816719 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6"} err="failed to get container status \"bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6\": rpc error: code = NotFound desc = could not find container \"bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6\": container with ID starting with bedba211c3d7ff77fb4b5d996a6bfb79b36c67d35467c0d4297f81cebd1fdce6 not found: ID does not exist" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.816745 4885 scope.go:117] "RemoveContainer" containerID="52b3a63b99137a90e03814ec54cafbe70d9ceac7f36dc9e81e362bf716f0276b" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.819501 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1b03-account-create-update-m2lkt"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.835705 4885 scope.go:117] "RemoveContainer" containerID="67f1959dc61ea688f5069874c4fb20e1a6cd0f9f33725553c532e43624ce5124" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.841135 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.846955 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.852291 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-744484b5fc-g6mjz"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.859061 4885 scope.go:117] "RemoveContainer" containerID="72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.862237 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-744484b5fc-g6mjz"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.893004 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-f33b-account-create-update-vbmkj"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.898294 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-f33b-account-create-update-vbmkj"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.906843 4885 scope.go:117] "RemoveContainer" containerID="72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b" Mar 08 19:56:28 crc kubenswrapper[4885]: E0308 19:56:28.908797 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b\": container with ID starting with 72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b not found: ID does not exist" containerID="72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.908861 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b"} err="failed to get container status \"72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b\": rpc error: code = NotFound desc = could not find container \"72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b\": container with ID starting with 72784488b6dc97fc7115f8a7cddf96d00da1c214fe5d45c125108db0b751c44b not found: ID does not exist" Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.930319 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-a047-account-create-update-qrjwk"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.939802 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-a047-account-create-update-qrjwk"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.956417 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-031a-account-create-update-ss6cl"] Mar 08 19:56:28 crc kubenswrapper[4885]: I0308 19:56:28.963751 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-031a-account-create-update-ss6cl"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.002008 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.014114 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.014223 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.022618 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.024965 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-58c657b6d6-r4tf7" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-log" probeResult="failure" output="Get \"https://10.217.0.170:8778/\": read tcp 10.217.0.2:60334->10.217.0.170:8778: read: connection reset by peer" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.025190 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/placement-58c657b6d6-r4tf7" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-api" probeResult="failure" output="Get \"https://10.217.0.170:8778/\": read tcp 10.217.0.2:60324->10.217.0.170:8778: read: connection reset by peer" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.028817 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.063185 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.102958 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/619a568c-d0c3-408b-96c1-39a3a769d1ad-operator-scripts\") pod \"619a568c-d0c3-408b-96c1-39a3a769d1ad\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.103041 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqv7f\" (UniqueName: \"kubernetes.io/projected/619a568c-d0c3-408b-96c1-39a3a769d1ad-kube-api-access-hqv7f\") pod \"619a568c-d0c3-408b-96c1-39a3a769d1ad\" (UID: \"619a568c-d0c3-408b-96c1-39a3a769d1ad\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.103246 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gd8sz\" (UniqueName: \"kubernetes.io/projected/09db13b9-d564-49c9-b383-5fbfe0e43c9b-kube-api-access-gd8sz\") pod \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.103354 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts\") pod \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\" (UID: \"09db13b9-d564-49c9-b383-5fbfe0e43c9b\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.103453 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/619a568c-d0c3-408b-96c1-39a3a769d1ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "619a568c-d0c3-408b-96c1-39a3a769d1ad" (UID: "619a568c-d0c3-408b-96c1-39a3a769d1ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.103809 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "09db13b9-d564-49c9-b383-5fbfe0e43c9b" (UID: "09db13b9-d564-49c9-b383-5fbfe0e43c9b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.104102 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/09db13b9-d564-49c9-b383-5fbfe0e43c9b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.104123 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/619a568c-d0c3-408b-96c1-39a3a769d1ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.108683 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09db13b9-d564-49c9-b383-5fbfe0e43c9b-kube-api-access-gd8sz" (OuterVolumeSpecName: "kube-api-access-gd8sz") pod "09db13b9-d564-49c9-b383-5fbfe0e43c9b" (UID: "09db13b9-d564-49c9-b383-5fbfe0e43c9b"). InnerVolumeSpecName "kube-api-access-gd8sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.111135 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/619a568c-d0c3-408b-96c1-39a3a769d1ad-kube-api-access-hqv7f" (OuterVolumeSpecName: "kube-api-access-hqv7f") pod "619a568c-d0c3-408b-96c1-39a3a769d1ad" (UID: "619a568c-d0c3-408b-96c1-39a3a769d1ad"). InnerVolumeSpecName "kube-api-access-hqv7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.176627 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": read tcp 10.217.0.2:53768->10.217.0.210:8775: read: connection reset by peer" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.176653 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": read tcp 10.217.0.2:53760->10.217.0.210:8775: read: connection reset by peer" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.182540 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.191827 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.168:8776/healthcheck\": read tcp 10.217.0.2:57112->10.217.0.168:8776: read: connection reset by peer" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.205739 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-kolla-config\") pod \"da1d62ba-4033-4906-87c1-d673c1ab8637\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.205909 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-combined-ca-bundle\") pod \"da1d62ba-4033-4906-87c1-d673c1ab8637\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.206041 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-memcached-tls-certs\") pod \"da1d62ba-4033-4906-87c1-d673c1ab8637\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.206093 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q554n\" (UniqueName: \"kubernetes.io/projected/da1d62ba-4033-4906-87c1-d673c1ab8637-kube-api-access-q554n\") pod \"da1d62ba-4033-4906-87c1-d673c1ab8637\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.206138 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-config-data\") pod 
\"da1d62ba-4033-4906-87c1-d673c1ab8637\" (UID: \"da1d62ba-4033-4906-87c1-d673c1ab8637\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.206575 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqv7f\" (UniqueName: \"kubernetes.io/projected/619a568c-d0c3-408b-96c1-39a3a769d1ad-kube-api-access-hqv7f\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.206588 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gd8sz\" (UniqueName: \"kubernetes.io/projected/09db13b9-d564-49c9-b383-5fbfe0e43c9b-kube-api-access-gd8sz\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.207166 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-config-data" (OuterVolumeSpecName: "config-data") pod "da1d62ba-4033-4906-87c1-d673c1ab8637" (UID: "da1d62ba-4033-4906-87c1-d673c1ab8637"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.213285 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "da1d62ba-4033-4906-87c1-d673c1ab8637" (UID: "da1d62ba-4033-4906-87c1-d673c1ab8637"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.218515 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da1d62ba-4033-4906-87c1-d673c1ab8637-kube-api-access-q554n" (OuterVolumeSpecName: "kube-api-access-q554n") pod "da1d62ba-4033-4906-87c1-d673c1ab8637" (UID: "da1d62ba-4033-4906-87c1-d673c1ab8637"). InnerVolumeSpecName "kube-api-access-q554n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.241654 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "da1d62ba-4033-4906-87c1-d673c1ab8637" (UID: "da1d62ba-4033-4906-87c1-d673c1ab8637"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.276419 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "da1d62ba-4033-4906-87c1-d673c1ab8637" (UID: "da1d62ba-4033-4906-87c1-d673c1ab8637"). InnerVolumeSpecName "memcached-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.307447 4885 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.307478 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q554n\" (UniqueName: \"kubernetes.io/projected/da1d62ba-4033-4906-87c1-d673c1ab8637-kube-api-access-q554n\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.307488 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.307498 4885 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/da1d62ba-4033-4906-87c1-d673c1ab8637-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.307506 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1d62ba-4033-4906-87c1-d673c1ab8637-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.338518 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.382780 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2242ad5f-8a7e-4017-8441-6d05b2c94930" path="/var/lib/kubelet/pods/2242ad5f-8a7e-4017-8441-6d05b2c94930/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.384308 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="321f89cf-ed1f-4f10-a198-e55c23171363" path="/var/lib/kubelet/pods/321f89cf-ed1f-4f10-a198-e55c23171363/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.385674 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36678078-1658-4edc-a256-3f0bb8d23ed8" path="/var/lib/kubelet/pods/36678078-1658-4edc-a256-3f0bb8d23ed8/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.386067 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43dd77c8-6951-423a-9334-502f66c3d1b5" path="/var/lib/kubelet/pods/43dd77c8-6951-423a-9334-502f66c3d1b5/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.387067 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60f9821e-e554-4594-bfb2-9521cd3c171a" path="/var/lib/kubelet/pods/60f9821e-e554-4594-bfb2-9521cd3c171a/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.387598 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63c3ea8e-9683-45b9-805b-d1049840b0da" path="/var/lib/kubelet/pods/63c3ea8e-9683-45b9-805b-d1049840b0da/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.388390 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b3418f5-a92a-4fe6-b0ea-929b54ecb052" path="/var/lib/kubelet/pods/8b3418f5-a92a-4fe6-b0ea-929b54ecb052/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.390093 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90fb4d53-4722-4f72-9f1a-99ee2b637f6e" 
path="/var/lib/kubelet/pods/90fb4d53-4722-4f72-9f1a-99ee2b637f6e/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.392759 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925797ff-e1b0-4df7-83db-2091264a4bb8" path="/var/lib/kubelet/pods/925797ff-e1b0-4df7-83db-2091264a4bb8/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.393303 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bbdf164-51e7-4faf-986b-fba5044fad2b" path="/var/lib/kubelet/pods/9bbdf164-51e7-4faf-986b-fba5044fad2b/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.394301 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce3508e5-7126-47b0-a598-da6515457cb7" path="/var/lib/kubelet/pods/ce3508e5-7126-47b0-a598-da6515457cb7/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.394672 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525" path="/var/lib/kubelet/pods/d5ba6c20-150e-48ca-ac4a-4a6a8ef1f525/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.395290 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d768ed9e-b089-4308-befc-e3bd6aa68683" path="/var/lib/kubelet/pods/d768ed9e-b089-4308-befc-e3bd6aa68683/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.397268 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7884923-e1d5-4b4d-a285-680bfbe38277" path="/var/lib/kubelet/pods/f7884923-e1d5-4b4d-a285-680bfbe38277/volumes" Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.403222 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.408494 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts\") pod \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.408596 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zhx2\" (UniqueName: \"kubernetes.io/projected/a3e6330b-4e2d-44ca-b9be-d36b2f613571-kube-api-access-7zhx2\") pod \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\" (UID: \"a3e6330b-4e2d-44ca-b9be-d36b2f613571\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.411071 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a3e6330b-4e2d-44ca-b9be-d36b2f613571" (UID: "a3e6330b-4e2d-44ca-b9be-d36b2f613571"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.415032 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3e6330b-4e2d-44ca-b9be-d36b2f613571-kube-api-access-7zhx2" (OuterVolumeSpecName: "kube-api-access-7zhx2") pod "a3e6330b-4e2d-44ca-b9be-d36b2f613571" (UID: "a3e6330b-4e2d-44ca-b9be-d36b2f613571"). InnerVolumeSpecName "kube-api-access-7zhx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.440379 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.445476 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.445544 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="945717bc-405f-4628-934c-66e4500f56f0" containerName="nova-scheduler-scheduler" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.510544 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a3e6330b-4e2d-44ca-b9be-d36b2f613571-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.510572 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zhx2\" (UniqueName: \"kubernetes.io/projected/a3e6330b-4e2d-44ca-b9be-d36b2f613571-kube-api-access-7zhx2\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.620489 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-bq8dk" event={"ID":"09db13b9-d564-49c9-b383-5fbfe0e43c9b","Type":"ContainerDied","Data":"8eb8421d6a99d9718a9f12940cde328117f97ad0e5bd5e92cde85c15eeb1502b"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.620584 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-bq8dk" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.632087 4885 generic.go:334] "Generic (PLEG): container finished" podID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerID="41d8583c3d498141cc2e38d1ed8623082609a86faa3f087e124f2692ac0c8871" exitCode=0 Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.632165 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4ca493a-f707-45c3-b457-1a1053c3dfe5","Type":"ContainerDied","Data":"41d8583c3d498141cc2e38d1ed8623082609a86faa3f087e124f2692ac0c8871"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.650308 4885 generic.go:334] "Generic (PLEG): container finished" podID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerID="1b8b8e4856a24e16b23d4c15ef261857dfcc94531017a1b02728028102e1d5ce" exitCode=0 Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.650395 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58c657b6d6-r4tf7" event={"ID":"719b68df-d1ac-49e5-ac34-dfa3ba33c97f","Type":"ContainerDied","Data":"1b8b8e4856a24e16b23d4c15ef261857dfcc94531017a1b02728028102e1d5ce"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.650419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-58c657b6d6-r4tf7" event={"ID":"719b68df-d1ac-49e5-ac34-dfa3ba33c97f","Type":"ContainerDied","Data":"91a7898b581f4a0b0c09c7d67b2b320f9e2ef08425d7b081856b5b38c0f51cba"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.650431 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91a7898b581f4a0b0c09c7d67b2b320f9e2ef08425d7b081856b5b38c0f51cba" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.661709 4885 generic.go:334] "Generic (PLEG): container finished" podID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerID="2d8347d05b060d48223bdd22690e18395a24601df91142452e20c85edd93a56a" exitCode=0 Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.661788 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cdd926a8-442c-4f63-bb36-3e6a425436c2","Type":"ContainerDied","Data":"2d8347d05b060d48223bdd22690e18395a24601df91142452e20c85edd93a56a"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.663314 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-86ea-account-create-update-zb2lr" event={"ID":"619a568c-d0c3-408b-96c1-39a3a769d1ad","Type":"ContainerDied","Data":"ecad256f4e94f24cfab1fe144e2c009ebd76a6803d21f421f8c4cfc3079aa401"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.663415 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-86ea-account-create-update-zb2lr" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.675734 4885 generic.go:334] "Generic (PLEG): container finished" podID="a083a431-5afc-4289-a5cf-625bc619465e" containerID="4621296620fb47f88fa5eba2c20d8a6e4bdfb8042020e5c5c63fda5c073094aa" exitCode=0 Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.675791 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a083a431-5afc-4289-a5cf-625bc619465e","Type":"ContainerDied","Data":"4621296620fb47f88fa5eba2c20d8a6e4bdfb8042020e5c5c63fda5c073094aa"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.678150 4885 generic.go:334] "Generic (PLEG): container finished" podID="da1d62ba-4033-4906-87c1-d673c1ab8637" containerID="f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb" exitCode=0 Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.678196 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"da1d62ba-4033-4906-87c1-d673c1ab8637","Type":"ContainerDied","Data":"f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.678212 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"da1d62ba-4033-4906-87c1-d673c1ab8637","Type":"ContainerDied","Data":"d5c7811542e264146da5dbb28e9ad294c9b3d2c5ddb970c427caa3521f6cd065"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.678229 4885 scope.go:117] "RemoveContainer" containerID="f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.678313 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.682966 4885 generic.go:334] "Generic (PLEG): container finished" podID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerID="f5e5f2790360729d9c4394c0e85bb4e8ea8164ab35be9023623e79b3a117f852" exitCode=0 Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.683027 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50b429e9-fb10-48ba-b15c-ec25d57e707a","Type":"ContainerDied","Data":"f5e5f2790360729d9c4394c0e85bb4e8ea8164ab35be9023623e79b3a117f852"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.698141 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.699610 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cd3f-account-create-update-ccxvz" event={"ID":"a3e6330b-4e2d-44ca-b9be-d36b2f613571","Type":"ContainerDied","Data":"17603ca27d7ecde56e3c1a978d3f1cf7aefc5c4500d55699231533afe9deacd5"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.711635 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.714111 4885 generic.go:334] "Generic (PLEG): container finished" podID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerID="7305f11ea3e6d044101cca24b88547af01f2e1506724f8e566c2a9df42c34dc6" exitCode=0 Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.714195 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64baa35e-d1c2-48fe-a7a1-d0a4d1485908","Type":"ContainerDied","Data":"7305f11ea3e6d044101cca24b88547af01f2e1506724f8e566c2a9df42c34dc6"} Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.714211 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.714605 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2xgr\" (UniqueName: \"kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr\") pod \"keystone-3705-account-create-update-c72qw\" (UID: \"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.714665 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts\") pod \"keystone-3705-account-create-update-c72qw\" (UID: \"9e649171-680c-445a-b418-734a5c7322e3\") " pod="openstack/keystone-3705-account-create-update-c72qw" Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.715026 4885 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.715083 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts podName:9e649171-680c-445a-b418-734a5c7322e3 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:31.715065117 +0000 UTC m=+1493.111119140 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts") pod "keystone-3705-account-create-update-c72qw" (UID: "9e649171-680c-445a-b418-734a5c7322e3") : configmap "openstack-scripts" not found Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.718161 4885 projected.go:194] Error preparing data for projected volume kube-api-access-k2xgr for pod openstack/keystone-3705-account-create-update-c72qw: failed to fetch token: serviceaccounts "galera-openstack" not found Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.718218 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr podName:9e649171-680c-445a-b418-734a5c7322e3 nodeName:}" failed. No retries permitted until 2026-03-08 19:56:31.718202421 +0000 UTC m=+1493.114256444 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-k2xgr" (UniqueName: "kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr") pod "keystone-3705-account-create-update-c72qw" (UID: "9e649171-680c-445a-b418-734a5c7322e3") : failed to fetch token: serviceaccounts "galera-openstack" not found Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.726318 4885 scope.go:117] "RemoveContainer" containerID="f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb" Mar 08 19:56:29 crc kubenswrapper[4885]: E0308 19:56:29.726831 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb\": container with ID starting with f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb not found: ID does not exist" containerID="f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.726875 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb"} err="failed to get container status \"f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb\": rpc error: code = NotFound desc = could not find container \"f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb\": container with ID starting with f2754f2c8e86a129702f628cd11b6affb50f3d071536a6cb3bba3e1b0b76f4bb not found: ID does not exist" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.728278 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.772907 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.816238 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-logs\") pod \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.816310 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-config-data\") pod \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.816350 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44fnl\" (UniqueName: \"kubernetes.io/projected/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-kube-api-access-44fnl\") pod \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.816517 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-internal-tls-certs\") pod \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.816587 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-combined-ca-bundle\") pod \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.816654 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-public-tls-certs\") pod \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.816675 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-scripts\") pod \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\" (UID: \"719b68df-d1ac-49e5-ac34-dfa3ba33c97f\") " Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.820619 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-scripts" (OuterVolumeSpecName: "scripts") pod "719b68df-d1ac-49e5-ac34-dfa3ba33c97f" (UID: "719b68df-d1ac-49e5-ac34-dfa3ba33c97f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.821035 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-logs" (OuterVolumeSpecName: "logs") pod "719b68df-d1ac-49e5-ac34-dfa3ba33c97f" (UID: "719b68df-d1ac-49e5-ac34-dfa3ba33c97f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.845072 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-ccxvz"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.845111 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-kube-api-access-44fnl" (OuterVolumeSpecName: "kube-api-access-44fnl") pod "719b68df-d1ac-49e5-ac34-dfa3ba33c97f" (UID: "719b68df-d1ac-49e5-ac34-dfa3ba33c97f"). InnerVolumeSpecName "kube-api-access-44fnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.860956 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cd3f-account-create-update-ccxvz"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.900222 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-config-data" (OuterVolumeSpecName: "config-data") pod "719b68df-d1ac-49e5-ac34-dfa3ba33c97f" (UID: "719b68df-d1ac-49e5-ac34-dfa3ba33c97f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.911143 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Mar 08 19:56:29 crc kubenswrapper[4885]: I0308 19:56:29.914087 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943314 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w82z5\" (UniqueName: \"kubernetes.io/projected/a083a431-5afc-4289-a5cf-625bc619465e-kube-api-access-w82z5\") pod \"a083a431-5afc-4289-a5cf-625bc619465e\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943407 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-combined-ca-bundle\") pod \"a083a431-5afc-4289-a5cf-625bc619465e\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943470 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-internal-tls-certs\") pod \"a083a431-5afc-4289-a5cf-625bc619465e\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943500 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-nova-metadata-tls-certs\") pod \"cdd926a8-442c-4f63-bb36-3e6a425436c2\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943550 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-public-tls-certs\") pod \"a083a431-5afc-4289-a5cf-625bc619465e\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943639 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdd926a8-442c-4f63-bb36-3e6a425436c2-logs\") pod \"cdd926a8-442c-4f63-bb36-3e6a425436c2\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943694 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-config-data\") pod \"a083a431-5afc-4289-a5cf-625bc619465e\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943803 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-combined-ca-bundle\") pod \"cdd926a8-442c-4f63-bb36-3e6a425436c2\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943834 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a083a431-5afc-4289-a5cf-625bc619465e-logs\") pod \"a083a431-5afc-4289-a5cf-625bc619465e\" (UID: \"a083a431-5afc-4289-a5cf-625bc619465e\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943901 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-config-data\") pod \"cdd926a8-442c-4f63-bb36-3e6a425436c2\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.943979 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxxc2\" (UniqueName: \"kubernetes.io/projected/cdd926a8-442c-4f63-bb36-3e6a425436c2-kube-api-access-xxxc2\") pod \"cdd926a8-442c-4f63-bb36-3e6a425436c2\" (UID: \"cdd926a8-442c-4f63-bb36-3e6a425436c2\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.944438 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a083a431-5afc-4289-a5cf-625bc619465e-kube-api-access-w82z5" (OuterVolumeSpecName: "kube-api-access-w82z5") pod "a083a431-5afc-4289-a5cf-625bc619465e" (UID: "a083a431-5afc-4289-a5cf-625bc619465e"). InnerVolumeSpecName "kube-api-access-w82z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.944821 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdd926a8-442c-4f63-bb36-3e6a425436c2-logs" (OuterVolumeSpecName: "logs") pod "cdd926a8-442c-4f63-bb36-3e6a425436c2" (UID: "cdd926a8-442c-4f63-bb36-3e6a425436c2"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.944985 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdd926a8-442c-4f63-bb36-3e6a425436c2-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.945015 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.945028 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.945038 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.945047 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44fnl\" (UniqueName: \"kubernetes.io/projected/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-kube-api-access-44fnl\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.945057 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w82z5\" (UniqueName: \"kubernetes.io/projected/a083a431-5afc-4289-a5cf-625bc619465e-kube-api-access-w82z5\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.950077 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a083a431-5afc-4289-a5cf-625bc619465e-logs" (OuterVolumeSpecName: "logs") pod "a083a431-5afc-4289-a5cf-625bc619465e" (UID: "a083a431-5afc-4289-a5cf-625bc619465e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.969839 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-86ea-account-create-update-zb2lr"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.973783 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-86ea-account-create-update-zb2lr"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.976790 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdd926a8-442c-4f63-bb36-3e6a425436c2-kube-api-access-xxxc2" (OuterVolumeSpecName: "kube-api-access-xxxc2") pod "cdd926a8-442c-4f63-bb36-3e6a425436c2" (UID: "cdd926a8-442c-4f63-bb36-3e6a425436c2"). InnerVolumeSpecName "kube-api-access-xxxc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.992648 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-bq8dk"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:29.999535 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "719b68df-d1ac-49e5-ac34-dfa3ba33c97f" (UID: "719b68df-d1ac-49e5-ac34-dfa3ba33c97f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.005243 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.021146 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "719b68df-d1ac-49e5-ac34-dfa3ba33c97f" (UID: "719b68df-d1ac-49e5-ac34-dfa3ba33c97f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.042974 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-bq8dk"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.045752 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9zl6\" (UniqueName: \"kubernetes.io/projected/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-kube-api-access-f9zl6\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.045843 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-combined-ca-bundle\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.045956 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-logs\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.046002 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-public-tls-certs\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.046090 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data-custom\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.046153 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-internal-tls-certs\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.046244 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-etc-machine-id\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.046301 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-scripts\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.046339 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data\") pod \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\" (UID: \"64baa35e-d1c2-48fe-a7a1-d0a4d1485908\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.046764 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.047064 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-logs" (OuterVolumeSpecName: "logs") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.047620 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.047650 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a083a431-5afc-4289-a5cf-625bc619465e-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.047665 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.047677 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxxc2\" (UniqueName: \"kubernetes.io/projected/cdd926a8-442c-4f63-bb36-3e6a425436c2-kube-api-access-xxxc2\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.047693 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.047706 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.055467 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.055498 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-config-data" (OuterVolumeSpecName: "config-data") pod "a083a431-5afc-4289-a5cf-625bc619465e" (UID: "a083a431-5afc-4289-a5cf-625bc619465e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.087290 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3705-account-create-update-c72qw"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.094875 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3705-account-create-update-c72qw"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.110706 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-796cf584f6-dfmcm" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:48972->10.217.0.167:9311: read: connection reset by peer" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.111055 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-796cf584f6-dfmcm" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.167:9311/healthcheck\": read tcp 10.217.0.2:48956->10.217.0.167:9311: read: connection reset by peer" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.116591 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.128628 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-scripts" (OuterVolumeSpecName: "scripts") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.128770 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-kube-api-access-f9zl6" (OuterVolumeSpecName: "kube-api-access-f9zl6") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "kube-api-access-f9zl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.132541 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdd926a8-442c-4f63-bb36-3e6a425436c2" (UID: "cdd926a8-442c-4f63-bb36-3e6a425436c2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.139053 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a083a431-5afc-4289-a5cf-625bc619465e" (UID: "a083a431-5afc-4289-a5cf-625bc619465e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.151429 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-logs\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.151505 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tpp6\" (UniqueName: \"kubernetes.io/projected/e4ca493a-f707-45c3-b457-1a1053c3dfe5-kube-api-access-7tpp6\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.151538 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.151645 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-scripts\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.152071 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-logs" (OuterVolumeSpecName: "logs") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: "e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.152154 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-httpd-run\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.152842 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-internal-tls-certs\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.152944 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-combined-ca-bundle\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.152992 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-config-data\") pod \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\" (UID: \"e4ca493a-f707-45c3-b457-1a1053c3dfe5\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153066 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: 
"e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153398 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153415 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153425 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153434 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9zl6\" (UniqueName: \"kubernetes.io/projected/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-kube-api-access-f9zl6\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153444 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153453 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2xgr\" (UniqueName: \"kubernetes.io/projected/9e649171-680c-445a-b418-734a5c7322e3-kube-api-access-k2xgr\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153461 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e4ca493a-f707-45c3-b457-1a1053c3dfe5-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153470 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e649171-680c-445a-b418-734a5c7322e3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153478 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.153486 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.168131 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: "e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.168294 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4ca493a-f707-45c3-b457-1a1053c3dfe5-kube-api-access-7tpp6" (OuterVolumeSpecName: "kube-api-access-7tpp6") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: "e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "kube-api-access-7tpp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.168448 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-scripts" (OuterVolumeSpecName: "scripts") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: "e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.209285 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.224781 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.229869 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.235972 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a083a431-5afc-4289-a5cf-625bc619465e" (UID: "a083a431-5afc-4289-a5cf-625bc619465e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.236889 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-config-data" (OuterVolumeSpecName: "config-data") pod "cdd926a8-442c-4f63-bb36-3e6a425436c2" (UID: "cdd926a8-442c-4f63-bb36-3e6a425436c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.247669 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a083a431-5afc-4289-a5cf-625bc619465e" (UID: "a083a431-5afc-4289-a5cf-625bc619465e"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.254851 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.254892 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc6jf\" (UniqueName: \"kubernetes.io/projected/93f52f98-0e26-4fc1-a9af-f580531f8550-kube-api-access-jc6jf\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.254950 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-combined-ca-bundle\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.254997 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-generated\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255030 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-kolla-config\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255047 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-scripts\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255096 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255140 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-public-tls-certs\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255166 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-httpd-run\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255184 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-operator-scripts\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " Mar 08 19:56:30 crc 
kubenswrapper[4885]: I0308 19:56:30.255209 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-combined-ca-bundle\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255227 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-logs\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255270 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-config-data\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255286 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-galera-tls-certs\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255313 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-default\") pod \"93f52f98-0e26-4fc1-a9af-f580531f8550\" (UID: \"93f52f98-0e26-4fc1-a9af-f580531f8550\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255350 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzmz5\" (UniqueName: \"kubernetes.io/projected/50b429e9-fb10-48ba-b15c-ec25d57e707a-kube-api-access-hzmz5\") pod \"50b429e9-fb10-48ba-b15c-ec25d57e707a\" (UID: \"50b429e9-fb10-48ba-b15c-ec25d57e707a\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255659 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255677 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255686 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255695 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083a431-5afc-4289-a5cf-625bc619465e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255705 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255714 4885 reconciler_common.go:293] "Volume detached for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.255723 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tpp6\" (UniqueName: \"kubernetes.io/projected/e4ca493a-f707-45c3-b457-1a1053c3dfe5-kube-api-access-7tpp6\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.256567 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.256692 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-logs" (OuterVolumeSpecName: "logs") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.257879 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.258684 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.259951 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.262079 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.271356 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.271539 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93f52f98-0e26-4fc1-a9af-f580531f8550-kube-api-access-jc6jf" (OuterVolumeSpecName: "kube-api-access-jc6jf") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "kube-api-access-jc6jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.272212 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b429e9-fb10-48ba-b15c-ec25d57e707a-kube-api-access-hzmz5" (OuterVolumeSpecName: "kube-api-access-hzmz5") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "kube-api-access-hzmz5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.280522 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-scripts" (OuterVolumeSpecName: "scripts") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.284413 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.285217 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "cdd926a8-442c-4f63-bb36-3e6a425436c2" (UID: "cdd926a8-442c-4f63-bb36-3e6a425436c2"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.291183 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: "e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.291544 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "mysql-db") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.312196 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.314330 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.334115 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "719b68df-d1ac-49e5-ac34-dfa3ba33c97f" (UID: "719b68df-d1ac-49e5-ac34-dfa3ba33c97f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.344975 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: "e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.346436 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357623 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357650 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzmz5\" (UniqueName: \"kubernetes.io/projected/50b429e9-fb10-48ba-b15c-ec25d57e707a-kube-api-access-hzmz5\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357671 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357680 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc6jf\" (UniqueName: \"kubernetes.io/projected/93f52f98-0e26-4fc1-a9af-f580531f8550-kube-api-access-jc6jf\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357688 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357697 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-generated\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357706 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357715 4885 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-kolla-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357722 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357733 4885 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cdd926a8-442c-4f63-bb36-3e6a425436c2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357741 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357753 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357762 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/719b68df-d1ac-49e5-ac34-dfa3ba33c97f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc 
kubenswrapper[4885]: I0308 19:56:30.357770 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357778 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357786 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357794 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357803 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/50b429e9-fb10-48ba-b15c-ec25d57e707a-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.357811 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/93f52f98-0e26-4fc1-a9af-f580531f8550-config-data-default\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.359903 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.369997 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data" (OuterVolumeSpecName: "config-data") pod "64baa35e-d1c2-48fe-a7a1-d0a4d1485908" (UID: "64baa35e-d1c2-48fe-a7a1-d0a4d1485908"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.372034 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-config-data" (OuterVolumeSpecName: "config-data") pod "e4ca493a-f707-45c3-b457-1a1053c3dfe5" (UID: "e4ca493a-f707-45c3-b457-1a1053c3dfe5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.373180 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-config-data" (OuterVolumeSpecName: "config-data") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.374314 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.374630 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "50b429e9-fb10-48ba-b15c-ec25d57e707a" (UID: "50b429e9-fb10-48ba-b15c-ec25d57e707a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.374959 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.387878 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "93f52f98-0e26-4fc1-a9af-f580531f8550" (UID: "93f52f98-0e26-4fc1-a9af-f580531f8550"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.390640 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f1f46cb2-c95d-40f5-9acc-720e094b91bc/ovn-northd/0.log" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.390702 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.459391 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-config\") pod \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.459469 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-northd-tls-certs\") pod \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.459515 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbl7q\" (UniqueName: \"kubernetes.io/projected/f1f46cb2-c95d-40f5-9acc-720e094b91bc-kube-api-access-pbl7q\") pod \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.459617 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-metrics-certs-tls-certs\") pod \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.459775 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-combined-ca-bundle\") pod \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\" (UID: 
\"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.459851 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-rundir\") pod \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.459902 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-scripts\") pod \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\" (UID: \"f1f46cb2-c95d-40f5-9acc-720e094b91bc\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460450 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64baa35e-d1c2-48fe-a7a1-d0a4d1485908-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460483 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460501 4885 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460526 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4ca493a-f707-45c3-b457-1a1053c3dfe5-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460543 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460560 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93f52f98-0e26-4fc1-a9af-f580531f8550-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460576 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.460594 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/50b429e9-fb10-48ba-b15c-ec25d57e707a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.461320 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-scripts" (OuterVolumeSpecName: "scripts") pod "f1f46cb2-c95d-40f5-9acc-720e094b91bc" (UID: "f1f46cb2-c95d-40f5-9acc-720e094b91bc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.462043 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-config" (OuterVolumeSpecName: "config") pod "f1f46cb2-c95d-40f5-9acc-720e094b91bc" (UID: "f1f46cb2-c95d-40f5-9acc-720e094b91bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.465143 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "f1f46cb2-c95d-40f5-9acc-720e094b91bc" (UID: "f1f46cb2-c95d-40f5-9acc-720e094b91bc"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.470346 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f46cb2-c95d-40f5-9acc-720e094b91bc-kube-api-access-pbl7q" (OuterVolumeSpecName: "kube-api-access-pbl7q") pod "f1f46cb2-c95d-40f5-9acc-720e094b91bc" (UID: "f1f46cb2-c95d-40f5-9acc-720e094b91bc"). InnerVolumeSpecName "kube-api-access-pbl7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.521042 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f1f46cb2-c95d-40f5-9acc-720e094b91bc" (UID: "f1f46cb2-c95d-40f5-9acc-720e094b91bc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.528037 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "f1f46cb2-c95d-40f5-9acc-720e094b91bc" (UID: "f1f46cb2-c95d-40f5-9acc-720e094b91bc"). InnerVolumeSpecName "ovn-northd-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.528291 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "f1f46cb2-c95d-40f5-9acc-720e094b91bc" (UID: "f1f46cb2-c95d-40f5-9acc-720e094b91bc"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.562161 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.562203 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-rundir\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.562224 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.562240 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f46cb2-c95d-40f5-9acc-720e094b91bc-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.562298 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.562313 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbl7q\" (UniqueName: \"kubernetes.io/projected/f1f46cb2-c95d-40f5-9acc-720e094b91bc-kube-api-access-pbl7q\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.562326 4885 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f1f46cb2-c95d-40f5-9acc-720e094b91bc-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.650392 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.656618 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.661983 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.723824 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a083a431-5afc-4289-a5cf-625bc619465e","Type":"ContainerDied","Data":"5cda51b7f06b2df67ff66ec569970b5a1ee1ed8790ed67119aa132ee93bae076"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.723868 4885 scope.go:117] "RemoveContainer" containerID="4621296620fb47f88fa5eba2c20d8a6e4bdfb8042020e5c5c63fda5c073094aa" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.723995 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.734543 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"50b429e9-fb10-48ba-b15c-ec25d57e707a","Type":"ContainerDied","Data":"8290e58829785cfd7645e5b7ea06bfd203515f9adadd2b8e8b4383fbc9129293"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.734552 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.734763 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.736703 4885 generic.go:334] "Generic (PLEG): container finished" podID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerID="88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67" exitCode=0 Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.736758 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93f52f98-0e26-4fc1-a9af-f580531f8550","Type":"ContainerDied","Data":"88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.736781 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"93f52f98-0e26-4fc1-a9af-f580531f8550","Type":"ContainerDied","Data":"c562a73365a8a4cec4d84c9a8ed8cc0d8747c679cc7180e594fd7812393beab5"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.736817 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.741434 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e4ca493a-f707-45c3-b457-1a1053c3dfe5","Type":"ContainerDied","Data":"d206b01c706625f3b6d24a81cff0491b35107018670692e53d89b4cfafe0b053"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.741512 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.750460 4885 generic.go:334] "Generic (PLEG): container finished" podID="945717bc-405f-4628-934c-66e4500f56f0" containerID="aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" exitCode=0 Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.750521 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"945717bc-405f-4628-934c-66e4500f56f0","Type":"ContainerDied","Data":"aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.750546 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"945717bc-405f-4628-934c-66e4500f56f0","Type":"ContainerDied","Data":"4a4309458de44ee77bdd68329eb6288f0e4d16b95e6497926e889647bf594a28"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.750597 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.755508 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.755508 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"64baa35e-d1c2-48fe-a7a1-d0a4d1485908","Type":"ContainerDied","Data":"f56864cf75c0bf77d3e8eee5fd2c82834b4c4219c0b0d60077918b5a5fcf0612"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.760810 4885 scope.go:117] "RemoveContainer" containerID="701b0f4ea9f8a0e00b422b3381b168dbb04b81fe4ea92e28f34d36f84301009f" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765334 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13df70e2-1a9e-4d81-b23b-c461291bce93-etc-machine-id\") pod \"13df70e2-1a9e-4d81-b23b-c461291bce93\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765391 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data-custom\") pod \"13df70e2-1a9e-4d81-b23b-c461291bce93\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765411 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-public-tls-certs\") pod \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765436 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data-custom\") pod \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765456 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-combined-ca-bundle\") pod \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765473 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwjwb\" (UniqueName: \"kubernetes.io/projected/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-kube-api-access-nwjwb\") pod \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765522 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-config-data\") pod \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\" (UID: \"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765567 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-combined-ca-bundle\") pod \"13df70e2-1a9e-4d81-b23b-c461291bce93\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765595 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/35e55887-f8af-4c57-820d-c46d0ee9cd9f-logs\") pod \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765619 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-scripts\") pod \"13df70e2-1a9e-4d81-b23b-c461291bce93\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765638 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data\") pod \"13df70e2-1a9e-4d81-b23b-c461291bce93\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765660 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data\") pod \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765683 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-config-data\") pod \"945717bc-405f-4628-934c-66e4500f56f0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765709 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26jxd\" (UniqueName: \"kubernetes.io/projected/35e55887-f8af-4c57-820d-c46d0ee9cd9f-kube-api-access-26jxd\") pod \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765750 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-internal-tls-certs\") pod \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765785 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-combined-ca-bundle\") pod \"945717bc-405f-4628-934c-66e4500f56f0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765806 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt46p\" (UniqueName: \"kubernetes.io/projected/13df70e2-1a9e-4d81-b23b-c461291bce93-kube-api-access-wt46p\") pod \"13df70e2-1a9e-4d81-b23b-c461291bce93\" (UID: \"13df70e2-1a9e-4d81-b23b-c461291bce93\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765826 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkbq5\" (UniqueName: \"kubernetes.io/projected/945717bc-405f-4628-934c-66e4500f56f0-kube-api-access-nkbq5\") pod \"945717bc-405f-4628-934c-66e4500f56f0\" (UID: \"945717bc-405f-4628-934c-66e4500f56f0\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.765842 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-combined-ca-bundle\") pod \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\" (UID: \"35e55887-f8af-4c57-820d-c46d0ee9cd9f\") " Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.768007 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13df70e2-1a9e-4d81-b23b-c461291bce93-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "13df70e2-1a9e-4d81-b23b-c461291bce93" (UID: "13df70e2-1a9e-4d81-b23b-c461291bce93"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.775147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35e55887-f8af-4c57-820d-c46d0ee9cd9f-logs" (OuterVolumeSpecName: "logs") pod "35e55887-f8af-4c57-820d-c46d0ee9cd9f" (UID: "35e55887-f8af-4c57-820d-c46d0ee9cd9f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.776251 4885 generic.go:334] "Generic (PLEG): container finished" podID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerID="fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12" exitCode=0 Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.777462 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13df70e2-1a9e-4d81-b23b-c461291bce93","Type":"ContainerDied","Data":"fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.777498 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13df70e2-1a9e-4d81-b23b-c461291bce93","Type":"ContainerDied","Data":"4d0a6fa6c058e8e3d990a208a072ec9c1c565777e02360b62370fc36d2e37246"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.777966 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.787619 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "35e55887-f8af-4c57-820d-c46d0ee9cd9f" (UID: "35e55887-f8af-4c57-820d-c46d0ee9cd9f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.788392 4885 generic.go:334] "Generic (PLEG): container finished" podID="edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" containerID="9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b" exitCode=0 Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.788445 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197","Type":"ContainerDied","Data":"9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.788493 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"edd9ad85-0e13-4d1f-ab0e-ffd5630c6197","Type":"ContainerDied","Data":"e45bf247b9d279f13a7ad3c2bb090db1b20d8e474638af4ca7545e5e6c5bd1a9"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.788536 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.790940 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.791392 4885 scope.go:117] "RemoveContainer" containerID="f5e5f2790360729d9c4394c0e85bb4e8ea8164ab35be9023623e79b3a117f852" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.793616 4885 generic.go:334] "Generic (PLEG): container finished" podID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerID="786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44" exitCode=0 Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.793658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-796cf584f6-dfmcm" event={"ID":"35e55887-f8af-4c57-820d-c46d0ee9cd9f","Type":"ContainerDied","Data":"786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.793698 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-796cf584f6-dfmcm" event={"ID":"35e55887-f8af-4c57-820d-c46d0ee9cd9f","Type":"ContainerDied","Data":"9666e26b13c4933935ee0abcb40c76da8cace1d3e077db5278af8135676f6e1f"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.793739 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-796cf584f6-dfmcm" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.795184 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35e55887-f8af-4c57-820d-c46d0ee9cd9f-kube-api-access-26jxd" (OuterVolumeSpecName: "kube-api-access-26jxd") pod "35e55887-f8af-4c57-820d-c46d0ee9cd9f" (UID: "35e55887-f8af-4c57-820d-c46d0ee9cd9f"). InnerVolumeSpecName "kube-api-access-26jxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.798196 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_f1f46cb2-c95d-40f5-9acc-720e094b91bc/ovn-northd/0.log" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.798242 4885 generic.go:334] "Generic (PLEG): container finished" podID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerID="e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f" exitCode=139 Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.798279 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f1f46cb2-c95d-40f5-9acc-720e094b91bc","Type":"ContainerDied","Data":"e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.798297 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"f1f46cb2-c95d-40f5-9acc-720e094b91bc","Type":"ContainerDied","Data":"cd1df5e26dfde01021643639b3d30a9000c123fa83c48692684173ba1b046531"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.798368 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.806269 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-58c657b6d6-r4tf7" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.806941 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/945717bc-405f-4628-934c-66e4500f56f0-kube-api-access-nkbq5" (OuterVolumeSpecName: "kube-api-access-nkbq5") pod "945717bc-405f-4628-934c-66e4500f56f0" (UID: "945717bc-405f-4628-934c-66e4500f56f0"). InnerVolumeSpecName "kube-api-access-nkbq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.807060 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.809677 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-kube-api-access-nwjwb" (OuterVolumeSpecName: "kube-api-access-nwjwb") pod "edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" (UID: "edd9ad85-0e13-4d1f-ab0e-ffd5630c6197"). InnerVolumeSpecName "kube-api-access-nwjwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.811963 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cdd926a8-442c-4f63-bb36-3e6a425436c2","Type":"ContainerDied","Data":"94340869c04149bbf12f6aa9cc506894a0395e25d240510a2463c412aa09d8dc"} Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.813701 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13df70e2-1a9e-4d81-b23b-c461291bce93-kube-api-access-wt46p" (OuterVolumeSpecName: "kube-api-access-wt46p") pod "13df70e2-1a9e-4d81-b23b-c461291bce93" (UID: "13df70e2-1a9e-4d81-b23b-c461291bce93"). InnerVolumeSpecName "kube-api-access-wt46p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.816068 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.832360 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-scripts" (OuterVolumeSpecName: "scripts") pod "13df70e2-1a9e-4d81-b23b-c461291bce93" (UID: "13df70e2-1a9e-4d81-b23b-c461291bce93"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.836142 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "13df70e2-1a9e-4d81-b23b-c461291bce93" (UID: "13df70e2-1a9e-4d81-b23b-c461291bce93"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.847965 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.848721 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.855135 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-config-data" (OuterVolumeSpecName: "config-data") pod "945717bc-405f-4628-934c-66e4500f56f0" (UID: "945717bc-405f-4628-934c-66e4500f56f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.862788 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870459 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt46p\" (UniqueName: \"kubernetes.io/projected/13df70e2-1a9e-4d81-b23b-c461291bce93-kube-api-access-wt46p\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870513 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkbq5\" (UniqueName: \"kubernetes.io/projected/945717bc-405f-4628-934c-66e4500f56f0-kube-api-access-nkbq5\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870523 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13df70e2-1a9e-4d81-b23b-c461291bce93-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870531 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870585 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870596 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwjwb\" (UniqueName: \"kubernetes.io/projected/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-kube-api-access-nwjwb\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870604 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/35e55887-f8af-4c57-820d-c46d0ee9cd9f-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870614 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870626 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.870635 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26jxd\" (UniqueName: 
\"kubernetes.io/projected/35e55887-f8af-4c57-820d-c46d0ee9cd9f-kube-api-access-26jxd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.882011 4885 scope.go:117] "RemoveContainer" containerID="2d562399fb223e806d5f3ddff2425b5e427d18a16330c90ac41a561625d41719" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.882449 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.890632 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.891955 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13df70e2-1a9e-4d81-b23b-c461291bce93" (UID: "13df70e2-1a9e-4d81-b23b-c461291bce93"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.903804 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.913019 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" (UID: "edd9ad85-0e13-4d1f-ab0e-ffd5630c6197"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.914492 4885 scope.go:117] "RemoveContainer" containerID="88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.915825 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-config-data" (OuterVolumeSpecName: "config-data") pod "edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" (UID: "edd9ad85-0e13-4d1f-ab0e-ffd5630c6197"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.917461 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.925321 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.927986 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data" (OuterVolumeSpecName: "config-data") pod "35e55887-f8af-4c57-820d-c46d0ee9cd9f" (UID: "35e55887-f8af-4c57-820d-c46d0ee9cd9f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.947362 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35e55887-f8af-4c57-820d-c46d0ee9cd9f" (UID: "35e55887-f8af-4c57-820d-c46d0ee9cd9f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.950131 4885 scope.go:117] "RemoveContainer" containerID="8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.956664 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.956833 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "35e55887-f8af-4c57-820d-c46d0ee9cd9f" (UID: "35e55887-f8af-4c57-820d-c46d0ee9cd9f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.961762 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "945717bc-405f-4628-934c-66e4500f56f0" (UID: "945717bc-405f-4628-934c-66e4500f56f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.962950 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.968497 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-58c657b6d6-r4tf7"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.969413 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "35e55887-f8af-4c57-820d-c46d0ee9cd9f" (UID: "35e55887-f8af-4c57-820d-c46d0ee9cd9f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972599 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/945717bc-405f-4628-934c-66e4500f56f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972696 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972751 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972802 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972852 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972932 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972992 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.973043 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/35e55887-f8af-4c57-820d-c46d0ee9cd9f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.972973 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-58c657b6d6-r4tf7"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.978979 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.982298 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.982540 4885 scope.go:117] "RemoveContainer" containerID="88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67" Mar 08 19:56:30 crc kubenswrapper[4885]: E0308 19:56:30.983185 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67\": container with ID starting with 88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67 not found: ID does not exist" containerID="88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.983231 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67"} err="failed to 
get container status \"88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67\": rpc error: code = NotFound desc = could not find container \"88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67\": container with ID starting with 88458e10cbb3199ac29ae11500e150615eb095f2fc665b65783ff6be2f409d67 not found: ID does not exist" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.983280 4885 scope.go:117] "RemoveContainer" containerID="8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2" Mar 08 19:56:30 crc kubenswrapper[4885]: E0308 19:56:30.983639 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2\": container with ID starting with 8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2 not found: ID does not exist" containerID="8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.983674 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2"} err="failed to get container status \"8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2\": rpc error: code = NotFound desc = could not find container \"8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2\": container with ID starting with 8eb4553c393a99dffc90978d04d1f75b2634315f1e6754c8078cca25e6ba2fc2 not found: ID does not exist" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.983699 4885 scope.go:117] "RemoveContainer" containerID="41d8583c3d498141cc2e38d1ed8623082609a86faa3f087e124f2692ac0c8871" Mar 08 19:56:30 crc kubenswrapper[4885]: I0308 19:56:30.992826 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data" (OuterVolumeSpecName: "config-data") pod "13df70e2-1a9e-4d81-b23b-c461291bce93" (UID: "13df70e2-1a9e-4d81-b23b-c461291bce93"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.001613 4885 scope.go:117] "RemoveContainer" containerID="4e4c7d9e4404bd1a5433f1787f2f7abf1d5d2e0fd51aebb6079e0aa7c48cd16e" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.022119 4885 scope.go:117] "RemoveContainer" containerID="aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.038095 4885 scope.go:117] "RemoveContainer" containerID="aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.038645 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0\": container with ID starting with aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0 not found: ID does not exist" containerID="aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.038688 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0"} err="failed to get container status \"aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0\": rpc error: code = NotFound desc = could not find container \"aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0\": container with ID starting with aba116144a07a71dd50ff6e22f1dd8b24fa4a98ea3987fc5836d1fd161c96ed0 not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.038721 4885 scope.go:117] "RemoveContainer" containerID="7305f11ea3e6d044101cca24b88547af01f2e1506724f8e566c2a9df42c34dc6" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.059027 4885 scope.go:117] "RemoveContainer" containerID="48e1f046f7d97f16af118173fbff33a7753d9f3ef98b111d3153850bfbfdaf65" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.074087 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13df70e2-1a9e-4d81-b23b-c461291bce93-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.091469 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.097688 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.099632 4885 scope.go:117] "RemoveContainer" containerID="ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.119962 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.137228 4885 scope.go:117] "RemoveContainer" containerID="fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.140122 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.158433 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-796cf584f6-dfmcm"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.168377 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/barbican-api-796cf584f6-dfmcm"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.173639 4885 scope.go:117] "RemoveContainer" containerID="ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.174214 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5\": container with ID starting with ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5 not found: ID does not exist" containerID="ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.174333 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5"} err="failed to get container status \"ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5\": rpc error: code = NotFound desc = could not find container \"ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5\": container with ID starting with ed1c18ca383811d8bf64bcb674178f35313aa3c987f5aeb1f4a91848d0e5abc5 not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.175149 4885 scope.go:117] "RemoveContainer" containerID="fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.175303 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.176243 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12\": container with ID starting with fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12 not found: ID does not exist" containerID="fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.176275 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12"} err="failed to get container status \"fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12\": rpc error: code = NotFound desc = could not find container \"fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12\": container with ID starting with fa534007f4921fe70ebe1ef71240b613d860530254cdfc1c1ca63e88f53c3b12 not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.176295 4885 scope.go:117] "RemoveContainer" containerID="9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.182229 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.215996 4885 scope.go:117] "RemoveContainer" containerID="9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.220032 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b\": container with ID starting with 9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b 
not found: ID does not exist" containerID="9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.220069 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b"} err="failed to get container status \"9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b\": rpc error: code = NotFound desc = could not find container \"9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b\": container with ID starting with 9b356a8f70f480264c150680fe3c186edbdd52892c93bc98e33f3c90efaf3f5b not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.220091 4885 scope.go:117] "RemoveContainer" containerID="786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.245010 4885 scope.go:117] "RemoveContainer" containerID="d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.261815 4885 scope.go:117] "RemoveContainer" containerID="786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.262289 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44\": container with ID starting with 786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44 not found: ID does not exist" containerID="786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.262336 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44"} err="failed to get container status \"786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44\": rpc error: code = NotFound desc = could not find container \"786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44\": container with ID starting with 786099af458066385c26bdb67e73a819e95e9a9562680a26053205e478a68d44 not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.262368 4885 scope.go:117] "RemoveContainer" containerID="d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.263392 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b\": container with ID starting with d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b not found: ID does not exist" containerID="d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.263421 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b"} err="failed to get container status \"d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b\": rpc error: code = NotFound desc = could not find container \"d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b\": container with ID starting with d065aad28e1d89c0f634a00cca34fcaf89e3c39b43a7c9e6a3bb5938b353e14b not found: ID does not exist" Mar 08 19:56:31 crc 
kubenswrapper[4885]: I0308 19:56:31.263443 4885 scope.go:117] "RemoveContainer" containerID="619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.290719 4885 scope.go:117] "RemoveContainer" containerID="e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.308365 4885 scope.go:117] "RemoveContainer" containerID="619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.309276 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0\": container with ID starting with 619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0 not found: ID does not exist" containerID="619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.309313 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0"} err="failed to get container status \"619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0\": rpc error: code = NotFound desc = could not find container \"619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0\": container with ID starting with 619e89a099a1ff6de25d192322777011897065ffdfa58efb0ad8b5c45f795aa0 not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.309338 4885 scope.go:117] "RemoveContainer" containerID="e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.309630 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f\": container with ID starting with e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f not found: ID does not exist" containerID="e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.309651 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f"} err="failed to get container status \"e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f\": rpc error: code = NotFound desc = could not find container \"e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f\": container with ID starting with e80a02a4973f3faab5efc651d84813fb3a847f046e4a3709d31b749195eace7f not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.309667 4885 scope.go:117] "RemoveContainer" containerID="2d8347d05b060d48223bdd22690e18395a24601df91142452e20c85edd93a56a" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.353530 4885 scope.go:117] "RemoveContainer" containerID="8eea35d6899ebbfa973a2d7d5bbaae841e42e4044906216a557300884b93a37e" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.382185 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09db13b9-d564-49c9-b383-5fbfe0e43c9b" path="/var/lib/kubelet/pods/09db13b9-d564-49c9-b383-5fbfe0e43c9b/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.382381 4885 configmap.go:193] Couldn't get configMap 
openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.382598 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" path="/var/lib/kubelet/pods/13df70e2-1a9e-4d81-b23b-c461291bce93/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.382720 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data podName:01dc1fd5-4e2f-4129-9452-ed50fa1d182b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:39.382536627 +0000 UTC m=+1500.778590640 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data") pod "rabbitmq-server-0" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b") : configmap "rabbitmq-config-data" not found Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.384382 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" path="/var/lib/kubelet/pods/35e55887-f8af-4c57-820d-c46d0ee9cd9f/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.386519 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" path="/var/lib/kubelet/pods/50b429e9-fb10-48ba-b15c-ec25d57e707a/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.387846 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="619a568c-d0c3-408b-96c1-39a3a769d1ad" path="/var/lib/kubelet/pods/619a568c-d0c3-408b-96c1-39a3a769d1ad/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.388306 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" path="/var/lib/kubelet/pods/64baa35e-d1c2-48fe-a7a1-d0a4d1485908/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.389423 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" path="/var/lib/kubelet/pods/719b68df-d1ac-49e5-ac34-dfa3ba33c97f/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.390903 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93f52f98-0e26-4fc1-a9af-f580531f8550" path="/var/lib/kubelet/pods/93f52f98-0e26-4fc1-a9af-f580531f8550/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.391428 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="945717bc-405f-4628-934c-66e4500f56f0" path="/var/lib/kubelet/pods/945717bc-405f-4628-934c-66e4500f56f0/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.392379 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e649171-680c-445a-b418-734a5c7322e3" path="/var/lib/kubelet/pods/9e649171-680c-445a-b418-734a5c7322e3/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.392709 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a083a431-5afc-4289-a5cf-625bc619465e" path="/var/lib/kubelet/pods/a083a431-5afc-4289-a5cf-625bc619465e/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.393427 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3e6330b-4e2d-44ca-b9be-d36b2f613571" path="/var/lib/kubelet/pods/a3e6330b-4e2d-44ca-b9be-d36b2f613571/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.393948 4885 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" path="/var/lib/kubelet/pods/cdd926a8-442c-4f63-bb36-3e6a425436c2/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.395063 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da1d62ba-4033-4906-87c1-d673c1ab8637" path="/var/lib/kubelet/pods/da1d62ba-4033-4906-87c1-d673c1ab8637/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.395634 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" path="/var/lib/kubelet/pods/e4ca493a-f707-45c3-b457-1a1053c3dfe5/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.396346 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" path="/var/lib/kubelet/pods/edd9ad85-0e13-4d1f-ab0e-ffd5630c6197/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.397643 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" path="/var/lib/kubelet/pods/f1f46cb2-c95d-40f5-9acc-720e094b91bc/volumes" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.420862 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483577 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q2mg\" (UniqueName: \"kubernetes.io/projected/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-kube-api-access-2q2mg\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483622 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-public-tls-certs\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483672 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-combined-ca-bundle\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483753 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-fernet-keys\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483824 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-scripts\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483851 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-credential-keys\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483944 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-config-data\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.483964 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-internal-tls-certs\") pod \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\" (UID: \"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc\") " Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.488296 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-kube-api-access-2q2mg" (OuterVolumeSpecName: "kube-api-access-2q2mg") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "kube-api-access-2q2mg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.490021 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-scripts" (OuterVolumeSpecName: "scripts") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.493318 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.495033 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.517533 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-config-data" (OuterVolumeSpecName: "config-data") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.522222 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.539089 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.542540 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" (UID: "1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.585958 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.585993 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.586004 4885 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.586014 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.586023 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.586031 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q2mg\" (UniqueName: \"kubernetes.io/projected/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-kube-api-access-2q2mg\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.586040 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.586050 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.830271 4885 generic.go:334] "Generic (PLEG): container finished" podID="1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" containerID="0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421" exitCode=0 Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.830385 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-574d5c476f-sq4hm" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.830398 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-574d5c476f-sq4hm" event={"ID":"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc","Type":"ContainerDied","Data":"0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421"} Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.830451 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-574d5c476f-sq4hm" event={"ID":"1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc","Type":"ContainerDied","Data":"bb3823d25975ffc500551905492324189bd2643724049b62ce6f52a7469d0c61"} Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.830478 4885 scope.go:117] "RemoveContainer" containerID="0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.880065 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-574d5c476f-sq4hm"] Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.882257 4885 scope.go:117] "RemoveContainer" containerID="0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421" Mar 08 19:56:31 crc kubenswrapper[4885]: E0308 19:56:31.882830 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421\": container with ID starting with 0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421 not found: ID does not exist" containerID="0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.882885 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421"} err="failed to get container status \"0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421\": rpc error: code = NotFound desc = could not find container \"0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421\": container with ID starting with 0e96ae70d60daef0c189643b6506ee568712e82e7d68dd191d6ad6edd73ac421 not found: ID does not exist" Mar 08 19:56:31 crc kubenswrapper[4885]: I0308 19:56:31.883748 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-574d5c476f-sq4hm"] Mar 08 19:56:32 crc kubenswrapper[4885]: E0308 19:56:32.195658 4885 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:32 crc kubenswrapper[4885]: E0308 19:56:32.195791 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data podName:96257eac-42ec-44cf-80be-9be68c0ebb1b nodeName:}" failed. No retries permitted until 2026-03-08 19:56:40.195761657 +0000 UTC m=+1501.591815710 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data") pod "rabbitmq-cell1-server-0" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b") : configmap "rabbitmq-cell1-config-data" not found Mar 08 19:56:32 crc kubenswrapper[4885]: I0308 19:56:32.819791 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:56:32 crc kubenswrapper[4885]: I0308 19:56:32.819844 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:56:32 crc kubenswrapper[4885]: I0308 19:56:32.851525 4885 generic.go:334] "Generic (PLEG): container finished" podID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerID="ab00747eae0e5726409cc3faafb18065815833a98680950d3e1962529cb0f73d" exitCode=0 Mar 08 19:56:32 crc kubenswrapper[4885]: I0308 19:56:32.851592 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01dc1fd5-4e2f-4129-9452-ed50fa1d182b","Type":"ContainerDied","Data":"ab00747eae0e5726409cc3faafb18065815833a98680950d3e1962529cb0f73d"} Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.014568 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111109 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjj2h\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-kube-api-access-wjj2h\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111529 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111558 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-plugins-conf\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111582 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-confd\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111648 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-pod-info\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 
19:56:33.111690 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-server-conf\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111706 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-tls\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111749 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-plugins\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111779 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-erlang-cookie\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111801 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.111822 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-erlang-cookie-secret\") pod \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\" (UID: \"01dc1fd5-4e2f-4129-9452-ed50fa1d182b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.113470 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.113831 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.116500 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-pod-info" (OuterVolumeSpecName: "pod-info") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.117022 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.117282 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.132201 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-kube-api-access-wjj2h" (OuterVolumeSpecName: "kube-api-access-wjj2h") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "kube-api-access-wjj2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.136307 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.140187 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.155061 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data" (OuterVolumeSpecName: "config-data") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.164303 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-server-conf" (OuterVolumeSpecName: "server-conf") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.200518 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "01dc1fd5-4e2f-4129-9452-ed50fa1d182b" (UID: "01dc1fd5-4e2f-4129-9452-ed50fa1d182b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213352 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213400 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213422 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213442 4885 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213461 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjj2h\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-kube-api-access-wjj2h\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213509 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213527 4885 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213548 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213566 4885 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-pod-info\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213582 4885 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-server-conf\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.213598 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/01dc1fd5-4e2f-4129-9452-ed50fa1d182b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.221021 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.241942 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.314355 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-plugins-conf\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.314546 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-confd\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.314777 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-server-conf\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.314984 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315117 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315236 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-erlang-cookie\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315293 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-tls\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315325 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9bwn\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-kube-api-access-h9bwn\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315361 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96257eac-42ec-44cf-80be-9be68c0ebb1b-pod-info\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315404 4885 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-plugins\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315475 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96257eac-42ec-44cf-80be-9be68c0ebb1b-erlang-cookie-secret\") pod \"96257eac-42ec-44cf-80be-9be68c0ebb1b\" (UID: \"96257eac-42ec-44cf-80be-9be68c0ebb1b\") " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.315221 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.316021 4885 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.316054 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.316208 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.316948 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.317572 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.318048 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.319315 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96257eac-42ec-44cf-80be-9be68c0ebb1b-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.319693 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/96257eac-42ec-44cf-80be-9be68c0ebb1b-pod-info" (OuterVolumeSpecName: "pod-info") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.319869 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-kube-api-access-h9bwn" (OuterVolumeSpecName: "kube-api-access-h9bwn") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "kube-api-access-h9bwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.334786 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data" (OuterVolumeSpecName: "config-data") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.359202 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-server-conf" (OuterVolumeSpecName: "server-conf") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.376328 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "96257eac-42ec-44cf-80be-9be68c0ebb1b" (UID: "96257eac-42ec-44cf-80be-9be68c0ebb1b"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.380408 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" path="/var/lib/kubelet/pods/1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc/volumes" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418213 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418250 4885 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-server-conf\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418263 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96257eac-42ec-44cf-80be-9be68c0ebb1b-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418679 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418697 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418713 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418750 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9bwn\" (UniqueName: \"kubernetes.io/projected/96257eac-42ec-44cf-80be-9be68c0ebb1b-kube-api-access-h9bwn\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.418716 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418763 4885 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96257eac-42ec-44cf-80be-9be68c0ebb1b-pod-info\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418893 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96257eac-42ec-44cf-80be-9be68c0ebb1b-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.418967 4885 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96257eac-42ec-44cf-80be-9be68c0ebb1b-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.419141 4885 log.go:32] "ExecSync 
cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.419623 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.419691 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.420245 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.421519 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.423666 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.423698 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.441745 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.528706 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.885944 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a083cf5-4ca2-440c-840a-6b159151609f" 
containerID="b8d53aa1399bba98dc12433735d0a8b3cb69b3036f3c8fb648dbc900fdb658b2" exitCode=0 Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.886022 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" event={"ID":"2a083cf5-4ca2-440c-840a-6b159151609f","Type":"ContainerDied","Data":"b8d53aa1399bba98dc12433735d0a8b3cb69b3036f3c8fb648dbc900fdb658b2"} Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.887509 4885 generic.go:334] "Generic (PLEG): container finished" podID="a7268474-e124-4139-bf24-6b3f605b9511" containerID="6e7be97046549290741b9a7850306bb8d9be298e24617283ccb5d04dda12497f" exitCode=0 Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.887577 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" event={"ID":"a7268474-e124-4139-bf24-6b3f605b9511","Type":"ContainerDied","Data":"6e7be97046549290741b9a7850306bb8d9be298e24617283ccb5d04dda12497f"} Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.889626 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerID="7e93f87815197e303bd6f0ad768ac092887798b010a49cf9460b37861d1fc6db" exitCode=0 Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.889694 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerDied","Data":"7e93f87815197e303bd6f0ad768ac092887798b010a49cf9460b37861d1fc6db"} Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.891150 4885 generic.go:334] "Generic (PLEG): container finished" podID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerID="c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e" exitCode=0 Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.891211 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"96257eac-42ec-44cf-80be-9be68c0ebb1b","Type":"ContainerDied","Data":"c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e"} Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.891243 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"96257eac-42ec-44cf-80be-9be68c0ebb1b","Type":"ContainerDied","Data":"654fe72412f8a73fefea3c7f4b820f3cc3985166e76d795cc9ca36d6cf741354"} Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.891266 4885 scope.go:117] "RemoveContainer" containerID="c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.891288 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.894085 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"01dc1fd5-4e2f-4129-9452-ed50fa1d182b","Type":"ContainerDied","Data":"8b6317b734453e8868ed75e3450ff9740fcef0ac699a30e38e0adad2d77d26bc"} Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.894176 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.922730 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.928817 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.941201 4885 scope.go:117] "RemoveContainer" containerID="67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.946122 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.959480 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.980125 4885 scope.go:117] "RemoveContainer" containerID="c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e" Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.980968 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e\": container with ID starting with c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e not found: ID does not exist" containerID="c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.981013 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e"} err="failed to get container status \"c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e\": rpc error: code = NotFound desc = could not find container \"c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e\": container with ID starting with c0147db28ccfd2f8e82db9e660773ae1aa721c9ab9fb978304fac892d1d4849e not found: ID does not exist" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.981041 4885 scope.go:117] "RemoveContainer" containerID="67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0" Mar 08 19:56:33 crc kubenswrapper[4885]: E0308 19:56:33.984196 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0\": container with ID starting with 67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0 not found: ID does not exist" containerID="67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.984230 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0"} err="failed to get container status \"67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0\": rpc error: code = NotFound desc = could not find container \"67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0\": container with ID starting with 67f9e5e39ed5b688e9c646ef792f61f055e49c49dbf6d5d0bf7544f7f93563d0 not found: ID does not exist" Mar 08 19:56:33 crc kubenswrapper[4885]: I0308 19:56:33.984252 4885 scope.go:117] "RemoveContainer" containerID="ab00747eae0e5726409cc3faafb18065815833a98680950d3e1962529cb0f73d" Mar 08 19:56:34 crc 
kubenswrapper[4885]: I0308 19:56:34.015138 4885 scope.go:117] "RemoveContainer" containerID="f7d40d12aee399534fa9d02af86ea25978b99ea1398acccdac988f16615d42dd" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.090442 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.135968 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n89rb\" (UniqueName: \"kubernetes.io/projected/6a1f465c-123b-455f-8bd8-720d3f8a4bef-kube-api-access-n89rb\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.136015 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-ceilometer-tls-certs\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.136035 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-combined-ca-bundle\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.136685 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-run-httpd\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.136827 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-config-data\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.136854 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-sg-core-conf-yaml\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.136894 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-log-httpd\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.136986 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-scripts\") pod \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\" (UID: \"6a1f465c-123b-455f-8bd8-720d3f8a4bef\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.137104 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.137474 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.141870 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-scripts" (OuterVolumeSpecName: "scripts") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.142197 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.146445 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a1f465c-123b-455f-8bd8-720d3f8a4bef-kube-api-access-n89rb" (OuterVolumeSpecName: "kube-api-access-n89rb") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "kube-api-access-n89rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.155025 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.162370 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.166532 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.211130 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.211402 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.232116 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-config-data" (OuterVolumeSpecName: "config-data") pod "6a1f465c-123b-455f-8bd8-720d3f8a4bef" (UID: "6a1f465c-123b-455f-8bd8-720d3f8a4bef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238143 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7268474-e124-4139-bf24-6b3f605b9511-logs\") pod \"a7268474-e124-4139-bf24-6b3f605b9511\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238189 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-combined-ca-bundle\") pod \"2a083cf5-4ca2-440c-840a-6b159151609f\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238222 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td4rc\" (UniqueName: \"kubernetes.io/projected/a7268474-e124-4139-bf24-6b3f605b9511-kube-api-access-td4rc\") pod \"a7268474-e124-4139-bf24-6b3f605b9511\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238255 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfdxd\" (UniqueName: \"kubernetes.io/projected/2a083cf5-4ca2-440c-840a-6b159151609f-kube-api-access-gfdxd\") pod \"2a083cf5-4ca2-440c-840a-6b159151609f\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238305 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-combined-ca-bundle\") pod \"a7268474-e124-4139-bf24-6b3f605b9511\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238332 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data-custom\") pod \"a7268474-e124-4139-bf24-6b3f605b9511\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238350 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a083cf5-4ca2-440c-840a-6b159151609f-logs\") pod \"2a083cf5-4ca2-440c-840a-6b159151609f\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238385 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data\") pod \"a7268474-e124-4139-bf24-6b3f605b9511\" (UID: \"a7268474-e124-4139-bf24-6b3f605b9511\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238420 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data-custom\") pod \"2a083cf5-4ca2-440c-840a-6b159151609f\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238453 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data\") pod \"2a083cf5-4ca2-440c-840a-6b159151609f\" (UID: \"2a083cf5-4ca2-440c-840a-6b159151609f\") " Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.238753 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7268474-e124-4139-bf24-6b3f605b9511-logs" (OuterVolumeSpecName: "logs") pod "a7268474-e124-4139-bf24-6b3f605b9511" (UID: "a7268474-e124-4139-bf24-6b3f605b9511"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239024 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a083cf5-4ca2-440c-840a-6b159151609f-logs" (OuterVolumeSpecName: "logs") pod "2a083cf5-4ca2-440c-840a-6b159151609f" (UID: "2a083cf5-4ca2-440c-840a-6b159151609f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239683 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239701 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239711 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6a1f465c-123b-455f-8bd8-720d3f8a4bef-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239738 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2a083cf5-4ca2-440c-840a-6b159151609f-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239748 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239757 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n89rb\" (UniqueName: \"kubernetes.io/projected/6a1f465c-123b-455f-8bd8-720d3f8a4bef-kube-api-access-n89rb\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239765 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239774 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a1f465c-123b-455f-8bd8-720d3f8a4bef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.239782 4885 
reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7268474-e124-4139-bf24-6b3f605b9511-logs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.240835 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a083cf5-4ca2-440c-840a-6b159151609f-kube-api-access-gfdxd" (OuterVolumeSpecName: "kube-api-access-gfdxd") pod "2a083cf5-4ca2-440c-840a-6b159151609f" (UID: "2a083cf5-4ca2-440c-840a-6b159151609f"). InnerVolumeSpecName "kube-api-access-gfdxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.241892 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7268474-e124-4139-bf24-6b3f605b9511-kube-api-access-td4rc" (OuterVolumeSpecName: "kube-api-access-td4rc") pod "a7268474-e124-4139-bf24-6b3f605b9511" (UID: "a7268474-e124-4139-bf24-6b3f605b9511"). InnerVolumeSpecName "kube-api-access-td4rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.243696 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2a083cf5-4ca2-440c-840a-6b159151609f" (UID: "2a083cf5-4ca2-440c-840a-6b159151609f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.245942 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a7268474-e124-4139-bf24-6b3f605b9511" (UID: "a7268474-e124-4139-bf24-6b3f605b9511"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.254788 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a083cf5-4ca2-440c-840a-6b159151609f" (UID: "2a083cf5-4ca2-440c-840a-6b159151609f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.259378 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7268474-e124-4139-bf24-6b3f605b9511" (UID: "a7268474-e124-4139-bf24-6b3f605b9511"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.272260 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data" (OuterVolumeSpecName: "config-data") pod "2a083cf5-4ca2-440c-840a-6b159151609f" (UID: "2a083cf5-4ca2-440c-840a-6b159151609f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.276292 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data" (OuterVolumeSpecName: "config-data") pod "a7268474-e124-4139-bf24-6b3f605b9511" (UID: "a7268474-e124-4139-bf24-6b3f605b9511"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341303 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfdxd\" (UniqueName: \"kubernetes.io/projected/2a083cf5-4ca2-440c-840a-6b159151609f-kube-api-access-gfdxd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341346 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341364 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341381 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7268474-e124-4139-bf24-6b3f605b9511-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341398 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341415 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341433 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a083cf5-4ca2-440c-840a-6b159151609f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.341451 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td4rc\" (UniqueName: \"kubernetes.io/projected/a7268474-e124-4139-bf24-6b3f605b9511-kube-api-access-td4rc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.915083 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" event={"ID":"2a083cf5-4ca2-440c-840a-6b159151609f","Type":"ContainerDied","Data":"7e0b4a7b5579c233c2e49f4395b1b83ad7591cab769b791a33fa19e09b808340"} Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.915175 4885 scope.go:117] "RemoveContainer" containerID="b8d53aa1399bba98dc12433735d0a8b3cb69b3036f3c8fb648dbc900fdb658b2" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.915108 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7dfc6b7fcc-dpq7t" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.922074 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" event={"ID":"a7268474-e124-4139-bf24-6b3f605b9511","Type":"ContainerDied","Data":"7d126567b856b73925e9e50b783a515a23fdff84d4ca27fd2089e38d86b58980"} Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.922218 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5b88496c9d-2g95h" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.927846 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6a1f465c-123b-455f-8bd8-720d3f8a4bef","Type":"ContainerDied","Data":"a1d4f26c989a88dcb6b8292fefbf9776a8ae2f04c4981fe1a6564019613a69ba"} Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.928067 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.959651 4885 scope.go:117] "RemoveContainer" containerID="d5cd5c3527dc17515d5a33bed3c5118e0fcbd6d15187bcfb409883f29afc80a6" Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.983025 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-7dfc6b7fcc-dpq7t"] Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.988820 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-7dfc6b7fcc-dpq7t"] Mar 08 19:56:34 crc kubenswrapper[4885]: I0308 19:56:34.997180 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.005742 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.012144 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-5b88496c9d-2g95h"] Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.017573 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-5b88496c9d-2g95h"] Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.023621 4885 scope.go:117] "RemoveContainer" containerID="6e7be97046549290741b9a7850306bb8d9be298e24617283ccb5d04dda12497f" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.045703 4885 scope.go:117] "RemoveContainer" containerID="3a20cf21bbfb4da8c71131e4075d64b83bae96d5c5020bc3cfadcf8d7226f8bc" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.073064 4885 scope.go:117] "RemoveContainer" containerID="46513b3771e23d8ed82d3b5bc73c4d07608c21fefaf4830b5055b5a4a5d6d688" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.094599 4885 scope.go:117] "RemoveContainer" containerID="c26148668f63c2c808f3994e48705725bbf52e07fae581041f2a8517c972eb19" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.116902 4885 scope.go:117] "RemoveContainer" containerID="7e93f87815197e303bd6f0ad768ac092887798b010a49cf9460b37861d1fc6db" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.136149 4885 scope.go:117] "RemoveContainer" containerID="a5a0e7af89f0943433efc0423974aa4157ace5b596adabc6170e4373acc330a7" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.386414 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" 
path="/var/lib/kubelet/pods/01dc1fd5-4e2f-4129-9452-ed50fa1d182b/volumes" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.387153 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" path="/var/lib/kubelet/pods/2a083cf5-4ca2-440c-840a-6b159151609f/volumes" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.392512 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" path="/var/lib/kubelet/pods/6a1f465c-123b-455f-8bd8-720d3f8a4bef/volumes" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.395617 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96257eac-42ec-44cf-80be-9be68c0ebb1b" path="/var/lib/kubelet/pods/96257eac-42ec-44cf-80be-9be68c0ebb1b/volumes" Mar 08 19:56:35 crc kubenswrapper[4885]: I0308 19:56:35.397298 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7268474-e124-4139-bf24-6b3f605b9511" path="/var/lib/kubelet/pods/a7268474-e124-4139-bf24-6b3f605b9511/volumes" Mar 08 19:56:38 crc kubenswrapper[4885]: I0308 19:56:38.006068 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: i/o timeout" Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.418765 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.419597 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.419901 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.420033 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.420217 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.421260 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.424175 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:38 crc kubenswrapper[4885]: E0308 19:56:38.424221 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.021621 4885 generic.go:334] "Generic (PLEG): container finished" podID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerID="17e37a1234fac68b042cb982b6be421ba7a3bd54c84d93b8bbb1842a9f1fa332" exitCode=0 Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.021734 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb5b9c587-nd8hp" event={"ID":"d1b91750-253e-46eb-9a1c-f7208dab2496","Type":"ContainerDied","Data":"17e37a1234fac68b042cb982b6be421ba7a3bd54c84d93b8bbb1842a9f1fa332"} Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.022416 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5bb5b9c587-nd8hp" event={"ID":"d1b91750-253e-46eb-9a1c-f7208dab2496","Type":"ContainerDied","Data":"021bd0d886601b9f55240e0e88eca80cd21300113e390352e26c76ca5a8592dc"} Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.022437 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="021bd0d886601b9f55240e0e88eca80cd21300113e390352e26c76ca5a8592dc" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.109790 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.165550 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-ovndb-tls-certs\") pod \"d1b91750-253e-46eb-9a1c-f7208dab2496\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.165617 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-combined-ca-bundle\") pod \"d1b91750-253e-46eb-9a1c-f7208dab2496\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.165647 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdbrd\" (UniqueName: \"kubernetes.io/projected/d1b91750-253e-46eb-9a1c-f7208dab2496-kube-api-access-sdbrd\") pod \"d1b91750-253e-46eb-9a1c-f7208dab2496\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.165728 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-config\") pod \"d1b91750-253e-46eb-9a1c-f7208dab2496\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.165801 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-public-tls-certs\") pod \"d1b91750-253e-46eb-9a1c-f7208dab2496\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.165831 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-httpd-config\") pod \"d1b91750-253e-46eb-9a1c-f7208dab2496\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.165893 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-internal-tls-certs\") pod \"d1b91750-253e-46eb-9a1c-f7208dab2496\" (UID: \"d1b91750-253e-46eb-9a1c-f7208dab2496\") " Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.191792 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1b91750-253e-46eb-9a1c-f7208dab2496-kube-api-access-sdbrd" (OuterVolumeSpecName: "kube-api-access-sdbrd") pod "d1b91750-253e-46eb-9a1c-f7208dab2496" (UID: "d1b91750-253e-46eb-9a1c-f7208dab2496"). InnerVolumeSpecName "kube-api-access-sdbrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.193298 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "d1b91750-253e-46eb-9a1c-f7208dab2496" (UID: "d1b91750-253e-46eb-9a1c-f7208dab2496"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.220796 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1b91750-253e-46eb-9a1c-f7208dab2496" (UID: "d1b91750-253e-46eb-9a1c-f7208dab2496"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.230848 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d1b91750-253e-46eb-9a1c-f7208dab2496" (UID: "d1b91750-253e-46eb-9a1c-f7208dab2496"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.240056 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-config" (OuterVolumeSpecName: "config") pod "d1b91750-253e-46eb-9a1c-f7208dab2496" (UID: "d1b91750-253e-46eb-9a1c-f7208dab2496"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.240091 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d1b91750-253e-46eb-9a1c-f7208dab2496" (UID: "d1b91750-253e-46eb-9a1c-f7208dab2496"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.240347 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "d1b91750-253e-46eb-9a1c-f7208dab2496" (UID: "d1b91750-253e-46eb-9a1c-f7208dab2496"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.267713 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.267771 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdbrd\" (UniqueName: \"kubernetes.io/projected/d1b91750-253e-46eb-9a1c-f7208dab2496-kube-api-access-sdbrd\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.267794 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.267815 4885 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.267836 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.267854 4885 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:42 crc kubenswrapper[4885]: I0308 19:56:42.267872 4885 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1b91750-253e-46eb-9a1c-f7208dab2496-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:43 crc kubenswrapper[4885]: I0308 19:56:43.035968 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5bb5b9c587-nd8hp" Mar 08 19:56:43 crc kubenswrapper[4885]: I0308 19:56:43.095002 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5bb5b9c587-nd8hp"] Mar 08 19:56:43 crc kubenswrapper[4885]: I0308 19:56:43.101100 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5bb5b9c587-nd8hp"] Mar 08 19:56:43 crc kubenswrapper[4885]: I0308 19:56:43.393739 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" path="/var/lib/kubelet/pods/d1b91750-253e-46eb-9a1c-f7208dab2496/volumes" Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.419443 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.420265 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.420799 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.420852 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.423707 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.427368 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.429988 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , 
stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:43 crc kubenswrapper[4885]: E0308 19:56:43.430101 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.419190 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.420511 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.421289 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.421396 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.424652 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.427437 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.429741 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:48 crc kubenswrapper[4885]: E0308 19:56:48.429913 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.419846 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.422142 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.422173 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.422977 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.423074 4885 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.423622 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.425411 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Mar 08 19:56:53 crc kubenswrapper[4885]: E0308 19:56:53.425470 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-pp4rs" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.189691 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pp4rs_88c2918a-548b-4b78-a34c-2aa2969ee2cd/ovs-vswitchd/0.log" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.191757 4885 generic.go:334] "Generic (PLEG): container finished" podID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" exitCode=137 Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.191826 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pp4rs" event={"ID":"88c2918a-548b-4b78-a34c-2aa2969ee2cd","Type":"ContainerDied","Data":"e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1"} Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.545486 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pp4rs_88c2918a-548b-4b78-a34c-2aa2969ee2cd/ovs-vswitchd/0.log" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.546123 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622517 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-lib\") pod \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622558 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-log\") pod \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622589 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-run\") pod \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622621 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66rvx\" (UniqueName: \"kubernetes.io/projected/88c2918a-548b-4b78-a34c-2aa2969ee2cd-kube-api-access-66rvx\") pod \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622643 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-etc-ovs\") pod \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622662 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-lib" (OuterVolumeSpecName: "var-lib") pod "88c2918a-548b-4b78-a34c-2aa2969ee2cd" (UID: "88c2918a-548b-4b78-a34c-2aa2969ee2cd"). InnerVolumeSpecName "var-lib". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622673 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-log" (OuterVolumeSpecName: "var-log") pod "88c2918a-548b-4b78-a34c-2aa2969ee2cd" (UID: "88c2918a-548b-4b78-a34c-2aa2969ee2cd"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622693 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-run" (OuterVolumeSpecName: "var-run") pod "88c2918a-548b-4b78-a34c-2aa2969ee2cd" (UID: "88c2918a-548b-4b78-a34c-2aa2969ee2cd"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622736 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88c2918a-548b-4b78-a34c-2aa2969ee2cd-scripts\") pod \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\" (UID: \"88c2918a-548b-4b78-a34c-2aa2969ee2cd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.622758 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "88c2918a-548b-4b78-a34c-2aa2969ee2cd" (UID: "88c2918a-548b-4b78-a34c-2aa2969ee2cd"). InnerVolumeSpecName "etc-ovs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.623046 4885 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-log\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.623065 4885 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-run\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.623075 4885 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-etc-ovs\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.623085 4885 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/88c2918a-548b-4b78-a34c-2aa2969ee2cd-var-lib\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.623893 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88c2918a-548b-4b78-a34c-2aa2969ee2cd-scripts" (OuterVolumeSpecName: "scripts") pod "88c2918a-548b-4b78-a34c-2aa2969ee2cd" (UID: "88c2918a-548b-4b78-a34c-2aa2969ee2cd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.640205 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88c2918a-548b-4b78-a34c-2aa2969ee2cd-kube-api-access-66rvx" (OuterVolumeSpecName: "kube-api-access-66rvx") pod "88c2918a-548b-4b78-a34c-2aa2969ee2cd" (UID: "88c2918a-548b-4b78-a34c-2aa2969ee2cd"). InnerVolumeSpecName "kube-api-access-66rvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.708283 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.723909 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66rvx\" (UniqueName: \"kubernetes.io/projected/88c2918a-548b-4b78-a34c-2aa2969ee2cd-kube-api-access-66rvx\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.723952 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88c2918a-548b-4b78-a34c-2aa2969ee2cd-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.825499 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-lock\") pod \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.825549 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-combined-ca-bundle\") pod \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.825602 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-cache\") pod \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.825658 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.825717 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22mr6\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-kube-api-access-22mr6\") pod \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.825764 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") pod \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\" (UID: \"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd\") " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.826384 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-lock" (OuterVolumeSpecName: "lock") pod 
"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.826759 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-cache" (OuterVolumeSpecName: "cache") pod "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.829699 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.829956 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-kube-api-access-22mr6" (OuterVolumeSpecName: "kube-api-access-22mr6") pod "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd"). InnerVolumeSpecName "kube-api-access-22mr6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.831603 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "swift") pod "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.927996 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.928051 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22mr6\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-kube-api-access-22mr6\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.928075 4885 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.928099 4885 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-lock\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.928120 4885 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-cache\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:55 crc kubenswrapper[4885]: I0308 19:56:55.956343 4885 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.030254 4885 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.182557 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" (UID: "aa276a05-ab6a-4aa1-9a9f-a990dc1513bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.204852 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-pp4rs_88c2918a-548b-4b78-a34c-2aa2969ee2cd/ovs-vswitchd/0.log" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.206407 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-pp4rs" event={"ID":"88c2918a-548b-4b78-a34c-2aa2969ee2cd","Type":"ContainerDied","Data":"e8b9e6003711ba0073f7cced036d1550a5aac01aa3276b7f4a1f8ca2c14ba942"} Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.206464 4885 scope.go:117] "RemoveContainer" containerID="e486a750827eb2eb342332d908a97bc380c80e617e5589f7fec3f1c93c87afd1" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.206486 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-pp4rs" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.219983 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerID="9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2" exitCode=137 Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.220024 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.220050 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2"} Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.220093 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"aa276a05-ab6a-4aa1-9a9f-a990dc1513bd","Type":"ContainerDied","Data":"b1e0ba5e2e1e5dbd0a9f1e5b591c441cf6aacd9c13c59e7ce49d3304923b9158"} Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.239561 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.241569 4885 scope.go:117] "RemoveContainer" containerID="a49af92979d07d1a584d16332ff6123ce05b561e739dd973b533f7cfe87fb03f" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.284995 4885 scope.go:117] "RemoveContainer" containerID="4d70b99d630277ded10493eacfddd38fddedced2d880750db49b6b3f39017dba" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.291768 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.298911 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.306224 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-pp4rs"] Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.310411 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-pp4rs"] Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.322815 4885 scope.go:117] "RemoveContainer" containerID="9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.351509 4885 scope.go:117] "RemoveContainer" containerID="64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.374899 4885 scope.go:117] "RemoveContainer" containerID="7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.397591 4885 scope.go:117] "RemoveContainer" containerID="6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.422245 4885 scope.go:117] "RemoveContainer" containerID="37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.453822 4885 scope.go:117] "RemoveContainer" containerID="d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.481710 4885 scope.go:117] "RemoveContainer" containerID="2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.512579 4885 scope.go:117] "RemoveContainer" containerID="e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.534403 4885 scope.go:117] "RemoveContainer" containerID="626657923ce6ed6491828eb9e3d29e03cb9ceee45223fbdf56fc2006030e8b1d" Mar 08 19:56:56 crc 
kubenswrapper[4885]: I0308 19:56:56.546491 4885 scope.go:117] "RemoveContainer" containerID="6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.627911 4885 scope.go:117] "RemoveContainer" containerID="25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.672145 4885 scope.go:117] "RemoveContainer" containerID="a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.706092 4885 scope.go:117] "RemoveContainer" containerID="3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.749975 4885 scope.go:117] "RemoveContainer" containerID="904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.781385 4885 scope.go:117] "RemoveContainer" containerID="5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.820284 4885 scope.go:117] "RemoveContainer" containerID="803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.853448 4885 scope.go:117] "RemoveContainer" containerID="9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.854108 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2\": container with ID starting with 9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2 not found: ID does not exist" containerID="9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.854177 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2"} err="failed to get container status \"9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2\": rpc error: code = NotFound desc = could not find container \"9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2\": container with ID starting with 9752f7be4d2bfd1b2ecfc7f4e7e03144dded626a8cff61a8989abf85a00541b2 not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.854223 4885 scope.go:117] "RemoveContainer" containerID="64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.854617 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb\": container with ID starting with 64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb not found: ID does not exist" containerID="64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.854672 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb"} err="failed to get container status \"64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb\": rpc error: code = NotFound desc = could not find container 
\"64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb\": container with ID starting with 64b7f4f8e5649581d0a2ecc7dc17957eeec1a484ead87c9f815d9afdd56721fb not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.854713 4885 scope.go:117] "RemoveContainer" containerID="7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.855144 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca\": container with ID starting with 7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca not found: ID does not exist" containerID="7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.855182 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca"} err="failed to get container status \"7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca\": rpc error: code = NotFound desc = could not find container \"7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca\": container with ID starting with 7c4b1185c59127aad825c13beb7a2b0eead283cff9302affd35119ae418f8dca not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.855210 4885 scope.go:117] "RemoveContainer" containerID="6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.855591 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474\": container with ID starting with 6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474 not found: ID does not exist" containerID="6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.855639 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474"} err="failed to get container status \"6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474\": rpc error: code = NotFound desc = could not find container \"6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474\": container with ID starting with 6247d03d3b0f1e5c32c3c8f55a86bda14c0039e692178790428900accd2ea474 not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.855668 4885 scope.go:117] "RemoveContainer" containerID="37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.856064 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a\": container with ID starting with 37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a not found: ID does not exist" containerID="37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.856098 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a"} 
err="failed to get container status \"37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a\": rpc error: code = NotFound desc = could not find container \"37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a\": container with ID starting with 37124eefab54ddf06207d2b2219d047836b26db38a09c3cac8f17b96b8bd128a not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.856121 4885 scope.go:117] "RemoveContainer" containerID="d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.856503 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3\": container with ID starting with d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3 not found: ID does not exist" containerID="d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.856553 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3"} err="failed to get container status \"d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3\": rpc error: code = NotFound desc = could not find container \"d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3\": container with ID starting with d29c75eaa217cf7c8cd5bc3b019dd76a7e29b74f767f234a3acfdfdbe4e4a0c3 not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.856583 4885 scope.go:117] "RemoveContainer" containerID="2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.857085 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd\": container with ID starting with 2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd not found: ID does not exist" containerID="2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.857144 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd"} err="failed to get container status \"2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd\": rpc error: code = NotFound desc = could not find container \"2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd\": container with ID starting with 2b90f52de1885075a1ff8ad855fff3ebafbcb3d5484d88dd7e37c566f7c6aadd not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.857183 4885 scope.go:117] "RemoveContainer" containerID="e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.857716 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f\": container with ID starting with e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f not found: ID does not exist" containerID="e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.857763 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f"} err="failed to get container status \"e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f\": rpc error: code = NotFound desc = could not find container \"e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f\": container with ID starting with e2e295a748812a625d4c6d82b1917ed29370b69a19d0e7aa07c3e11b48bc119f not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.857789 4885 scope.go:117] "RemoveContainer" containerID="6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.858332 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81\": container with ID starting with 6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81 not found: ID does not exist" containerID="6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.858387 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81"} err="failed to get container status \"6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81\": rpc error: code = NotFound desc = could not find container \"6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81\": container with ID starting with 6e3bbf4f9454c356bdec473d96a5a2048f7f5e08653256227621c9e22615bb81 not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.858430 4885 scope.go:117] "RemoveContainer" containerID="25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.859273 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd\": container with ID starting with 25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd not found: ID does not exist" containerID="25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.859337 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd"} err="failed to get container status \"25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd\": rpc error: code = NotFound desc = could not find container \"25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd\": container with ID starting with 25adc369a110c8e25f4c3f2b767d0c5c39fbd0f62b29555e1cc17138cf1eaecd not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.859370 4885 scope.go:117] "RemoveContainer" containerID="a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.859775 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc\": container with ID starting with a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc not found: ID does 
not exist" containerID="a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.859966 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc"} err="failed to get container status \"a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc\": rpc error: code = NotFound desc = could not find container \"a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc\": container with ID starting with a49cb08d9d728dea53e8fae54ec355f9268840701ffcc76f7d04cb686df01fdc not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.860122 4885 scope.go:117] "RemoveContainer" containerID="3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.861510 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653\": container with ID starting with 3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653 not found: ID does not exist" containerID="3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.861565 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653"} err="failed to get container status \"3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653\": rpc error: code = NotFound desc = could not find container \"3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653\": container with ID starting with 3df10d746942481531f713ad362c5525509b2c51c9fa048e6c23c6ac96792653 not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.861604 4885 scope.go:117] "RemoveContainer" containerID="904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.862140 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6\": container with ID starting with 904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6 not found: ID does not exist" containerID="904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.862178 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6"} err="failed to get container status \"904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6\": rpc error: code = NotFound desc = could not find container \"904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6\": container with ID starting with 904ec9e3624448bc12e1d948ed9654775188d9eaf6914090515ac3b861d7f9e6 not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.862203 4885 scope.go:117] "RemoveContainer" containerID="5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.862569 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82\": container with ID starting with 5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82 not found: ID does not exist" containerID="5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.862617 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82"} err="failed to get container status \"5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82\": rpc error: code = NotFound desc = could not find container \"5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82\": container with ID starting with 5094af81d927c4f270c5219115108c88f00eb7c557dfe1cbd2c5e14784b24e82 not found: ID does not exist" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.862647 4885 scope.go:117] "RemoveContainer" containerID="803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542" Mar 08 19:56:56 crc kubenswrapper[4885]: E0308 19:56:56.863027 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542\": container with ID starting with 803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542 not found: ID does not exist" containerID="803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542" Mar 08 19:56:56 crc kubenswrapper[4885]: I0308 19:56:56.863070 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542"} err="failed to get container status \"803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542\": rpc error: code = NotFound desc = could not find container \"803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542\": container with ID starting with 803c56b6db06b51fc9ed5670b0a451f66696c5be08bb30cbd75981e4f204c542 not found: ID does not exist" Mar 08 19:56:57 crc kubenswrapper[4885]: I0308 19:56:57.390480 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" path="/var/lib/kubelet/pods/88c2918a-548b-4b78-a34c-2aa2969ee2cd/volumes" Mar 08 19:56:57 crc kubenswrapper[4885]: I0308 19:56:57.392451 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" path="/var/lib/kubelet/pods/aa276a05-ab6a-4aa1-9a9f-a990dc1513bd/volumes" Mar 08 19:56:59 crc kubenswrapper[4885]: I0308 19:56:59.568324 4885 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podda1d62ba-4033-4906-87c1-d673c1ab8637"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podda1d62ba-4033-4906-87c1-d673c1ab8637] : Timed out while waiting for systemd to remove kubepods-besteffort-podda1d62ba_4033_4906_87c1_d673c1ab8637.slice" Mar 08 19:56:59 crc kubenswrapper[4885]: I0308 19:56:59.687827 4885 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod09db13b9-d564-49c9-b383-5fbfe0e43c9b"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod09db13b9-d564-49c9-b383-5fbfe0e43c9b] : Timed out while waiting for systemd to remove kubepods-besteffort-pod09db13b9_d564_49c9_b383_5fbfe0e43c9b.slice" Mar 08 19:56:59 crc kubenswrapper[4885]: I0308 19:56:59.693053 4885 
pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod619a568c-d0c3-408b-96c1-39a3a769d1ad"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod619a568c-d0c3-408b-96c1-39a3a769d1ad] : Timed out while waiting for systemd to remove kubepods-besteffort-pod619a568c_d0c3_408b_96c1_39a3a769d1ad.slice" Mar 08 19:57:02 crc kubenswrapper[4885]: I0308 19:57:02.818162 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:57:02 crc kubenswrapper[4885]: I0308 19:57:02.818571 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:57:32 crc kubenswrapper[4885]: I0308 19:57:32.818634 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 19:57:32 crc kubenswrapper[4885]: I0308 19:57:32.819387 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 19:57:32 crc kubenswrapper[4885]: I0308 19:57:32.819463 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 19:57:32 crc kubenswrapper[4885]: I0308 19:57:32.820362 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b0670bdd5ae4a6193bbd77e528520a487b041af774d5305f09762277548bcda8"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 19:57:32 crc kubenswrapper[4885]: I0308 19:57:32.820477 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://b0670bdd5ae4a6193bbd77e528520a487b041af774d5305f09762277548bcda8" gracePeriod=600 Mar 08 19:57:33 crc kubenswrapper[4885]: I0308 19:57:33.680597 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="b0670bdd5ae4a6193bbd77e528520a487b041af774d5305f09762277548bcda8" exitCode=0 Mar 08 19:57:33 crc kubenswrapper[4885]: I0308 19:57:33.680720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"b0670bdd5ae4a6193bbd77e528520a487b041af774d5305f09762277548bcda8"} Mar 08 19:57:33 crc kubenswrapper[4885]: I0308 19:57:33.681367 
4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400"} Mar 08 19:57:33 crc kubenswrapper[4885]: I0308 19:57:33.681401 4885 scope.go:117] "RemoveContainer" containerID="c24a30299a18630f198121b61248ad8d1e3d9e8acd806e23d5c1d953fe5cfa83" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.504823 4885 scope.go:117] "RemoveContainer" containerID="3cb04d8216824e70d6b5ea33718713bb6914ece1b0e3362b1186f648f1502b81" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.549015 4885 scope.go:117] "RemoveContainer" containerID="58b318e6af3a5db8b09b96a9de226a379d7375fee61bd37b949548ceef13806c" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.589641 4885 scope.go:117] "RemoveContainer" containerID="d990977988383de183ee74b10460a2aef417ed74ff41f049c648f4b0922ddb17" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.624375 4885 scope.go:117] "RemoveContainer" containerID="57d8097d34b17ff81e694e75a211c6042455808aeca7d092f8501d703a78d088" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.649874 4885 scope.go:117] "RemoveContainer" containerID="2f31946378ed0ae4efcfd55a18f638cc84b0a18f97193739711ef28dac2174f9" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.675636 4885 scope.go:117] "RemoveContainer" containerID="2ff4df6777cb04e247eca00bf1613dce65653cf286ef17867253f3e89e727d13" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.702689 4885 scope.go:117] "RemoveContainer" containerID="e5feabe92d49eb8fd4bb48801094df276f9bf1fc07181b4b0ee0908d604394fb" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.735878 4885 scope.go:117] "RemoveContainer" containerID="b4398eab96435c81b8a2366ba9291b7b0c13edf908fc801823865f8458709b7a" Mar 08 19:57:57 crc kubenswrapper[4885]: I0308 19:57:57.760214 4885 scope.go:117] "RemoveContainer" containerID="ca2add6996115e29bd86a097fbce1cceadad7160db189d6c7e405a523a1ccb6e" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158113 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29549998-97pgs"] Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158532 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63c3ea8e-9683-45b9-805b-d1049840b0da" containerName="kube-state-metrics" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158553 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="63c3ea8e-9683-45b9-805b-d1049840b0da" containerName="kube-state-metrics" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158578 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158592 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158609 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158643 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158667 4885 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="proxy-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158680 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="proxy-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158725 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158738 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158758 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-metadata" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158769 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-metadata" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158784 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158796 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-api" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158815 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="probe" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158827 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="probe" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158841 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158853 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158875 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158887 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158903 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158915 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158957 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-reaper" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158969 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-reaper" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.158988 4885 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.158999 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-server" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159015 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-updater" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159027 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-updater" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159042 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159054 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159067 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159078 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159104 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159116 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-server" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159136 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159147 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159162 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="rsync" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159173 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="rsync" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159193 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159205 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159224 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-expirer" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159238 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-expirer" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159252 4885 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="swift-recon-cron" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159264 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="swift-recon-cron" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159283 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerName="rabbitmq" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159294 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerName="rabbitmq" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159310 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159322 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159342 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="ovn-northd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159353 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="ovn-northd" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159373 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server-init" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159385 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server-init" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159406 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159418 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-api" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159436 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159448 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-server" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159468 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerName="setup-container" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159480 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerName="setup-container" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159498 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159509 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159532 4885 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="openstack-network-exporter" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159543 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="openstack-network-exporter" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159557 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159568 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-api" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159592 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-notification-agent" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159603 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-notification-agent" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159621 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="cinder-scheduler" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159633 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="cinder-scheduler" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159651 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159663 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159684 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" containerName="nova-cell0-conductor-conductor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159696 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" containerName="nova-cell0-conductor-conductor" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159713 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159725 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159741 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159752 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159767 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerName="rabbitmq" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159779 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerName="rabbitmq" Mar 
08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159793 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159804 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159825 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerName="galera" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159836 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerName="galera" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159851 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159863 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159878 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" containerName="keystone-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159890 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" containerName="keystone-api" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.159907 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="sg-core" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.159991 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="sg-core" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160012 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160027 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160042 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160054 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160070 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160160 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160220 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerName="setup-container" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160235 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerName="setup-container" Mar 08 
19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160259 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1d62ba-4033-4906-87c1-d673c1ab8637" containerName="memcached" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160272 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1d62ba-4033-4906-87c1-d673c1ab8637" containerName="memcached" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160292 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-updater" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160307 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-updater" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160325 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160338 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160365 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerName="mysql-bootstrap" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160377 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerName="mysql-bootstrap" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160398 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160410 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160433 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160447 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160470 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bbdf164-51e7-4faf-986b-fba5044fad2b" containerName="nova-cell1-conductor-conductor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160483 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bbdf164-51e7-4faf-986b-fba5044fad2b" containerName="nova-cell1-conductor-conductor" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160505 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="945717bc-405f-4628-934c-66e4500f56f0" containerName="nova-scheduler-scheduler" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160522 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="945717bc-405f-4628-934c-66e4500f56f0" containerName="nova-scheduler-scheduler" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160545 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-central-agent" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160558 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-central-agent" Mar 08 19:58:00 crc kubenswrapper[4885]: E0308 19:58:00.160576 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.160590 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161032 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-metadata" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161058 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161073 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-updater" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161090 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-updater" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161106 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="proxy-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161120 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bbdf164-51e7-4faf-986b-fba5044fad2b" containerName="nova-cell1-conductor-conductor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161144 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161158 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-notification-agent" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161176 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovs-vswitchd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161199 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1d62ba-4033-4906-87c1-d673c1ab8637" containerName="memcached" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161221 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161242 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161262 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="openstack-network-exporter" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161275 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="35e55887-f8af-4c57-820d-c46d0ee9cd9f" containerName="barbican-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161298 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="rsync" Mar 08 19:58:00 
crc kubenswrapper[4885]: I0308 19:58:00.161321 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161339 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="96257eac-42ec-44cf-80be-9be68c0ebb1b" containerName="rabbitmq" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161357 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161375 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="63c3ea8e-9683-45b9-805b-d1049840b0da" containerName="kube-state-metrics" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161390 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="swift-recon-cron" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161413 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="64baa35e-d1c2-48fe-a7a1-d0a4d1485908" containerName="cinder-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161435 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-reaper" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161452 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="edd9ad85-0e13-4d1f-ab0e-ffd5630c6197" containerName="nova-cell0-conductor-conductor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161471 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161498 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="88c2918a-548b-4b78-a34c-2aa2969ee2cd" containerName="ovsdb-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161518 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161541 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7268474-e124-4139-bf24-6b3f605b9511" containerName="barbican-keystone-listener-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161560 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdd926a8-442c-4f63-bb36-3e6a425436c2" containerName="nova-metadata-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161577 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161593 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="945717bc-405f-4628-934c-66e4500f56f0" containerName="nova-scheduler-scheduler" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161616 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="sg-core" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161640 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-expirer" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161662 4885 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1a28c270-c9ef-4b8c-a8e7-bcc69a1419cc" containerName="keystone-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161682 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161705 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f46cb2-c95d-40f5-9acc-720e094b91bc" containerName="ovn-northd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161721 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="93f52f98-0e26-4fc1-a9af-f580531f8550" containerName="galera" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161743 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161765 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="container-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161784 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="object-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161806 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a083a431-5afc-4289-a5cf-625bc619465e" containerName="nova-api-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161827 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a083cf5-4ca2-440c-840a-6b159151609f" containerName="barbican-worker" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161847 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1b91750-253e-46eb-9a1c-f7208dab2496" containerName="neutron-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161864 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-auditor" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161886 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" containerName="account-replicator" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161903 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="01dc1fd5-4e2f-4129-9452-ed50fa1d182b" containerName="rabbitmq" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161944 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-httpd" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161963 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a1f465c-123b-455f-8bd8-720d3f8a4bef" containerName="ceilometer-central-agent" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.161981 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="cinder-scheduler" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.162001 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b429e9-fb10-48ba-b15c-ec25d57e707a" containerName="glance-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.162021 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa276a05-ab6a-4aa1-9a9f-a990dc1513bd" 
containerName="object-server" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.162038 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.162057 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="719b68df-d1ac-49e5-ac34-dfa3ba33c97f" containerName="placement-api" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.162074 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="13df70e2-1a9e-4d81-b23b-c461291bce93" containerName="probe" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.162095 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4ca493a-f707-45c3-b457-1a1053c3dfe5" containerName="glance-log" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.162976 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549998-97pgs" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.168171 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.168452 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.168572 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.175477 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549998-97pgs"] Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.287185 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lggxf\" (UniqueName: \"kubernetes.io/projected/0f78a7ad-7933-489d-8395-4bb334007a30-kube-api-access-lggxf\") pod \"auto-csr-approver-29549998-97pgs\" (UID: \"0f78a7ad-7933-489d-8395-4bb334007a30\") " pod="openshift-infra/auto-csr-approver-29549998-97pgs" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.388578 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lggxf\" (UniqueName: \"kubernetes.io/projected/0f78a7ad-7933-489d-8395-4bb334007a30-kube-api-access-lggxf\") pod \"auto-csr-approver-29549998-97pgs\" (UID: \"0f78a7ad-7933-489d-8395-4bb334007a30\") " pod="openshift-infra/auto-csr-approver-29549998-97pgs" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.420235 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lggxf\" (UniqueName: \"kubernetes.io/projected/0f78a7ad-7933-489d-8395-4bb334007a30-kube-api-access-lggxf\") pod \"auto-csr-approver-29549998-97pgs\" (UID: \"0f78a7ad-7933-489d-8395-4bb334007a30\") " pod="openshift-infra/auto-csr-approver-29549998-97pgs" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.485453 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549998-97pgs" Mar 08 19:58:00 crc kubenswrapper[4885]: I0308 19:58:00.997517 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29549998-97pgs"] Mar 08 19:58:01 crc kubenswrapper[4885]: I0308 19:58:01.006370 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 19:58:01 crc kubenswrapper[4885]: I0308 19:58:01.983315 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549998-97pgs" event={"ID":"0f78a7ad-7933-489d-8395-4bb334007a30","Type":"ContainerStarted","Data":"5e381f1cfe000adb0e79f7dc8276821a3fe78fb5f86c6c5a0ea8fd21de6522c2"} Mar 08 19:58:02 crc kubenswrapper[4885]: I0308 19:58:02.998158 4885 generic.go:334] "Generic (PLEG): container finished" podID="0f78a7ad-7933-489d-8395-4bb334007a30" containerID="f1ee7ab75e6cdb54c44da03961cfd9f0079aa1cd90d1e18350bad8572cfd08fa" exitCode=0 Mar 08 19:58:02 crc kubenswrapper[4885]: I0308 19:58:02.998400 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549998-97pgs" event={"ID":"0f78a7ad-7933-489d-8395-4bb334007a30","Type":"ContainerDied","Data":"f1ee7ab75e6cdb54c44da03961cfd9f0079aa1cd90d1e18350bad8572cfd08fa"} Mar 08 19:58:04 crc kubenswrapper[4885]: I0308 19:58:04.362438 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29549998-97pgs" Mar 08 19:58:04 crc kubenswrapper[4885]: I0308 19:58:04.451795 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lggxf\" (UniqueName: \"kubernetes.io/projected/0f78a7ad-7933-489d-8395-4bb334007a30-kube-api-access-lggxf\") pod \"0f78a7ad-7933-489d-8395-4bb334007a30\" (UID: \"0f78a7ad-7933-489d-8395-4bb334007a30\") " Mar 08 19:58:04 crc kubenswrapper[4885]: I0308 19:58:04.460972 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f78a7ad-7933-489d-8395-4bb334007a30-kube-api-access-lggxf" (OuterVolumeSpecName: "kube-api-access-lggxf") pod "0f78a7ad-7933-489d-8395-4bb334007a30" (UID: "0f78a7ad-7933-489d-8395-4bb334007a30"). InnerVolumeSpecName "kube-api-access-lggxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 19:58:04 crc kubenswrapper[4885]: I0308 19:58:04.553446 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lggxf\" (UniqueName: \"kubernetes.io/projected/0f78a7ad-7933-489d-8395-4bb334007a30-kube-api-access-lggxf\") on node \"crc\" DevicePath \"\"" Mar 08 19:58:05 crc kubenswrapper[4885]: I0308 19:58:05.018394 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29549998-97pgs" event={"ID":"0f78a7ad-7933-489d-8395-4bb334007a30","Type":"ContainerDied","Data":"5e381f1cfe000adb0e79f7dc8276821a3fe78fb5f86c6c5a0ea8fd21de6522c2"} Mar 08 19:58:05 crc kubenswrapper[4885]: I0308 19:58:05.018789 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e381f1cfe000adb0e79f7dc8276821a3fe78fb5f86c6c5a0ea8fd21de6522c2" Mar 08 19:58:05 crc kubenswrapper[4885]: I0308 19:58:05.018513 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29549998-97pgs" Mar 08 19:58:05 crc kubenswrapper[4885]: I0308 19:58:05.505494 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549992-pzbpd"] Mar 08 19:58:05 crc kubenswrapper[4885]: I0308 19:58:05.528036 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549992-pzbpd"] Mar 08 19:58:07 crc kubenswrapper[4885]: I0308 19:58:07.383422 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec" path="/var/lib/kubelet/pods/deb9e9e0-35d3-4473-ad7b-7b44fd44e8ec/volumes" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.019552 4885 scope.go:117] "RemoveContainer" containerID="fd8d322616b5f6a1a40bb29dccbdca346cc9726c81d97364f599af639e9e8eb7" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.058466 4885 scope.go:117] "RemoveContainer" containerID="17d561daa3a3a15f18cf22c1e06443b53b3323129a45f06fd40855d4bb9fbf6a" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.088729 4885 scope.go:117] "RemoveContainer" containerID="f13c64b3a8cac3c8bcb02e4b62a77799ac33e44598003cd3a842dd2a34fd0963" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.142033 4885 scope.go:117] "RemoveContainer" containerID="41a09112d08bd7901521db3ad7a70721bb9ad48344056086b5a05f6b55d65d91" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.176712 4885 scope.go:117] "RemoveContainer" containerID="74b67198f27ef75d7d190ca202d1aaac73c4511ab6579ab6cc6cd813c7ff04f7" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.229406 4885 scope.go:117] "RemoveContainer" containerID="d9a0dae6743044b0ee2ed3030e29d6fe34bb42caf427155033310333a42d0a5a" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.259282 4885 scope.go:117] "RemoveContainer" containerID="e06b6952cf7f49bba090c40e1251201f80874fce311561d18f2cd3c7169feb77" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.314141 4885 scope.go:117] "RemoveContainer" containerID="a1a66f9e3c39e6448e08179a06c354d7f53b5cb971ddf727953fa9e3689c988d" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.344081 4885 scope.go:117] "RemoveContainer" containerID="1f2b4371c693a384eeb8722a9c031b14ca5b16214abf49aab649bdf051aaa6a9" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.389875 4885 scope.go:117] "RemoveContainer" containerID="88f0fd52df3aa60bc754c49bef747bbf48ae9a2eeb839f1af08e43921bc83090" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.442278 4885 scope.go:117] "RemoveContainer" containerID="6c5fd0c87fdc37dc689d9957740eb80226bab6f4a5010aca5f3d0a66ab0c82c3" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.470595 4885 scope.go:117] "RemoveContainer" containerID="6bdf4492dc9ff59a23eeb3289e91f60d8a1697795948d983180a4ac75c5e122e" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.510212 4885 scope.go:117] "RemoveContainer" containerID="302d122e7028362942b84bde8688589c00dd224f41e987890dd32bc866af958e" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.556063 4885 scope.go:117] "RemoveContainer" containerID="d9ea1c70756e397df6785ca6ac5c032d1dcba35d8ce3a74fd9e9a044ec85b1ad" Mar 08 19:58:58 crc kubenswrapper[4885]: I0308 19:58:58.584763 4885 scope.go:117] "RemoveContainer" containerID="33bf72d09b758f81e7db370aead1484095f9a13481bdaed0653c1631df7c254b" Mar 08 19:59:58 crc kubenswrapper[4885]: I0308 19:59:58.867469 4885 scope.go:117] "RemoveContainer" containerID="0fd5040bc376c8f684c8ba84911a21e03723dd7d09ccc7b3d5b40d2f11712a3d" Mar 08 
19:59:58 crc kubenswrapper[4885]: I0308 19:59:58.903561 4885 scope.go:117] "RemoveContainer" containerID="46919954f7a8695f89f60ffd2c95fd19f9f50cf97e2bbb06931bbceff7c47a47" Mar 08 19:59:58 crc kubenswrapper[4885]: I0308 19:59:58.952113 4885 scope.go:117] "RemoveContainer" containerID="ce47c98e58f66c2a55840d70bb55bfd25b6d54a9fa04407857b7919987c1acd6" Mar 08 19:59:58 crc kubenswrapper[4885]: I0308 19:59:58.981732 4885 scope.go:117] "RemoveContainer" containerID="30e722bde831d03eed4bc7a2ac2c7c561a897a0bc5aee76137806f8c867c31e9" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.009532 4885 scope.go:117] "RemoveContainer" containerID="17e37a1234fac68b042cb982b6be421ba7a3bd54c84d93b8bbb1842a9f1fa332" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.026737 4885 scope.go:117] "RemoveContainer" containerID="83c825c6a12d2141eb0dfe1368babc2f8bfb90700bef146c412cb41b76f028b3" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.044286 4885 scope.go:117] "RemoveContainer" containerID="db7de2bfb2402bc7c35eeb0e3a0a80a212c00dd48e7b95c320538d854040bceb" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.065101 4885 scope.go:117] "RemoveContainer" containerID="1b8b8e4856a24e16b23d4c15ef261857dfcc94531017a1b02728028102e1d5ce" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.095899 4885 scope.go:117] "RemoveContainer" containerID="eb80fb2a1922a32d725b4ee5e3cc391924d843e1dfc770a23f4293be00620e5f" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.116185 4885 scope.go:117] "RemoveContainer" containerID="7cf70af4753fbcc177c169967bfd0633e149f5d98df36cb2d6ff676d0a215e21" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.140255 4885 scope.go:117] "RemoveContainer" containerID="9585f2e0b3d9045954e289a5ad0191eb4ab1e2632be8da00e467a511a692dd4f" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.163609 4885 scope.go:117] "RemoveContainer" containerID="4cffd27a9b7724e448f78dd5d8fc02f0f0058f5575262e988c93602105c6d597" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.180144 4885 scope.go:117] "RemoveContainer" containerID="155731b1565c2836cebbf6fadafab50001c261430bf9d84221bbe681fb56634d" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.206307 4885 scope.go:117] "RemoveContainer" containerID="43662ed70d9fce30619b2928a293996c741d8618375e00a25c69cc3ec2f8804c" Mar 08 19:59:59 crc kubenswrapper[4885]: I0308 19:59:59.238831 4885 scope.go:117] "RemoveContainer" containerID="2ea3e6e51d477fb7795967def43fe0be522063fbf11d9053ce41fa22a8bf42b3" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.173802 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550000-bp85d"] Mar 08 20:00:00 crc kubenswrapper[4885]: E0308 20:00:00.174321 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f78a7ad-7933-489d-8395-4bb334007a30" containerName="oc" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.174342 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f78a7ad-7933-489d-8395-4bb334007a30" containerName="oc" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.174593 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f78a7ad-7933-489d-8395-4bb334007a30" containerName="oc" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.175349 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550000-bp85d" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.179504 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.180370 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.181741 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.182112 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll"] Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.183066 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.187074 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.190307 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.191913 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550000-bp85d"] Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.200162 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll"] Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.315369 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0336d864-07a3-41ed-9327-8a39d16d667f-secret-volume\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.315442 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0336d864-07a3-41ed-9327-8a39d16d667f-config-volume\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.315495 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tcst\" (UniqueName: \"kubernetes.io/projected/d589285c-60a3-4871-9149-7f1f99fc35ee-kube-api-access-8tcst\") pod \"auto-csr-approver-29550000-bp85d\" (UID: \"d589285c-60a3-4871-9149-7f1f99fc35ee\") " pod="openshift-infra/auto-csr-approver-29550000-bp85d" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.315572 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfdl2\" (UniqueName: \"kubernetes.io/projected/0336d864-07a3-41ed-9327-8a39d16d667f-kube-api-access-sfdl2\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 
08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.417035 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0336d864-07a3-41ed-9327-8a39d16d667f-secret-volume\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.417100 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0336d864-07a3-41ed-9327-8a39d16d667f-config-volume\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.417148 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tcst\" (UniqueName: \"kubernetes.io/projected/d589285c-60a3-4871-9149-7f1f99fc35ee-kube-api-access-8tcst\") pod \"auto-csr-approver-29550000-bp85d\" (UID: \"d589285c-60a3-4871-9149-7f1f99fc35ee\") " pod="openshift-infra/auto-csr-approver-29550000-bp85d" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.417231 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfdl2\" (UniqueName: \"kubernetes.io/projected/0336d864-07a3-41ed-9327-8a39d16d667f-kube-api-access-sfdl2\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.419055 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0336d864-07a3-41ed-9327-8a39d16d667f-config-volume\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.426554 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0336d864-07a3-41ed-9327-8a39d16d667f-secret-volume\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.446299 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tcst\" (UniqueName: \"kubernetes.io/projected/d589285c-60a3-4871-9149-7f1f99fc35ee-kube-api-access-8tcst\") pod \"auto-csr-approver-29550000-bp85d\" (UID: \"d589285c-60a3-4871-9149-7f1f99fc35ee\") " pod="openshift-infra/auto-csr-approver-29550000-bp85d" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.447645 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfdl2\" (UniqueName: \"kubernetes.io/projected/0336d864-07a3-41ed-9327-8a39d16d667f-kube-api-access-sfdl2\") pod \"collect-profiles-29550000-m9vll\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.502038 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550000-bp85d" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.519032 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.871373 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll"] Mar 08 20:00:00 crc kubenswrapper[4885]: I0308 20:00:00.956756 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550000-bp85d"] Mar 08 20:00:00 crc kubenswrapper[4885]: W0308 20:00:00.973516 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd589285c_60a3_4871_9149_7f1f99fc35ee.slice/crio-b7651c7919dd5db0a014713dbc9f335283d2d9fcd69c44b3d25aad3c32d08f85 WatchSource:0}: Error finding container b7651c7919dd5db0a014713dbc9f335283d2d9fcd69c44b3d25aad3c32d08f85: Status 404 returned error can't find the container with id b7651c7919dd5db0a014713dbc9f335283d2d9fcd69c44b3d25aad3c32d08f85 Mar 08 20:00:01 crc kubenswrapper[4885]: I0308 20:00:01.180806 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" event={"ID":"0336d864-07a3-41ed-9327-8a39d16d667f","Type":"ContainerStarted","Data":"04ce20a75f575125843cdf885d5d1cfa9b696f27d4253665b2071884d88ab3e4"} Mar 08 20:00:01 crc kubenswrapper[4885]: I0308 20:00:01.181139 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" event={"ID":"0336d864-07a3-41ed-9327-8a39d16d667f","Type":"ContainerStarted","Data":"ee55d194889b7e5bace9e52f4bf9b0fb7a88fa921d15062bc85ecfda7d18c98e"} Mar 08 20:00:01 crc kubenswrapper[4885]: I0308 20:00:01.183663 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550000-bp85d" event={"ID":"d589285c-60a3-4871-9149-7f1f99fc35ee","Type":"ContainerStarted","Data":"b7651c7919dd5db0a014713dbc9f335283d2d9fcd69c44b3d25aad3c32d08f85"} Mar 08 20:00:01 crc kubenswrapper[4885]: I0308 20:00:01.203790 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" podStartSLOduration=1.203769232 podStartE2EDuration="1.203769232s" podCreationTimestamp="2026-03-08 20:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:00:01.20215276 +0000 UTC m=+1702.598206793" watchObservedRunningTime="2026-03-08 20:00:01.203769232 +0000 UTC m=+1702.599823265" Mar 08 20:00:02 crc kubenswrapper[4885]: I0308 20:00:02.195755 4885 generic.go:334] "Generic (PLEG): container finished" podID="0336d864-07a3-41ed-9327-8a39d16d667f" containerID="04ce20a75f575125843cdf885d5d1cfa9b696f27d4253665b2071884d88ab3e4" exitCode=0 Mar 08 20:00:02 crc kubenswrapper[4885]: I0308 20:00:02.195801 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" event={"ID":"0336d864-07a3-41ed-9327-8a39d16d667f","Type":"ContainerDied","Data":"04ce20a75f575125843cdf885d5d1cfa9b696f27d4253665b2071884d88ab3e4"} Mar 08 20:00:02 crc kubenswrapper[4885]: I0308 20:00:02.818663 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:00:02 crc kubenswrapper[4885]: I0308 20:00:02.818788 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.550429 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.691013 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0336d864-07a3-41ed-9327-8a39d16d667f-config-volume\") pod \"0336d864-07a3-41ed-9327-8a39d16d667f\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.691081 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfdl2\" (UniqueName: \"kubernetes.io/projected/0336d864-07a3-41ed-9327-8a39d16d667f-kube-api-access-sfdl2\") pod \"0336d864-07a3-41ed-9327-8a39d16d667f\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.691121 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0336d864-07a3-41ed-9327-8a39d16d667f-secret-volume\") pod \"0336d864-07a3-41ed-9327-8a39d16d667f\" (UID: \"0336d864-07a3-41ed-9327-8a39d16d667f\") " Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.691668 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0336d864-07a3-41ed-9327-8a39d16d667f-config-volume" (OuterVolumeSpecName: "config-volume") pod "0336d864-07a3-41ed-9327-8a39d16d667f" (UID: "0336d864-07a3-41ed-9327-8a39d16d667f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.696905 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0336d864-07a3-41ed-9327-8a39d16d667f-kube-api-access-sfdl2" (OuterVolumeSpecName: "kube-api-access-sfdl2") pod "0336d864-07a3-41ed-9327-8a39d16d667f" (UID: "0336d864-07a3-41ed-9327-8a39d16d667f"). InnerVolumeSpecName "kube-api-access-sfdl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.699067 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0336d864-07a3-41ed-9327-8a39d16d667f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0336d864-07a3-41ed-9327-8a39d16d667f" (UID: "0336d864-07a3-41ed-9327-8a39d16d667f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.792902 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0336d864-07a3-41ed-9327-8a39d16d667f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.792983 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfdl2\" (UniqueName: \"kubernetes.io/projected/0336d864-07a3-41ed-9327-8a39d16d667f-kube-api-access-sfdl2\") on node \"crc\" DevicePath \"\"" Mar 08 20:00:03 crc kubenswrapper[4885]: I0308 20:00:03.793007 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0336d864-07a3-41ed-9327-8a39d16d667f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 20:00:04 crc kubenswrapper[4885]: I0308 20:00:04.217126 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" event={"ID":"0336d864-07a3-41ed-9327-8a39d16d667f","Type":"ContainerDied","Data":"ee55d194889b7e5bace9e52f4bf9b0fb7a88fa921d15062bc85ecfda7d18c98e"} Mar 08 20:00:04 crc kubenswrapper[4885]: I0308 20:00:04.217483 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee55d194889b7e5bace9e52f4bf9b0fb7a88fa921d15062bc85ecfda7d18c98e" Mar 08 20:00:04 crc kubenswrapper[4885]: I0308 20:00:04.217232 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll" Mar 08 20:00:24 crc kubenswrapper[4885]: I0308 20:00:24.420000 4885 generic.go:334] "Generic (PLEG): container finished" podID="d589285c-60a3-4871-9149-7f1f99fc35ee" containerID="a1918622a7f691d5b0978579d743cac5d40266346f9de21b0dbb76cf8ca3f823" exitCode=0 Mar 08 20:00:24 crc kubenswrapper[4885]: I0308 20:00:24.420114 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550000-bp85d" event={"ID":"d589285c-60a3-4871-9149-7f1f99fc35ee","Type":"ContainerDied","Data":"a1918622a7f691d5b0978579d743cac5d40266346f9de21b0dbb76cf8ca3f823"} Mar 08 20:00:25 crc kubenswrapper[4885]: I0308 20:00:25.802583 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550000-bp85d" Mar 08 20:00:25 crc kubenswrapper[4885]: I0308 20:00:25.987669 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tcst\" (UniqueName: \"kubernetes.io/projected/d589285c-60a3-4871-9149-7f1f99fc35ee-kube-api-access-8tcst\") pod \"d589285c-60a3-4871-9149-7f1f99fc35ee\" (UID: \"d589285c-60a3-4871-9149-7f1f99fc35ee\") " Mar 08 20:00:26 crc kubenswrapper[4885]: I0308 20:00:26.001140 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d589285c-60a3-4871-9149-7f1f99fc35ee-kube-api-access-8tcst" (OuterVolumeSpecName: "kube-api-access-8tcst") pod "d589285c-60a3-4871-9149-7f1f99fc35ee" (UID: "d589285c-60a3-4871-9149-7f1f99fc35ee"). InnerVolumeSpecName "kube-api-access-8tcst". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:00:26 crc kubenswrapper[4885]: I0308 20:00:26.091004 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tcst\" (UniqueName: \"kubernetes.io/projected/d589285c-60a3-4871-9149-7f1f99fc35ee-kube-api-access-8tcst\") on node \"crc\" DevicePath \"\"" Mar 08 20:00:26 crc kubenswrapper[4885]: I0308 20:00:26.443562 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550000-bp85d" event={"ID":"d589285c-60a3-4871-9149-7f1f99fc35ee","Type":"ContainerDied","Data":"b7651c7919dd5db0a014713dbc9f335283d2d9fcd69c44b3d25aad3c32d08f85"} Mar 08 20:00:26 crc kubenswrapper[4885]: I0308 20:00:26.443622 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7651c7919dd5db0a014713dbc9f335283d2d9fcd69c44b3d25aad3c32d08f85" Mar 08 20:00:26 crc kubenswrapper[4885]: I0308 20:00:26.443697 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550000-bp85d" Mar 08 20:00:26 crc kubenswrapper[4885]: I0308 20:00:26.873900 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549994-6zc6p"] Mar 08 20:00:26 crc kubenswrapper[4885]: I0308 20:00:26.880808 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549994-6zc6p"] Mar 08 20:00:27 crc kubenswrapper[4885]: I0308 20:00:27.382445 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e836afb-bb6f-4e67-9df6-5bef0273a523" path="/var/lib/kubelet/pods/5e836afb-bb6f-4e67-9df6-5bef0273a523/volumes" Mar 08 20:00:32 crc kubenswrapper[4885]: I0308 20:00:32.838313 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:00:32 crc kubenswrapper[4885]: I0308 20:00:32.839024 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:00:59 crc kubenswrapper[4885]: I0308 20:00:59.387294 4885 scope.go:117] "RemoveContainer" containerID="0d00454c184e09bd4a156eebaa35bb3bcacf94bedd622a0c71e0954aef720385" Mar 08 20:00:59 crc kubenswrapper[4885]: I0308 20:00:59.465631 4885 scope.go:117] "RemoveContainer" containerID="72f8ee44be245d4136285cfdfda421e5c74196d06b96d20eec24f989618614f0" Mar 08 20:00:59 crc kubenswrapper[4885]: I0308 20:00:59.530116 4885 scope.go:117] "RemoveContainer" containerID="6794573adf705b25644869fa29cf7b121bfba4201d8cc4c2b8d4234d9e883a1d" Mar 08 20:00:59 crc kubenswrapper[4885]: I0308 20:00:59.588877 4885 scope.go:117] "RemoveContainer" containerID="5bc08b9d58402236103943567dfa278b8697894bc6f9fe1ef5bb281393c8f6d5" Mar 08 20:00:59 crc kubenswrapper[4885]: I0308 20:00:59.649436 4885 scope.go:117] "RemoveContainer" containerID="9348c59006c229d5addb1edf8ed7baf6b6d89cc79e7937bbe98f9278bc9d36c3" Mar 08 20:01:02 crc kubenswrapper[4885]: I0308 20:01:02.818509 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:01:02 crc kubenswrapper[4885]: I0308 20:01:02.818896 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:01:02 crc kubenswrapper[4885]: I0308 20:01:02.818984 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:01:02 crc kubenswrapper[4885]: I0308 20:01:02.819871 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:01:02 crc kubenswrapper[4885]: I0308 20:01:02.819999 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" gracePeriod=600 Mar 08 20:01:02 crc kubenswrapper[4885]: E0308 20:01:02.948358 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:01:03 crc kubenswrapper[4885]: I0308 20:01:03.821661 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" exitCode=0 Mar 08 20:01:03 crc kubenswrapper[4885]: I0308 20:01:03.821741 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400"} Mar 08 20:01:03 crc kubenswrapper[4885]: I0308 20:01:03.822156 4885 scope.go:117] "RemoveContainer" containerID="b0670bdd5ae4a6193bbd77e528520a487b041af774d5305f09762277548bcda8" Mar 08 20:01:03 crc kubenswrapper[4885]: I0308 20:01:03.822702 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:01:03 crc kubenswrapper[4885]: E0308 20:01:03.822991 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:01:16 crc kubenswrapper[4885]: I0308 20:01:16.368414 
4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:01:16 crc kubenswrapper[4885]: E0308 20:01:16.369090 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:01:30 crc kubenswrapper[4885]: I0308 20:01:30.368853 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:01:30 crc kubenswrapper[4885]: E0308 20:01:30.369845 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:01:43 crc kubenswrapper[4885]: I0308 20:01:43.368581 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:01:43 crc kubenswrapper[4885]: E0308 20:01:43.369686 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:01:56 crc kubenswrapper[4885]: I0308 20:01:56.369278 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:01:56 crc kubenswrapper[4885]: E0308 20:01:56.370220 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:01:59 crc kubenswrapper[4885]: I0308 20:01:59.784421 4885 scope.go:117] "RemoveContainer" containerID="52341d43266cc07b81a420fcef2575f373be02c72919fe4d7ea1b1bbdd8174c2" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.162826 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550002-jw6jr"] Mar 08 20:02:00 crc kubenswrapper[4885]: E0308 20:02:00.163782 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0336d864-07a3-41ed-9327-8a39d16d667f" containerName="collect-profiles" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.163815 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0336d864-07a3-41ed-9327-8a39d16d667f" containerName="collect-profiles" Mar 08 20:02:00 crc kubenswrapper[4885]: E0308 20:02:00.163845 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d589285c-60a3-4871-9149-7f1f99fc35ee" 
containerName="oc" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.163858 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d589285c-60a3-4871-9149-7f1f99fc35ee" containerName="oc" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.164263 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0336d864-07a3-41ed-9327-8a39d16d667f" containerName="collect-profiles" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.164318 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d589285c-60a3-4871-9149-7f1f99fc35ee" containerName="oc" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.165281 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550002-jw6jr" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.168348 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.170641 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.174538 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.177200 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550002-jw6jr"] Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.199627 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc7pl\" (UniqueName: \"kubernetes.io/projected/cdb49670-90cc-4dfa-9e28-1ae44ed1104b-kube-api-access-zc7pl\") pod \"auto-csr-approver-29550002-jw6jr\" (UID: \"cdb49670-90cc-4dfa-9e28-1ae44ed1104b\") " pod="openshift-infra/auto-csr-approver-29550002-jw6jr" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.301121 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc7pl\" (UniqueName: \"kubernetes.io/projected/cdb49670-90cc-4dfa-9e28-1ae44ed1104b-kube-api-access-zc7pl\") pod \"auto-csr-approver-29550002-jw6jr\" (UID: \"cdb49670-90cc-4dfa-9e28-1ae44ed1104b\") " pod="openshift-infra/auto-csr-approver-29550002-jw6jr" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.323964 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc7pl\" (UniqueName: \"kubernetes.io/projected/cdb49670-90cc-4dfa-9e28-1ae44ed1104b-kube-api-access-zc7pl\") pod \"auto-csr-approver-29550002-jw6jr\" (UID: \"cdb49670-90cc-4dfa-9e28-1ae44ed1104b\") " pod="openshift-infra/auto-csr-approver-29550002-jw6jr" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.498617 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550002-jw6jr" Mar 08 20:02:00 crc kubenswrapper[4885]: I0308 20:02:00.794457 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550002-jw6jr"] Mar 08 20:02:00 crc kubenswrapper[4885]: W0308 20:02:00.804779 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdb49670_90cc_4dfa_9e28_1ae44ed1104b.slice/crio-a26dfd0763605ab3866fd64f399ffc941af7eaf483ceb017c0ed8b550327f4cd WatchSource:0}: Error finding container a26dfd0763605ab3866fd64f399ffc941af7eaf483ceb017c0ed8b550327f4cd: Status 404 returned error can't find the container with id a26dfd0763605ab3866fd64f399ffc941af7eaf483ceb017c0ed8b550327f4cd Mar 08 20:02:01 crc kubenswrapper[4885]: I0308 20:02:01.339097 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550002-jw6jr" event={"ID":"cdb49670-90cc-4dfa-9e28-1ae44ed1104b","Type":"ContainerStarted","Data":"a26dfd0763605ab3866fd64f399ffc941af7eaf483ceb017c0ed8b550327f4cd"} Mar 08 20:02:02 crc kubenswrapper[4885]: I0308 20:02:02.350377 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550002-jw6jr" event={"ID":"cdb49670-90cc-4dfa-9e28-1ae44ed1104b","Type":"ContainerStarted","Data":"85234bd429cc704e5f096400e4f1b62d94a0128f9b3c1c36899a84d4f6ac1ba9"} Mar 08 20:02:03 crc kubenswrapper[4885]: I0308 20:02:03.383753 4885 generic.go:334] "Generic (PLEG): container finished" podID="cdb49670-90cc-4dfa-9e28-1ae44ed1104b" containerID="85234bd429cc704e5f096400e4f1b62d94a0128f9b3c1c36899a84d4f6ac1ba9" exitCode=0 Mar 08 20:02:03 crc kubenswrapper[4885]: I0308 20:02:03.392088 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550002-jw6jr" event={"ID":"cdb49670-90cc-4dfa-9e28-1ae44ed1104b","Type":"ContainerDied","Data":"85234bd429cc704e5f096400e4f1b62d94a0128f9b3c1c36899a84d4f6ac1ba9"} Mar 08 20:02:03 crc kubenswrapper[4885]: I0308 20:02:03.731207 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550002-jw6jr" Mar 08 20:02:03 crc kubenswrapper[4885]: I0308 20:02:03.756282 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc7pl\" (UniqueName: \"kubernetes.io/projected/cdb49670-90cc-4dfa-9e28-1ae44ed1104b-kube-api-access-zc7pl\") pod \"cdb49670-90cc-4dfa-9e28-1ae44ed1104b\" (UID: \"cdb49670-90cc-4dfa-9e28-1ae44ed1104b\") " Mar 08 20:02:03 crc kubenswrapper[4885]: I0308 20:02:03.777273 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb49670-90cc-4dfa-9e28-1ae44ed1104b-kube-api-access-zc7pl" (OuterVolumeSpecName: "kube-api-access-zc7pl") pod "cdb49670-90cc-4dfa-9e28-1ae44ed1104b" (UID: "cdb49670-90cc-4dfa-9e28-1ae44ed1104b"). InnerVolumeSpecName "kube-api-access-zc7pl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:02:03 crc kubenswrapper[4885]: I0308 20:02:03.857700 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc7pl\" (UniqueName: \"kubernetes.io/projected/cdb49670-90cc-4dfa-9e28-1ae44ed1104b-kube-api-access-zc7pl\") on node \"crc\" DevicePath \"\"" Mar 08 20:02:04 crc kubenswrapper[4885]: I0308 20:02:04.397944 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550002-jw6jr" event={"ID":"cdb49670-90cc-4dfa-9e28-1ae44ed1104b","Type":"ContainerDied","Data":"a26dfd0763605ab3866fd64f399ffc941af7eaf483ceb017c0ed8b550327f4cd"} Mar 08 20:02:04 crc kubenswrapper[4885]: I0308 20:02:04.398007 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a26dfd0763605ab3866fd64f399ffc941af7eaf483ceb017c0ed8b550327f4cd" Mar 08 20:02:04 crc kubenswrapper[4885]: I0308 20:02:04.398028 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550002-jw6jr" Mar 08 20:02:04 crc kubenswrapper[4885]: I0308 20:02:04.809753 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549996-rr28z"] Mar 08 20:02:04 crc kubenswrapper[4885]: I0308 20:02:04.820333 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549996-rr28z"] Mar 08 20:02:05 crc kubenswrapper[4885]: I0308 20:02:05.383254 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d9eec97-1b10-4d6a-9787-e137d3c37dec" path="/var/lib/kubelet/pods/5d9eec97-1b10-4d6a-9787-e137d3c37dec/volumes" Mar 08 20:02:10 crc kubenswrapper[4885]: I0308 20:02:10.368504 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:02:10 crc kubenswrapper[4885]: E0308 20:02:10.369482 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:02:22 crc kubenswrapper[4885]: I0308 20:02:22.368374 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:02:22 crc kubenswrapper[4885]: E0308 20:02:22.369742 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:02:33 crc kubenswrapper[4885]: I0308 20:02:33.368813 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:02:33 crc kubenswrapper[4885]: E0308 20:02:33.369778 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:02:47 crc kubenswrapper[4885]: I0308 20:02:47.368550 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:02:47 crc kubenswrapper[4885]: E0308 20:02:47.369505 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:02:59 crc kubenswrapper[4885]: I0308 20:02:59.879525 4885 scope.go:117] "RemoveContainer" containerID="5576d8a075a0f44218f0cea569a1265805ed7bccbb93726d2adc83621dd67e49" Mar 08 20:03:01 crc kubenswrapper[4885]: I0308 20:03:01.369043 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:03:01 crc kubenswrapper[4885]: E0308 20:03:01.369698 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:03:13 crc kubenswrapper[4885]: I0308 20:03:13.368665 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:03:13 crc kubenswrapper[4885]: E0308 20:03:13.369791 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:03:21 crc kubenswrapper[4885]: I0308 20:03:21.310161 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-hlpjf" podUID="157555d5-ca64-49f8-8849-cd763c83feda" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.82:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 20:03:28 crc kubenswrapper[4885]: I0308 20:03:28.369189 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:03:28 crc kubenswrapper[4885]: E0308 20:03:28.370145 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:03:39 crc kubenswrapper[4885]: I0308 
20:03:39.375672 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:03:39 crc kubenswrapper[4885]: E0308 20:03:39.376721 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.062146 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4ptr9"] Mar 08 20:03:47 crc kubenswrapper[4885]: E0308 20:03:47.063176 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb49670-90cc-4dfa-9e28-1ae44ed1104b" containerName="oc" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.063198 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb49670-90cc-4dfa-9e28-1ae44ed1104b" containerName="oc" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.063471 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb49670-90cc-4dfa-9e28-1ae44ed1104b" containerName="oc" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.065220 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.078250 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ptr9"] Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.123971 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prh5k\" (UniqueName: \"kubernetes.io/projected/f7a5e492-4ddd-4229-b1fb-019fa71d2951-kube-api-access-prh5k\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.124038 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-catalog-content\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.124112 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-utilities\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.225244 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-catalog-content\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.225365 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-utilities\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.225417 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prh5k\" (UniqueName: \"kubernetes.io/projected/f7a5e492-4ddd-4229-b1fb-019fa71d2951-kube-api-access-prh5k\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.226119 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-utilities\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.226126 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-catalog-content\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.250723 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cqvqv"] Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.256990 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prh5k\" (UniqueName: \"kubernetes.io/projected/f7a5e492-4ddd-4229-b1fb-019fa71d2951-kube-api-access-prh5k\") pod \"redhat-marketplace-4ptr9\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.259208 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.268002 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cqvqv"] Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.327208 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrbzc\" (UniqueName: \"kubernetes.io/projected/2ffd5483-5bf1-4ca1-945d-5de49426ee21-kube-api-access-zrbzc\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.327249 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-catalog-content\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.327294 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-utilities\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.397889 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.428708 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrbzc\" (UniqueName: \"kubernetes.io/projected/2ffd5483-5bf1-4ca1-945d-5de49426ee21-kube-api-access-zrbzc\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.429073 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-catalog-content\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.429099 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-utilities\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.429472 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-catalog-content\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.429495 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-utilities\") pod \"certified-operators-cqvqv\" (UID: 
\"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.445518 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrbzc\" (UniqueName: \"kubernetes.io/projected/2ffd5483-5bf1-4ca1-945d-5de49426ee21-kube-api-access-zrbzc\") pod \"certified-operators-cqvqv\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.613609 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.644984 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ptr9"] Mar 08 20:03:47 crc kubenswrapper[4885]: I0308 20:03:47.932045 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cqvqv"] Mar 08 20:03:47 crc kubenswrapper[4885]: E0308 20:03:47.997983 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7a5e492_4ddd_4229_b1fb_019fa71d2951.slice/crio-2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7a5e492_4ddd_4229_b1fb_019fa71d2951.slice/crio-conmon-2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2.scope\": RecentStats: unable to find data in memory cache]" Mar 08 20:03:48 crc kubenswrapper[4885]: I0308 20:03:48.668661 4885 generic.go:334] "Generic (PLEG): container finished" podID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerID="2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2" exitCode=0 Mar 08 20:03:48 crc kubenswrapper[4885]: I0308 20:03:48.668762 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ptr9" event={"ID":"f7a5e492-4ddd-4229-b1fb-019fa71d2951","Type":"ContainerDied","Data":"2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2"} Mar 08 20:03:48 crc kubenswrapper[4885]: I0308 20:03:48.668981 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ptr9" event={"ID":"f7a5e492-4ddd-4229-b1fb-019fa71d2951","Type":"ContainerStarted","Data":"7562d15c2fd135e779c5aa72239de4c7f9c7869128d1cb264d9a14a668809cf4"} Mar 08 20:03:48 crc kubenswrapper[4885]: I0308 20:03:48.671192 4885 generic.go:334] "Generic (PLEG): container finished" podID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerID="22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326" exitCode=0 Mar 08 20:03:48 crc kubenswrapper[4885]: I0308 20:03:48.671235 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqvqv" event={"ID":"2ffd5483-5bf1-4ca1-945d-5de49426ee21","Type":"ContainerDied","Data":"22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326"} Mar 08 20:03:48 crc kubenswrapper[4885]: I0308 20:03:48.671261 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqvqv" event={"ID":"2ffd5483-5bf1-4ca1-945d-5de49426ee21","Type":"ContainerStarted","Data":"d49c4dba2cb2e10975f26f51e87613eadff9b443e9c9c89ec4745a20041c01b4"} Mar 08 20:03:48 crc kubenswrapper[4885]: I0308 
20:03:48.671704 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.468600 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dn4kh"] Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.469907 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.475881 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dn4kh"] Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.560222 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-utilities\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.560271 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7n4n\" (UniqueName: \"kubernetes.io/projected/14843e03-07ac-482d-b2f6-6bbfb7567b91-kube-api-access-j7n4n\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.560337 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-catalog-content\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.641990 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-84srk"] Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.643270 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.661799 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-catalog-content\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.661863 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-utilities\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.661889 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7n4n\" (UniqueName: \"kubernetes.io/projected/14843e03-07ac-482d-b2f6-6bbfb7567b91-kube-api-access-j7n4n\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.662389 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-catalog-content\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.662458 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-utilities\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.665913 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84srk"] Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.690940 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ptr9" event={"ID":"f7a5e492-4ddd-4229-b1fb-019fa71d2951","Type":"ContainerStarted","Data":"99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c"} Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.696071 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7n4n\" (UniqueName: \"kubernetes.io/projected/14843e03-07ac-482d-b2f6-6bbfb7567b91-kube-api-access-j7n4n\") pod \"redhat-operators-dn4kh\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.701535 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqvqv" event={"ID":"2ffd5483-5bf1-4ca1-945d-5de49426ee21","Type":"ContainerStarted","Data":"9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941"} Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.763465 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbpfb\" (UniqueName: \"kubernetes.io/projected/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-kube-api-access-mbpfb\") pod \"community-operators-84srk\" (UID: 
\"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.764029 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-catalog-content\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.764272 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-utilities\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.790600 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.865287 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbpfb\" (UniqueName: \"kubernetes.io/projected/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-kube-api-access-mbpfb\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.865334 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-catalog-content\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.865378 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-utilities\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.865803 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-utilities\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.866269 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-catalog-content\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.897956 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbpfb\" (UniqueName: \"kubernetes.io/projected/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-kube-api-access-mbpfb\") pod \"community-operators-84srk\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:49 crc kubenswrapper[4885]: I0308 20:03:49.957761 4885 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.230706 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dn4kh"] Mar 08 20:03:50 crc kubenswrapper[4885]: W0308 20:03:50.478379 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17fe0b53_ae94_4b6b_8a6d_1d1f68a52bf7.slice/crio-6a5c70ea2ed0ef38c370fb83471173899ee581eabbc19e5b33bd1521ba7a60d0 WatchSource:0}: Error finding container 6a5c70ea2ed0ef38c370fb83471173899ee581eabbc19e5b33bd1521ba7a60d0: Status 404 returned error can't find the container with id 6a5c70ea2ed0ef38c370fb83471173899ee581eabbc19e5b33bd1521ba7a60d0 Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.480077 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84srk"] Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.710322 4885 generic.go:334] "Generic (PLEG): container finished" podID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerID="99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c" exitCode=0 Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.710518 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ptr9" event={"ID":"f7a5e492-4ddd-4229-b1fb-019fa71d2951","Type":"ContainerDied","Data":"99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c"} Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.713438 4885 generic.go:334] "Generic (PLEG): container finished" podID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerID="df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853" exitCode=0 Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.713505 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn4kh" event={"ID":"14843e03-07ac-482d-b2f6-6bbfb7567b91","Type":"ContainerDied","Data":"df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853"} Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.713534 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn4kh" event={"ID":"14843e03-07ac-482d-b2f6-6bbfb7567b91","Type":"ContainerStarted","Data":"1d0f0c8555c4b252c5081e0c34f2d3afd49d81f6583bcf2c95de0ec400208b12"} Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.717692 4885 generic.go:334] "Generic (PLEG): container finished" podID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerID="9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941" exitCode=0 Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.717756 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqvqv" event={"ID":"2ffd5483-5bf1-4ca1-945d-5de49426ee21","Type":"ContainerDied","Data":"9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941"} Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.722892 4885 generic.go:334] "Generic (PLEG): container finished" podID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerID="e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6" exitCode=0 Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.722938 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84srk" 
event={"ID":"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7","Type":"ContainerDied","Data":"e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6"} Mar 08 20:03:50 crc kubenswrapper[4885]: I0308 20:03:50.722958 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84srk" event={"ID":"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7","Type":"ContainerStarted","Data":"6a5c70ea2ed0ef38c370fb83471173899ee581eabbc19e5b33bd1521ba7a60d0"} Mar 08 20:03:51 crc kubenswrapper[4885]: I0308 20:03:51.368660 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:03:51 crc kubenswrapper[4885]: E0308 20:03:51.369569 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:03:51 crc kubenswrapper[4885]: I0308 20:03:51.745273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqvqv" event={"ID":"2ffd5483-5bf1-4ca1-945d-5de49426ee21","Type":"ContainerStarted","Data":"e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7"} Mar 08 20:03:51 crc kubenswrapper[4885]: I0308 20:03:51.747910 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84srk" event={"ID":"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7","Type":"ContainerStarted","Data":"68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60"} Mar 08 20:03:51 crc kubenswrapper[4885]: I0308 20:03:51.751973 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ptr9" event={"ID":"f7a5e492-4ddd-4229-b1fb-019fa71d2951","Type":"ContainerStarted","Data":"2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a"} Mar 08 20:03:51 crc kubenswrapper[4885]: I0308 20:03:51.754283 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn4kh" event={"ID":"14843e03-07ac-482d-b2f6-6bbfb7567b91","Type":"ContainerStarted","Data":"6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164"} Mar 08 20:03:51 crc kubenswrapper[4885]: I0308 20:03:51.777055 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cqvqv" podStartSLOduration=2.116615552 podStartE2EDuration="4.777029414s" podCreationTimestamp="2026-03-08 20:03:47 +0000 UTC" firstStartedPulling="2026-03-08 20:03:48.673347997 +0000 UTC m=+1930.069402060" lastFinishedPulling="2026-03-08 20:03:51.333761899 +0000 UTC m=+1932.729815922" observedRunningTime="2026-03-08 20:03:51.76522529 +0000 UTC m=+1933.161279323" watchObservedRunningTime="2026-03-08 20:03:51.777029414 +0000 UTC m=+1933.173083437" Mar 08 20:03:51 crc kubenswrapper[4885]: I0308 20:03:51.835963 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4ptr9" podStartSLOduration=2.322101759 podStartE2EDuration="4.835922622s" podCreationTimestamp="2026-03-08 20:03:47 +0000 UTC" firstStartedPulling="2026-03-08 20:03:48.67125566 +0000 UTC m=+1930.067309723" lastFinishedPulling="2026-03-08 20:03:51.185076563 +0000 UTC m=+1932.581130586" 
observedRunningTime="2026-03-08 20:03:51.820258434 +0000 UTC m=+1933.216312457" watchObservedRunningTime="2026-03-08 20:03:51.835922622 +0000 UTC m=+1933.231976655" Mar 08 20:03:52 crc kubenswrapper[4885]: I0308 20:03:52.768883 4885 generic.go:334] "Generic (PLEG): container finished" podID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerID="6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164" exitCode=0 Mar 08 20:03:52 crc kubenswrapper[4885]: I0308 20:03:52.768952 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn4kh" event={"ID":"14843e03-07ac-482d-b2f6-6bbfb7567b91","Type":"ContainerDied","Data":"6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164"} Mar 08 20:03:52 crc kubenswrapper[4885]: I0308 20:03:52.770562 4885 generic.go:334] "Generic (PLEG): container finished" podID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerID="68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60" exitCode=0 Mar 08 20:03:52 crc kubenswrapper[4885]: I0308 20:03:52.770825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84srk" event={"ID":"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7","Type":"ContainerDied","Data":"68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60"} Mar 08 20:03:53 crc kubenswrapper[4885]: I0308 20:03:53.782481 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84srk" event={"ID":"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7","Type":"ContainerStarted","Data":"76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780"} Mar 08 20:03:53 crc kubenswrapper[4885]: I0308 20:03:53.785409 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn4kh" event={"ID":"14843e03-07ac-482d-b2f6-6bbfb7567b91","Type":"ContainerStarted","Data":"c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b"} Mar 08 20:03:53 crc kubenswrapper[4885]: I0308 20:03:53.805818 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-84srk" podStartSLOduration=2.35787055 podStartE2EDuration="4.805797249s" podCreationTimestamp="2026-03-08 20:03:49 +0000 UTC" firstStartedPulling="2026-03-08 20:03:50.725124183 +0000 UTC m=+1932.121178216" lastFinishedPulling="2026-03-08 20:03:53.173050852 +0000 UTC m=+1934.569104915" observedRunningTime="2026-03-08 20:03:53.803245151 +0000 UTC m=+1935.199299194" watchObservedRunningTime="2026-03-08 20:03:53.805797249 +0000 UTC m=+1935.201851292" Mar 08 20:03:53 crc kubenswrapper[4885]: I0308 20:03:53.837847 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dn4kh" podStartSLOduration=2.343723174 podStartE2EDuration="4.837829641s" podCreationTimestamp="2026-03-08 20:03:49 +0000 UTC" firstStartedPulling="2026-03-08 20:03:50.714695186 +0000 UTC m=+1932.110749219" lastFinishedPulling="2026-03-08 20:03:53.208801663 +0000 UTC m=+1934.604855686" observedRunningTime="2026-03-08 20:03:53.830498636 +0000 UTC m=+1935.226552689" watchObservedRunningTime="2026-03-08 20:03:53.837829641 +0000 UTC m=+1935.233883674" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.398548 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.398946 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.477436 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.614599 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.614696 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.688784 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.879190 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:03:57 crc kubenswrapper[4885]: I0308 20:03:57.891592 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:03:59 crc kubenswrapper[4885]: I0308 20:03:59.791474 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:59 crc kubenswrapper[4885]: I0308 20:03:59.791827 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:03:59 crc kubenswrapper[4885]: I0308 20:03:59.855719 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ptr9"] Mar 08 20:03:59 crc kubenswrapper[4885]: I0308 20:03:59.856143 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4ptr9" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="registry-server" containerID="cri-o://2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a" gracePeriod=2 Mar 08 20:03:59 crc kubenswrapper[4885]: I0308 20:03:59.959269 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:03:59 crc kubenswrapper[4885]: I0308 20:03:59.959347 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.043612 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cqvqv"] Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.044119 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cqvqv" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="registry-server" containerID="cri-o://e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7" gracePeriod=2 Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.047122 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.142092 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550004-sf7cv"] Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.143356 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550004-sf7cv" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.145066 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.147260 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.155875 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.158021 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550004-sf7cv"] Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.231183 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8gzs\" (UniqueName: \"kubernetes.io/projected/b24ca97d-74ff-4a8d-9621-40d03e8be6cc-kube-api-access-p8gzs\") pod \"auto-csr-approver-29550004-sf7cv\" (UID: \"b24ca97d-74ff-4a8d-9621-40d03e8be6cc\") " pod="openshift-infra/auto-csr-approver-29550004-sf7cv" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.327617 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.333901 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8gzs\" (UniqueName: \"kubernetes.io/projected/b24ca97d-74ff-4a8d-9621-40d03e8be6cc-kube-api-access-p8gzs\") pod \"auto-csr-approver-29550004-sf7cv\" (UID: \"b24ca97d-74ff-4a8d-9621-40d03e8be6cc\") " pod="openshift-infra/auto-csr-approver-29550004-sf7cv" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.407347 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8gzs\" (UniqueName: \"kubernetes.io/projected/b24ca97d-74ff-4a8d-9621-40d03e8be6cc-kube-api-access-p8gzs\") pod \"auto-csr-approver-29550004-sf7cv\" (UID: \"b24ca97d-74ff-4a8d-9621-40d03e8be6cc\") " pod="openshift-infra/auto-csr-approver-29550004-sf7cv" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.435780 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-catalog-content\") pod \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.435956 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prh5k\" (UniqueName: \"kubernetes.io/projected/f7a5e492-4ddd-4229-b1fb-019fa71d2951-kube-api-access-prh5k\") pod \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.436010 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-utilities\") pod \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\" (UID: \"f7a5e492-4ddd-4229-b1fb-019fa71d2951\") " Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.437325 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-utilities" (OuterVolumeSpecName: 
"utilities") pod "f7a5e492-4ddd-4229-b1fb-019fa71d2951" (UID: "f7a5e492-4ddd-4229-b1fb-019fa71d2951"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.441195 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7a5e492-4ddd-4229-b1fb-019fa71d2951-kube-api-access-prh5k" (OuterVolumeSpecName: "kube-api-access-prh5k") pod "f7a5e492-4ddd-4229-b1fb-019fa71d2951" (UID: "f7a5e492-4ddd-4229-b1fb-019fa71d2951"). InnerVolumeSpecName "kube-api-access-prh5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.484206 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550004-sf7cv" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.537529 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prh5k\" (UniqueName: \"kubernetes.io/projected/f7a5e492-4ddd-4229-b1fb-019fa71d2951-kube-api-access-prh5k\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.537558 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.842706 4885 generic.go:334] "Generic (PLEG): container finished" podID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerID="2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a" exitCode=0 Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.843634 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4ptr9" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.844059 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ptr9" event={"ID":"f7a5e492-4ddd-4229-b1fb-019fa71d2951","Type":"ContainerDied","Data":"2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a"} Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.844088 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4ptr9" event={"ID":"f7a5e492-4ddd-4229-b1fb-019fa71d2951","Type":"ContainerDied","Data":"7562d15c2fd135e779c5aa72239de4c7f9c7869128d1cb264d9a14a668809cf4"} Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.844110 4885 scope.go:117] "RemoveContainer" containerID="2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.859727 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-dn4kh" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="registry-server" probeResult="failure" output=< Mar 08 20:04:00 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 20:04:00 crc kubenswrapper[4885]: > Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.875497 4885 scope.go:117] "RemoveContainer" containerID="99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.889892 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550004-sf7cv"] Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.902375 4885 scope.go:117] "RemoveContainer" 
containerID="2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2" Mar 08 20:04:00 crc kubenswrapper[4885]: W0308 20:04:00.909412 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb24ca97d_74ff_4a8d_9621_40d03e8be6cc.slice/crio-0c30389f607eab2bcfcde7b6e56c017975a1c2fade5ff0464a133ced49f32297 WatchSource:0}: Error finding container 0c30389f607eab2bcfcde7b6e56c017975a1c2fade5ff0464a133ced49f32297: Status 404 returned error can't find the container with id 0c30389f607eab2bcfcde7b6e56c017975a1c2fade5ff0464a133ced49f32297 Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.910889 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.924299 4885 scope.go:117] "RemoveContainer" containerID="2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a" Mar 08 20:04:00 crc kubenswrapper[4885]: E0308 20:04:00.925325 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a\": container with ID starting with 2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a not found: ID does not exist" containerID="2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.925398 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a"} err="failed to get container status \"2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a\": rpc error: code = NotFound desc = could not find container \"2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a\": container with ID starting with 2ef2697f6de90841b861a1f3593d1940405d9dc574a5ab8d31277dd6aa3b937a not found: ID does not exist" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.925446 4885 scope.go:117] "RemoveContainer" containerID="99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c" Mar 08 20:04:00 crc kubenswrapper[4885]: E0308 20:04:00.925907 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c\": container with ID starting with 99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c not found: ID does not exist" containerID="99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.925969 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c"} err="failed to get container status \"99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c\": rpc error: code = NotFound desc = could not find container \"99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c\": container with ID starting with 99d0684021370208692739819cc315669a12256aecdea988770df6de7f609c9c not found: ID does not exist" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.926005 4885 scope.go:117] "RemoveContainer" containerID="2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2" Mar 08 20:04:00 crc kubenswrapper[4885]: E0308 20:04:00.926315 4885 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2\": container with ID starting with 2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2 not found: ID does not exist" containerID="2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2" Mar 08 20:04:00 crc kubenswrapper[4885]: I0308 20:04:00.926344 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2"} err="failed to get container status \"2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2\": rpc error: code = NotFound desc = could not find container \"2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2\": container with ID starting with 2f7f807ce2b3990d2447f7bccb8f2b74b1005975a59321543b6f7fe2bd8122f2 not found: ID does not exist" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.036236 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7a5e492-4ddd-4229-b1fb-019fa71d2951" (UID: "f7a5e492-4ddd-4229-b1fb-019fa71d2951"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.044821 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7a5e492-4ddd-4229-b1fb-019fa71d2951-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.224818 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ptr9"] Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.231434 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4ptr9"] Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.393755 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" path="/var/lib/kubelet/pods/f7a5e492-4ddd-4229-b1fb-019fa71d2951/volumes" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.482440 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.556000 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-catalog-content\") pod \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.556089 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-utilities\") pod \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.556184 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrbzc\" (UniqueName: \"kubernetes.io/projected/2ffd5483-5bf1-4ca1-945d-5de49426ee21-kube-api-access-zrbzc\") pod \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\" (UID: \"2ffd5483-5bf1-4ca1-945d-5de49426ee21\") " Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.557347 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-utilities" (OuterVolumeSpecName: "utilities") pod "2ffd5483-5bf1-4ca1-945d-5de49426ee21" (UID: "2ffd5483-5bf1-4ca1-945d-5de49426ee21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.561302 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ffd5483-5bf1-4ca1-945d-5de49426ee21-kube-api-access-zrbzc" (OuterVolumeSpecName: "kube-api-access-zrbzc") pod "2ffd5483-5bf1-4ca1-945d-5de49426ee21" (UID: "2ffd5483-5bf1-4ca1-945d-5de49426ee21"). InnerVolumeSpecName "kube-api-access-zrbzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.613679 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2ffd5483-5bf1-4ca1-945d-5de49426ee21" (UID: "2ffd5483-5bf1-4ca1-945d-5de49426ee21"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.658225 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrbzc\" (UniqueName: \"kubernetes.io/projected/2ffd5483-5bf1-4ca1-945d-5de49426ee21-kube-api-access-zrbzc\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.658280 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.658290 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ffd5483-5bf1-4ca1-945d-5de49426ee21-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.867964 4885 generic.go:334] "Generic (PLEG): container finished" podID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerID="e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7" exitCode=0 Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.868013 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqvqv" event={"ID":"2ffd5483-5bf1-4ca1-945d-5de49426ee21","Type":"ContainerDied","Data":"e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7"} Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.868076 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cqvqv" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.868100 4885 scope.go:117] "RemoveContainer" containerID="e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.868079 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cqvqv" event={"ID":"2ffd5483-5bf1-4ca1-945d-5de49426ee21","Type":"ContainerDied","Data":"d49c4dba2cb2e10975f26f51e87613eadff9b443e9c9c89ec4745a20041c01b4"} Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.870433 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550004-sf7cv" event={"ID":"b24ca97d-74ff-4a8d-9621-40d03e8be6cc","Type":"ContainerStarted","Data":"0c30389f607eab2bcfcde7b6e56c017975a1c2fade5ff0464a133ced49f32297"} Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.904043 4885 scope.go:117] "RemoveContainer" containerID="9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.919739 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cqvqv"] Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.933383 4885 scope.go:117] "RemoveContainer" containerID="22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.933474 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cqvqv"] Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.959488 4885 scope.go:117] "RemoveContainer" containerID="e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7" Mar 08 20:04:01 crc kubenswrapper[4885]: E0308 20:04:01.959898 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7\": container with ID starting with e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7 not found: ID does not exist" containerID="e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.959954 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7"} err="failed to get container status \"e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7\": rpc error: code = NotFound desc = could not find container \"e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7\": container with ID starting with e704a0183088f3efc9ffdffb6f2d9bc45d15b0a7b38f864d4c713f8137c6d0f7 not found: ID does not exist" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.959979 4885 scope.go:117] "RemoveContainer" containerID="9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941" Mar 08 20:04:01 crc kubenswrapper[4885]: E0308 20:04:01.960274 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941\": container with ID starting with 9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941 not found: ID does not exist" containerID="9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.960304 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941"} err="failed to get container status \"9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941\": rpc error: code = NotFound desc = could not find container \"9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941\": container with ID starting with 9ea95255bc73888b72da4b3d17286f1897049dacc8f5af4e5a024508e191f941 not found: ID does not exist" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.960342 4885 scope.go:117] "RemoveContainer" containerID="22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326" Mar 08 20:04:01 crc kubenswrapper[4885]: E0308 20:04:01.960597 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326\": container with ID starting with 22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326 not found: ID does not exist" containerID="22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326" Mar 08 20:04:01 crc kubenswrapper[4885]: I0308 20:04:01.960624 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326"} err="failed to get container status \"22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326\": rpc error: code = NotFound desc = could not find container \"22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326\": container with ID starting with 22bea492f45487496e37f64ec1874ce7bfb14d8fbfc84aae7df7b5fb0129e326 not found: ID does not exist" Mar 08 20:04:02 crc kubenswrapper[4885]: I0308 20:04:02.886455 4885 generic.go:334] "Generic (PLEG): container finished" podID="b24ca97d-74ff-4a8d-9621-40d03e8be6cc" 
containerID="59156303cc1056f8c42d0e37cf490933d64202838fc19ab28796fa13df721c66" exitCode=0 Mar 08 20:04:02 crc kubenswrapper[4885]: I0308 20:04:02.886512 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550004-sf7cv" event={"ID":"b24ca97d-74ff-4a8d-9621-40d03e8be6cc","Type":"ContainerDied","Data":"59156303cc1056f8c42d0e37cf490933d64202838fc19ab28796fa13df721c66"} Mar 08 20:04:03 crc kubenswrapper[4885]: I0308 20:04:03.399502 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" path="/var/lib/kubelet/pods/2ffd5483-5bf1-4ca1-945d-5de49426ee21/volumes" Mar 08 20:04:04 crc kubenswrapper[4885]: I0308 20:04:04.299157 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550004-sf7cv" Mar 08 20:04:04 crc kubenswrapper[4885]: I0308 20:04:04.409537 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8gzs\" (UniqueName: \"kubernetes.io/projected/b24ca97d-74ff-4a8d-9621-40d03e8be6cc-kube-api-access-p8gzs\") pod \"b24ca97d-74ff-4a8d-9621-40d03e8be6cc\" (UID: \"b24ca97d-74ff-4a8d-9621-40d03e8be6cc\") " Mar 08 20:04:04 crc kubenswrapper[4885]: I0308 20:04:04.418230 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b24ca97d-74ff-4a8d-9621-40d03e8be6cc-kube-api-access-p8gzs" (OuterVolumeSpecName: "kube-api-access-p8gzs") pod "b24ca97d-74ff-4a8d-9621-40d03e8be6cc" (UID: "b24ca97d-74ff-4a8d-9621-40d03e8be6cc"). InnerVolumeSpecName "kube-api-access-p8gzs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:04:04 crc kubenswrapper[4885]: I0308 20:04:04.511104 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8gzs\" (UniqueName: \"kubernetes.io/projected/b24ca97d-74ff-4a8d-9621-40d03e8be6cc-kube-api-access-p8gzs\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:04 crc kubenswrapper[4885]: I0308 20:04:04.913234 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550004-sf7cv" event={"ID":"b24ca97d-74ff-4a8d-9621-40d03e8be6cc","Type":"ContainerDied","Data":"0c30389f607eab2bcfcde7b6e56c017975a1c2fade5ff0464a133ced49f32297"} Mar 08 20:04:04 crc kubenswrapper[4885]: I0308 20:04:04.913283 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0c30389f607eab2bcfcde7b6e56c017975a1c2fade5ff0464a133ced49f32297" Mar 08 20:04:04 crc kubenswrapper[4885]: I0308 20:04:04.913369 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550004-sf7cv" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.048548 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84srk"] Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.048974 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-84srk" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="registry-server" containerID="cri-o://76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780" gracePeriod=2 Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.369834 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:04:05 crc kubenswrapper[4885]: E0308 20:04:05.370228 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.399007 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29549998-97pgs"] Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.403709 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29549998-97pgs"] Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.619697 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.731073 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-catalog-content\") pod \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.731177 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-utilities\") pod \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.731266 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbpfb\" (UniqueName: \"kubernetes.io/projected/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-kube-api-access-mbpfb\") pod \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\" (UID: \"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7\") " Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.734984 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-utilities" (OuterVolumeSpecName: "utilities") pod "17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" (UID: "17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.739088 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-kube-api-access-mbpfb" (OuterVolumeSpecName: "kube-api-access-mbpfb") pod "17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" (UID: "17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7"). InnerVolumeSpecName "kube-api-access-mbpfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.800443 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" (UID: "17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.832341 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.832371 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.832382 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbpfb\" (UniqueName: \"kubernetes.io/projected/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7-kube-api-access-mbpfb\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.924230 4885 generic.go:334] "Generic (PLEG): container finished" podID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerID="76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780" exitCode=0 Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.924276 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-84srk" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.924309 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84srk" event={"ID":"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7","Type":"ContainerDied","Data":"76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780"} Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.924397 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84srk" event={"ID":"17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7","Type":"ContainerDied","Data":"6a5c70ea2ed0ef38c370fb83471173899ee581eabbc19e5b33bd1521ba7a60d0"} Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.924433 4885 scope.go:117] "RemoveContainer" containerID="76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.951393 4885 scope.go:117] "RemoveContainer" containerID="68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60" Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.973668 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84srk"] Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.986370 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-84srk"] Mar 08 20:04:05 crc kubenswrapper[4885]: I0308 20:04:05.988729 4885 scope.go:117] "RemoveContainer" containerID="e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6" Mar 08 20:04:06 crc kubenswrapper[4885]: I0308 20:04:06.033156 4885 scope.go:117] "RemoveContainer" containerID="76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780" Mar 08 20:04:06 crc kubenswrapper[4885]: E0308 20:04:06.033612 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780\": container with ID starting with 76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780 not found: ID does not exist" containerID="76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780" Mar 08 20:04:06 crc kubenswrapper[4885]: I0308 20:04:06.033640 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780"} err="failed to get container status \"76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780\": rpc error: code = NotFound desc = could not find container \"76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780\": container with ID starting with 76faa641b28240cb2174e4b04059e3d380cae85ec9462e91fed77b9f97918780 not found: ID does not exist" Mar 08 20:04:06 crc kubenswrapper[4885]: I0308 20:04:06.033658 4885 scope.go:117] "RemoveContainer" containerID="68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60" Mar 08 20:04:06 crc kubenswrapper[4885]: E0308 20:04:06.033981 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60\": container with ID starting with 68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60 not found: ID does not exist" containerID="68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60" Mar 08 20:04:06 crc kubenswrapper[4885]: I0308 20:04:06.033998 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60"} err="failed to get container status \"68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60\": rpc error: code = NotFound desc = could not find container \"68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60\": container with ID starting with 68b3f6b83f329f4db33d0bdf76d1df3ed6e263e4bcfb77c3367b0e7f1dd89e60 not found: ID does not exist" Mar 08 20:04:06 crc kubenswrapper[4885]: I0308 20:04:06.034013 4885 scope.go:117] "RemoveContainer" containerID="e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6" Mar 08 20:04:06 crc kubenswrapper[4885]: E0308 20:04:06.034258 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6\": container with ID starting with e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6 not found: ID does not exist" containerID="e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6" Mar 08 20:04:06 crc kubenswrapper[4885]: I0308 20:04:06.034276 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6"} err="failed to get container status \"e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6\": rpc error: code = NotFound desc = could not find container \"e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6\": container with ID starting with e7822901f4fdc5b3dbd59cdec83adba3e83f9bd7370cc3cdf30cca23036f4ad6 not found: ID does not exist" Mar 08 20:04:07 crc kubenswrapper[4885]: I0308 20:04:07.384951 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f78a7ad-7933-489d-8395-4bb334007a30" path="/var/lib/kubelet/pods/0f78a7ad-7933-489d-8395-4bb334007a30/volumes" Mar 08 20:04:07 crc kubenswrapper[4885]: I0308 20:04:07.386094 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" path="/var/lib/kubelet/pods/17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7/volumes" Mar 08 20:04:09 crc kubenswrapper[4885]: I0308 20:04:09.872603 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:04:09 crc kubenswrapper[4885]: I0308 20:04:09.948743 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:04:10 crc kubenswrapper[4885]: I0308 20:04:10.125346 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dn4kh"] Mar 08 20:04:10 crc kubenswrapper[4885]: I0308 20:04:10.991272 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dn4kh" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="registry-server" containerID="cri-o://c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b" gracePeriod=2 Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.435691 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.528721 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-utilities\") pod \"14843e03-07ac-482d-b2f6-6bbfb7567b91\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.528833 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-catalog-content\") pod \"14843e03-07ac-482d-b2f6-6bbfb7567b91\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.528955 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7n4n\" (UniqueName: \"kubernetes.io/projected/14843e03-07ac-482d-b2f6-6bbfb7567b91-kube-api-access-j7n4n\") pod \"14843e03-07ac-482d-b2f6-6bbfb7567b91\" (UID: \"14843e03-07ac-482d-b2f6-6bbfb7567b91\") " Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.531036 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-utilities" (OuterVolumeSpecName: "utilities") pod "14843e03-07ac-482d-b2f6-6bbfb7567b91" (UID: "14843e03-07ac-482d-b2f6-6bbfb7567b91"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.541281 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14843e03-07ac-482d-b2f6-6bbfb7567b91-kube-api-access-j7n4n" (OuterVolumeSpecName: "kube-api-access-j7n4n") pod "14843e03-07ac-482d-b2f6-6bbfb7567b91" (UID: "14843e03-07ac-482d-b2f6-6bbfb7567b91"). InnerVolumeSpecName "kube-api-access-j7n4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.629945 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7n4n\" (UniqueName: \"kubernetes.io/projected/14843e03-07ac-482d-b2f6-6bbfb7567b91-kube-api-access-j7n4n\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.629976 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.732910 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14843e03-07ac-482d-b2f6-6bbfb7567b91" (UID: "14843e03-07ac-482d-b2f6-6bbfb7567b91"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:04:11 crc kubenswrapper[4885]: I0308 20:04:11.832766 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14843e03-07ac-482d-b2f6-6bbfb7567b91-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.019366 4885 generic.go:334] "Generic (PLEG): container finished" podID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerID="c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b" exitCode=0 Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.019465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn4kh" event={"ID":"14843e03-07ac-482d-b2f6-6bbfb7567b91","Type":"ContainerDied","Data":"c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b"} Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.019553 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dn4kh" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.019607 4885 scope.go:117] "RemoveContainer" containerID="c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.019581 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dn4kh" event={"ID":"14843e03-07ac-482d-b2f6-6bbfb7567b91","Type":"ContainerDied","Data":"1d0f0c8555c4b252c5081e0c34f2d3afd49d81f6583bcf2c95de0ec400208b12"} Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.066449 4885 scope.go:117] "RemoveContainer" containerID="6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.069773 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dn4kh"] Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.076596 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dn4kh"] Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.102574 4885 scope.go:117] "RemoveContainer" containerID="df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.134320 4885 scope.go:117] "RemoveContainer" containerID="c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b" Mar 08 20:04:12 crc kubenswrapper[4885]: E0308 20:04:12.134943 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b\": container with ID starting with c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b not found: ID does not exist" containerID="c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.134984 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b"} err="failed to get container status \"c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b\": rpc error: code = NotFound desc = could not find container \"c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b\": container with ID starting with c64e24ac1ead38098dea0f0240211b75ec12402bf19a3e2a5ac88643a0f5e52b not found: ID does not exist" Mar 08 20:04:12 crc 
kubenswrapper[4885]: I0308 20:04:12.135013 4885 scope.go:117] "RemoveContainer" containerID="6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164" Mar 08 20:04:12 crc kubenswrapper[4885]: E0308 20:04:12.135627 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164\": container with ID starting with 6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164 not found: ID does not exist" containerID="6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.135811 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164"} err="failed to get container status \"6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164\": rpc error: code = NotFound desc = could not find container \"6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164\": container with ID starting with 6d02b43b2257f1bf2fa01ccce985b03e5f2a967155e87243554c0d9bbf443164 not found: ID does not exist" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.136014 4885 scope.go:117] "RemoveContainer" containerID="df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853" Mar 08 20:04:12 crc kubenswrapper[4885]: E0308 20:04:12.136573 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853\": container with ID starting with df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853 not found: ID does not exist" containerID="df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853" Mar 08 20:04:12 crc kubenswrapper[4885]: I0308 20:04:12.136645 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853"} err="failed to get container status \"df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853\": rpc error: code = NotFound desc = could not find container \"df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853\": container with ID starting with df7544f53f96b3c01c0be216224c7463ef23589dca586a03c3266c90700c5853 not found: ID does not exist" Mar 08 20:04:13 crc kubenswrapper[4885]: I0308 20:04:13.385564 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" path="/var/lib/kubelet/pods/14843e03-07ac-482d-b2f6-6bbfb7567b91/volumes" Mar 08 20:04:16 crc kubenswrapper[4885]: I0308 20:04:16.367649 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:04:16 crc kubenswrapper[4885]: E0308 20:04:16.368148 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:04:27 crc kubenswrapper[4885]: I0308 20:04:27.369909 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" 
Mar 08 20:04:27 crc kubenswrapper[4885]: E0308 20:04:27.370854 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:04:39 crc kubenswrapper[4885]: I0308 20:04:39.376739 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:04:39 crc kubenswrapper[4885]: E0308 20:04:39.377974 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:04:53 crc kubenswrapper[4885]: I0308 20:04:53.368780 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:04:53 crc kubenswrapper[4885]: E0308 20:04:53.370087 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:05:00 crc kubenswrapper[4885]: I0308 20:05:00.014419 4885 scope.go:117] "RemoveContainer" containerID="f1ee7ab75e6cdb54c44da03961cfd9f0079aa1cd90d1e18350bad8572cfd08fa" Mar 08 20:05:08 crc kubenswrapper[4885]: I0308 20:05:08.368991 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:05:08 crc kubenswrapper[4885]: E0308 20:05:08.370116 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:05:20 crc kubenswrapper[4885]: I0308 20:05:20.369020 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:05:20 crc kubenswrapper[4885]: E0308 20:05:20.370103 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:05:34 crc kubenswrapper[4885]: I0308 20:05:34.368580 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:05:34 
crc kubenswrapper[4885]: E0308 20:05:34.369585 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:05:45 crc kubenswrapper[4885]: I0308 20:05:45.368075 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:05:45 crc kubenswrapper[4885]: E0308 20:05:45.369102 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:05:56 crc kubenswrapper[4885]: I0308 20:05:56.368881 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:05:56 crc kubenswrapper[4885]: E0308 20:05:56.370119 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.158967 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550006-6vnqt"] Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159576 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159603 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159629 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b24ca97d-74ff-4a8d-9621-40d03e8be6cc" containerName="oc" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159639 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b24ca97d-74ff-4a8d-9621-40d03e8be6cc" containerName="oc" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159650 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159657 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159676 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159687 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="registry-server" 
Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159701 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159711 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159727 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159739 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159757 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159766 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159779 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159789 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159803 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159813 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159832 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159844 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159866 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159875 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="extract-utilities" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159887 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159899 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: E0308 20:06:00.159913 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.159948 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="extract-content" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.160221 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ffd5483-5bf1-4ca1-945d-5de49426ee21" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.160251 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7a5e492-4ddd-4229-b1fb-019fa71d2951" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.160266 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="14843e03-07ac-482d-b2f6-6bbfb7567b91" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.160285 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b24ca97d-74ff-4a8d-9621-40d03e8be6cc" containerName="oc" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.160303 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="17fe0b53-ae94-4b6b-8a6d-1d1f68a52bf7" containerName="registry-server" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.161156 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550006-6vnqt" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.167808 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.167826 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.167861 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.175232 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550006-6vnqt"] Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.244584 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t4dc\" (UniqueName: \"kubernetes.io/projected/f2563b40-3861-46cb-b313-1c221b526aa7-kube-api-access-5t4dc\") pod \"auto-csr-approver-29550006-6vnqt\" (UID: \"f2563b40-3861-46cb-b313-1c221b526aa7\") " pod="openshift-infra/auto-csr-approver-29550006-6vnqt" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.346186 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5t4dc\" (UniqueName: \"kubernetes.io/projected/f2563b40-3861-46cb-b313-1c221b526aa7-kube-api-access-5t4dc\") pod \"auto-csr-approver-29550006-6vnqt\" (UID: \"f2563b40-3861-46cb-b313-1c221b526aa7\") " pod="openshift-infra/auto-csr-approver-29550006-6vnqt" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.367684 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t4dc\" (UniqueName: \"kubernetes.io/projected/f2563b40-3861-46cb-b313-1c221b526aa7-kube-api-access-5t4dc\") pod \"auto-csr-approver-29550006-6vnqt\" (UID: \"f2563b40-3861-46cb-b313-1c221b526aa7\") " pod="openshift-infra/auto-csr-approver-29550006-6vnqt" Mar 08 20:06:00 crc kubenswrapper[4885]: I0308 20:06:00.488614 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550006-6vnqt" Mar 08 20:06:01 crc kubenswrapper[4885]: I0308 20:06:01.025055 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550006-6vnqt"] Mar 08 20:06:01 crc kubenswrapper[4885]: I0308 20:06:01.166838 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550006-6vnqt" event={"ID":"f2563b40-3861-46cb-b313-1c221b526aa7","Type":"ContainerStarted","Data":"8338dd38101eb0f2437313f523dea1c5dee54a4c1716e9638cf7059653c9580d"} Mar 08 20:06:03 crc kubenswrapper[4885]: I0308 20:06:03.187549 4885 generic.go:334] "Generic (PLEG): container finished" podID="f2563b40-3861-46cb-b313-1c221b526aa7" containerID="221a3e9bb54bc45b1a9a4a543aadeb429393adbc0fe46f6f79ad34e45269413a" exitCode=0 Mar 08 20:06:03 crc kubenswrapper[4885]: I0308 20:06:03.187904 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550006-6vnqt" event={"ID":"f2563b40-3861-46cb-b313-1c221b526aa7","Type":"ContainerDied","Data":"221a3e9bb54bc45b1a9a4a543aadeb429393adbc0fe46f6f79ad34e45269413a"} Mar 08 20:06:04 crc kubenswrapper[4885]: I0308 20:06:04.615663 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550006-6vnqt" Mar 08 20:06:04 crc kubenswrapper[4885]: I0308 20:06:04.722432 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t4dc\" (UniqueName: \"kubernetes.io/projected/f2563b40-3861-46cb-b313-1c221b526aa7-kube-api-access-5t4dc\") pod \"f2563b40-3861-46cb-b313-1c221b526aa7\" (UID: \"f2563b40-3861-46cb-b313-1c221b526aa7\") " Mar 08 20:06:04 crc kubenswrapper[4885]: I0308 20:06:04.729272 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2563b40-3861-46cb-b313-1c221b526aa7-kube-api-access-5t4dc" (OuterVolumeSpecName: "kube-api-access-5t4dc") pod "f2563b40-3861-46cb-b313-1c221b526aa7" (UID: "f2563b40-3861-46cb-b313-1c221b526aa7"). InnerVolumeSpecName "kube-api-access-5t4dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:06:04 crc kubenswrapper[4885]: I0308 20:06:04.824199 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t4dc\" (UniqueName: \"kubernetes.io/projected/f2563b40-3861-46cb-b313-1c221b526aa7-kube-api-access-5t4dc\") on node \"crc\" DevicePath \"\"" Mar 08 20:06:05 crc kubenswrapper[4885]: I0308 20:06:05.220201 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550006-6vnqt" event={"ID":"f2563b40-3861-46cb-b313-1c221b526aa7","Type":"ContainerDied","Data":"8338dd38101eb0f2437313f523dea1c5dee54a4c1716e9638cf7059653c9580d"} Mar 08 20:06:05 crc kubenswrapper[4885]: I0308 20:06:05.220719 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8338dd38101eb0f2437313f523dea1c5dee54a4c1716e9638cf7059653c9580d" Mar 08 20:06:05 crc kubenswrapper[4885]: I0308 20:06:05.220273 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550006-6vnqt" Mar 08 20:06:05 crc kubenswrapper[4885]: I0308 20:06:05.711698 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550000-bp85d"] Mar 08 20:06:05 crc kubenswrapper[4885]: I0308 20:06:05.725943 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550000-bp85d"] Mar 08 20:06:07 crc kubenswrapper[4885]: I0308 20:06:07.383721 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d589285c-60a3-4871-9149-7f1f99fc35ee" path="/var/lib/kubelet/pods/d589285c-60a3-4871-9149-7f1f99fc35ee/volumes" Mar 08 20:06:08 crc kubenswrapper[4885]: I0308 20:06:08.368976 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:06:09 crc kubenswrapper[4885]: I0308 20:06:09.261143 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"b1b5b99f3cf4b3dd1c44244000f04eee91b4f616c32d2bab38cc39ac028ab97a"} Mar 08 20:07:00 crc kubenswrapper[4885]: I0308 20:07:00.187133 4885 scope.go:117] "RemoveContainer" containerID="a1918622a7f691d5b0978579d743cac5d40266346f9de21b0dbb76cf8ca3f823" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.159380 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550008-v776t"] Mar 08 20:08:00 crc kubenswrapper[4885]: E0308 20:08:00.160373 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2563b40-3861-46cb-b313-1c221b526aa7" containerName="oc" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.160418 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2563b40-3861-46cb-b313-1c221b526aa7" containerName="oc" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.160686 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2563b40-3861-46cb-b313-1c221b526aa7" containerName="oc" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.161392 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550008-v776t" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.165556 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.168343 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.168349 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.183580 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550008-v776t"] Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.204297 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffb4b\" (UniqueName: \"kubernetes.io/projected/a693b4d7-ae29-482d-8b4d-8025be4ce19f-kube-api-access-ffb4b\") pod \"auto-csr-approver-29550008-v776t\" (UID: \"a693b4d7-ae29-482d-8b4d-8025be4ce19f\") " pod="openshift-infra/auto-csr-approver-29550008-v776t" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.305847 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffb4b\" (UniqueName: \"kubernetes.io/projected/a693b4d7-ae29-482d-8b4d-8025be4ce19f-kube-api-access-ffb4b\") pod \"auto-csr-approver-29550008-v776t\" (UID: \"a693b4d7-ae29-482d-8b4d-8025be4ce19f\") " pod="openshift-infra/auto-csr-approver-29550008-v776t" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.338947 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffb4b\" (UniqueName: \"kubernetes.io/projected/a693b4d7-ae29-482d-8b4d-8025be4ce19f-kube-api-access-ffb4b\") pod \"auto-csr-approver-29550008-v776t\" (UID: \"a693b4d7-ae29-482d-8b4d-8025be4ce19f\") " pod="openshift-infra/auto-csr-approver-29550008-v776t" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.510693 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550008-v776t" Mar 08 20:08:00 crc kubenswrapper[4885]: I0308 20:08:00.835094 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550008-v776t"] Mar 08 20:08:00 crc kubenswrapper[4885]: W0308 20:08:00.837978 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda693b4d7_ae29_482d_8b4d_8025be4ce19f.slice/crio-a58c8c9994d676f36994b18a6363c092924699d57df290a8f864d281e6b8f7a5 WatchSource:0}: Error finding container a58c8c9994d676f36994b18a6363c092924699d57df290a8f864d281e6b8f7a5: Status 404 returned error can't find the container with id a58c8c9994d676f36994b18a6363c092924699d57df290a8f864d281e6b8f7a5 Mar 08 20:08:01 crc kubenswrapper[4885]: I0308 20:08:01.305431 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550008-v776t" event={"ID":"a693b4d7-ae29-482d-8b4d-8025be4ce19f","Type":"ContainerStarted","Data":"a58c8c9994d676f36994b18a6363c092924699d57df290a8f864d281e6b8f7a5"} Mar 08 20:08:02 crc kubenswrapper[4885]: I0308 20:08:02.320810 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550008-v776t" event={"ID":"a693b4d7-ae29-482d-8b4d-8025be4ce19f","Type":"ContainerStarted","Data":"e3ca62c7c9954975ab02aef71fa3fa4daa75c76bdca0ee86cf37dfeed6b0d2ef"} Mar 08 20:08:02 crc kubenswrapper[4885]: I0308 20:08:02.350817 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550008-v776t" podStartSLOduration=1.307888983 podStartE2EDuration="2.350795935s" podCreationTimestamp="2026-03-08 20:08:00 +0000 UTC" firstStartedPulling="2026-03-08 20:08:00.842564552 +0000 UTC m=+2182.238618605" lastFinishedPulling="2026-03-08 20:08:01.885471524 +0000 UTC m=+2183.281525557" observedRunningTime="2026-03-08 20:08:02.34345708 +0000 UTC m=+2183.739511143" watchObservedRunningTime="2026-03-08 20:08:02.350795935 +0000 UTC m=+2183.746849978" Mar 08 20:08:03 crc kubenswrapper[4885]: I0308 20:08:03.335143 4885 generic.go:334] "Generic (PLEG): container finished" podID="a693b4d7-ae29-482d-8b4d-8025be4ce19f" containerID="e3ca62c7c9954975ab02aef71fa3fa4daa75c76bdca0ee86cf37dfeed6b0d2ef" exitCode=0 Mar 08 20:08:03 crc kubenswrapper[4885]: I0308 20:08:03.335220 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550008-v776t" event={"ID":"a693b4d7-ae29-482d-8b4d-8025be4ce19f","Type":"ContainerDied","Data":"e3ca62c7c9954975ab02aef71fa3fa4daa75c76bdca0ee86cf37dfeed6b0d2ef"} Mar 08 20:08:04 crc kubenswrapper[4885]: I0308 20:08:04.709771 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550008-v776t" Mar 08 20:08:04 crc kubenswrapper[4885]: I0308 20:08:04.881785 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffb4b\" (UniqueName: \"kubernetes.io/projected/a693b4d7-ae29-482d-8b4d-8025be4ce19f-kube-api-access-ffb4b\") pod \"a693b4d7-ae29-482d-8b4d-8025be4ce19f\" (UID: \"a693b4d7-ae29-482d-8b4d-8025be4ce19f\") " Mar 08 20:08:04 crc kubenswrapper[4885]: I0308 20:08:04.891019 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a693b4d7-ae29-482d-8b4d-8025be4ce19f-kube-api-access-ffb4b" (OuterVolumeSpecName: "kube-api-access-ffb4b") pod "a693b4d7-ae29-482d-8b4d-8025be4ce19f" (UID: "a693b4d7-ae29-482d-8b4d-8025be4ce19f"). InnerVolumeSpecName "kube-api-access-ffb4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:08:04 crc kubenswrapper[4885]: I0308 20:08:04.984967 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffb4b\" (UniqueName: \"kubernetes.io/projected/a693b4d7-ae29-482d-8b4d-8025be4ce19f-kube-api-access-ffb4b\") on node \"crc\" DevicePath \"\"" Mar 08 20:08:05 crc kubenswrapper[4885]: I0308 20:08:05.358137 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550008-v776t" event={"ID":"a693b4d7-ae29-482d-8b4d-8025be4ce19f","Type":"ContainerDied","Data":"a58c8c9994d676f36994b18a6363c092924699d57df290a8f864d281e6b8f7a5"} Mar 08 20:08:05 crc kubenswrapper[4885]: I0308 20:08:05.358198 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a58c8c9994d676f36994b18a6363c092924699d57df290a8f864d281e6b8f7a5" Mar 08 20:08:05 crc kubenswrapper[4885]: I0308 20:08:05.358278 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550008-v776t" Mar 08 20:08:05 crc kubenswrapper[4885]: I0308 20:08:05.447094 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550002-jw6jr"] Mar 08 20:08:05 crc kubenswrapper[4885]: I0308 20:08:05.457694 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550002-jw6jr"] Mar 08 20:08:07 crc kubenswrapper[4885]: I0308 20:08:07.385442 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdb49670-90cc-4dfa-9e28-1ae44ed1104b" path="/var/lib/kubelet/pods/cdb49670-90cc-4dfa-9e28-1ae44ed1104b/volumes" Mar 08 20:08:32 crc kubenswrapper[4885]: I0308 20:08:32.818406 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:08:32 crc kubenswrapper[4885]: I0308 20:08:32.819070 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:09:00 crc kubenswrapper[4885]: I0308 20:09:00.310665 4885 scope.go:117] "RemoveContainer" containerID="85234bd429cc704e5f096400e4f1b62d94a0128f9b3c1c36899a84d4f6ac1ba9" Mar 08 20:09:02 crc kubenswrapper[4885]: I0308 20:09:02.818672 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:09:02 crc kubenswrapper[4885]: I0308 20:09:02.819648 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:09:32 crc kubenswrapper[4885]: I0308 20:09:32.818532 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:09:32 crc kubenswrapper[4885]: I0308 20:09:32.820048 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:09:32 crc kubenswrapper[4885]: I0308 20:09:32.820134 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:09:32 crc kubenswrapper[4885]: I0308 20:09:32.820997 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"b1b5b99f3cf4b3dd1c44244000f04eee91b4f616c32d2bab38cc39ac028ab97a"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:09:32 crc kubenswrapper[4885]: I0308 20:09:32.821094 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://b1b5b99f3cf4b3dd1c44244000f04eee91b4f616c32d2bab38cc39ac028ab97a" gracePeriod=600 Mar 08 20:09:33 crc kubenswrapper[4885]: I0308 20:09:33.137433 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="b1b5b99f3cf4b3dd1c44244000f04eee91b4f616c32d2bab38cc39ac028ab97a" exitCode=0 Mar 08 20:09:33 crc kubenswrapper[4885]: I0308 20:09:33.137469 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"b1b5b99f3cf4b3dd1c44244000f04eee91b4f616c32d2bab38cc39ac028ab97a"} Mar 08 20:09:33 crc kubenswrapper[4885]: I0308 20:09:33.137972 4885 scope.go:117] "RemoveContainer" containerID="58c3ba042a790eea5ba42b828fc6f11c90eac5b29961e77da5b1edc93aeba400" Mar 08 20:09:34 crc kubenswrapper[4885]: I0308 20:09:34.151792 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"} Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.167217 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550010-xflbc"] Mar 08 20:10:00 crc kubenswrapper[4885]: E0308 20:10:00.168248 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a693b4d7-ae29-482d-8b4d-8025be4ce19f" containerName="oc" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.168270 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a693b4d7-ae29-482d-8b4d-8025be4ce19f" containerName="oc" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.168552 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a693b4d7-ae29-482d-8b4d-8025be4ce19f" containerName="oc" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.169268 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550010-xflbc" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.172112 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.173187 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.173241 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.177412 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550010-xflbc"] Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.306218 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2b5j\" (UniqueName: \"kubernetes.io/projected/14bc9568-8018-41b0-9d6f-1b71feaa1021-kube-api-access-r2b5j\") pod \"auto-csr-approver-29550010-xflbc\" (UID: \"14bc9568-8018-41b0-9d6f-1b71feaa1021\") " pod="openshift-infra/auto-csr-approver-29550010-xflbc" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.407965 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2b5j\" (UniqueName: \"kubernetes.io/projected/14bc9568-8018-41b0-9d6f-1b71feaa1021-kube-api-access-r2b5j\") pod \"auto-csr-approver-29550010-xflbc\" (UID: \"14bc9568-8018-41b0-9d6f-1b71feaa1021\") " pod="openshift-infra/auto-csr-approver-29550010-xflbc" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.447792 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2b5j\" (UniqueName: \"kubernetes.io/projected/14bc9568-8018-41b0-9d6f-1b71feaa1021-kube-api-access-r2b5j\") pod \"auto-csr-approver-29550010-xflbc\" (UID: \"14bc9568-8018-41b0-9d6f-1b71feaa1021\") " pod="openshift-infra/auto-csr-approver-29550010-xflbc" Mar 08 20:10:00 crc kubenswrapper[4885]: I0308 20:10:00.500315 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550010-xflbc" Mar 08 20:10:01 crc kubenswrapper[4885]: I0308 20:10:01.062495 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:10:01 crc kubenswrapper[4885]: I0308 20:10:01.064211 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550010-xflbc"] Mar 08 20:10:01 crc kubenswrapper[4885]: I0308 20:10:01.429360 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550010-xflbc" event={"ID":"14bc9568-8018-41b0-9d6f-1b71feaa1021","Type":"ContainerStarted","Data":"3105cec59e23fd26d2430e5263de03fdea6f871660e0cf2b5080e3ffd87c69e5"} Mar 08 20:10:02 crc kubenswrapper[4885]: I0308 20:10:02.438743 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550010-xflbc" event={"ID":"14bc9568-8018-41b0-9d6f-1b71feaa1021","Type":"ContainerStarted","Data":"a94ec38b15c7d4aceca8f088f44ff876dec7a0f0f94b13e1c65ece3d594f4ff4"} Mar 08 20:10:02 crc kubenswrapper[4885]: I0308 20:10:02.461719 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550010-xflbc" podStartSLOduration=1.559694219 podStartE2EDuration="2.461687923s" podCreationTimestamp="2026-03-08 20:10:00 +0000 UTC" firstStartedPulling="2026-03-08 20:10:01.061662375 +0000 UTC m=+2302.457716438" lastFinishedPulling="2026-03-08 20:10:01.963656119 +0000 UTC m=+2303.359710142" observedRunningTime="2026-03-08 20:10:02.458467317 +0000 UTC m=+2303.854521380" watchObservedRunningTime="2026-03-08 20:10:02.461687923 +0000 UTC m=+2303.857742006" Mar 08 20:10:03 crc kubenswrapper[4885]: I0308 20:10:03.449649 4885 generic.go:334] "Generic (PLEG): container finished" podID="14bc9568-8018-41b0-9d6f-1b71feaa1021" containerID="a94ec38b15c7d4aceca8f088f44ff876dec7a0f0f94b13e1c65ece3d594f4ff4" exitCode=0 Mar 08 20:10:03 crc kubenswrapper[4885]: I0308 20:10:03.449710 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550010-xflbc" event={"ID":"14bc9568-8018-41b0-9d6f-1b71feaa1021","Type":"ContainerDied","Data":"a94ec38b15c7d4aceca8f088f44ff876dec7a0f0f94b13e1c65ece3d594f4ff4"} Mar 08 20:10:04 crc kubenswrapper[4885]: I0308 20:10:04.841434 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550010-xflbc" Mar 08 20:10:04 crc kubenswrapper[4885]: I0308 20:10:04.982007 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2b5j\" (UniqueName: \"kubernetes.io/projected/14bc9568-8018-41b0-9d6f-1b71feaa1021-kube-api-access-r2b5j\") pod \"14bc9568-8018-41b0-9d6f-1b71feaa1021\" (UID: \"14bc9568-8018-41b0-9d6f-1b71feaa1021\") " Mar 08 20:10:04 crc kubenswrapper[4885]: I0308 20:10:04.989564 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14bc9568-8018-41b0-9d6f-1b71feaa1021-kube-api-access-r2b5j" (OuterVolumeSpecName: "kube-api-access-r2b5j") pod "14bc9568-8018-41b0-9d6f-1b71feaa1021" (UID: "14bc9568-8018-41b0-9d6f-1b71feaa1021"). InnerVolumeSpecName "kube-api-access-r2b5j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:10:05 crc kubenswrapper[4885]: I0308 20:10:05.083774 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2b5j\" (UniqueName: \"kubernetes.io/projected/14bc9568-8018-41b0-9d6f-1b71feaa1021-kube-api-access-r2b5j\") on node \"crc\" DevicePath \"\"" Mar 08 20:10:05 crc kubenswrapper[4885]: I0308 20:10:05.473557 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550010-xflbc" event={"ID":"14bc9568-8018-41b0-9d6f-1b71feaa1021","Type":"ContainerDied","Data":"3105cec59e23fd26d2430e5263de03fdea6f871660e0cf2b5080e3ffd87c69e5"} Mar 08 20:10:05 crc kubenswrapper[4885]: I0308 20:10:05.473633 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3105cec59e23fd26d2430e5263de03fdea6f871660e0cf2b5080e3ffd87c69e5" Mar 08 20:10:05 crc kubenswrapper[4885]: I0308 20:10:05.473650 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550010-xflbc" Mar 08 20:10:05 crc kubenswrapper[4885]: I0308 20:10:05.528996 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550004-sf7cv"] Mar 08 20:10:05 crc kubenswrapper[4885]: I0308 20:10:05.534868 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550004-sf7cv"] Mar 08 20:10:07 crc kubenswrapper[4885]: I0308 20:10:07.385164 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b24ca97d-74ff-4a8d-9621-40d03e8be6cc" path="/var/lib/kubelet/pods/b24ca97d-74ff-4a8d-9621-40d03e8be6cc/volumes" Mar 08 20:11:00 crc kubenswrapper[4885]: I0308 20:11:00.417253 4885 scope.go:117] "RemoveContainer" containerID="59156303cc1056f8c42d0e37cf490933d64202838fc19ab28796fa13df721c66" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.166917 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550012-l9ql6"] Mar 08 20:12:00 crc kubenswrapper[4885]: E0308 20:12:00.169432 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14bc9568-8018-41b0-9d6f-1b71feaa1021" containerName="oc" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.169474 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="14bc9568-8018-41b0-9d6f-1b71feaa1021" containerName="oc" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.169652 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="14bc9568-8018-41b0-9d6f-1b71feaa1021" containerName="oc" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.170140 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.173712 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.174042 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.174381 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.176902 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550012-l9ql6"] Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.215553 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stkkn\" (UniqueName: \"kubernetes.io/projected/8c0dd782-7d55-432a-a4c4-72eab3a342f0-kube-api-access-stkkn\") pod \"auto-csr-approver-29550012-l9ql6\" (UID: \"8c0dd782-7d55-432a-a4c4-72eab3a342f0\") " pod="openshift-infra/auto-csr-approver-29550012-l9ql6" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.317474 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stkkn\" (UniqueName: \"kubernetes.io/projected/8c0dd782-7d55-432a-a4c4-72eab3a342f0-kube-api-access-stkkn\") pod \"auto-csr-approver-29550012-l9ql6\" (UID: \"8c0dd782-7d55-432a-a4c4-72eab3a342f0\") " pod="openshift-infra/auto-csr-approver-29550012-l9ql6" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.337473 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stkkn\" (UniqueName: \"kubernetes.io/projected/8c0dd782-7d55-432a-a4c4-72eab3a342f0-kube-api-access-stkkn\") pod \"auto-csr-approver-29550012-l9ql6\" (UID: \"8c0dd782-7d55-432a-a4c4-72eab3a342f0\") " pod="openshift-infra/auto-csr-approver-29550012-l9ql6" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.515446 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" Mar 08 20:12:00 crc kubenswrapper[4885]: I0308 20:12:00.998300 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550012-l9ql6"] Mar 08 20:12:01 crc kubenswrapper[4885]: I0308 20:12:01.603500 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" event={"ID":"8c0dd782-7d55-432a-a4c4-72eab3a342f0","Type":"ContainerStarted","Data":"7de121d44fef7b71077fe52cabbc8997f38c115f104f7b0cdeba56eff3a43c7b"} Mar 08 20:12:02 crc kubenswrapper[4885]: I0308 20:12:02.613280 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" event={"ID":"8c0dd782-7d55-432a-a4c4-72eab3a342f0","Type":"ContainerStarted","Data":"88a04df5e845bc6b15ebd93cf0a05c83311989ab76aa928e4be86bfc18c02a82"} Mar 08 20:12:02 crc kubenswrapper[4885]: I0308 20:12:02.638109 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" podStartSLOduration=1.527782276 podStartE2EDuration="2.638090074s" podCreationTimestamp="2026-03-08 20:12:00 +0000 UTC" firstStartedPulling="2026-03-08 20:12:01.016163011 +0000 UTC m=+2422.412217054" lastFinishedPulling="2026-03-08 20:12:02.126470799 +0000 UTC m=+2423.522524852" observedRunningTime="2026-03-08 20:12:02.633283436 +0000 UTC m=+2424.029337499" watchObservedRunningTime="2026-03-08 20:12:02.638090074 +0000 UTC m=+2424.034144107" Mar 08 20:12:02 crc kubenswrapper[4885]: I0308 20:12:02.818660 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:12:02 crc kubenswrapper[4885]: I0308 20:12:02.818743 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:12:03 crc kubenswrapper[4885]: I0308 20:12:03.626735 4885 generic.go:334] "Generic (PLEG): container finished" podID="8c0dd782-7d55-432a-a4c4-72eab3a342f0" containerID="88a04df5e845bc6b15ebd93cf0a05c83311989ab76aa928e4be86bfc18c02a82" exitCode=0 Mar 08 20:12:03 crc kubenswrapper[4885]: I0308 20:12:03.626802 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" event={"ID":"8c0dd782-7d55-432a-a4c4-72eab3a342f0","Type":"ContainerDied","Data":"88a04df5e845bc6b15ebd93cf0a05c83311989ab76aa928e4be86bfc18c02a82"} Mar 08 20:12:04 crc kubenswrapper[4885]: I0308 20:12:04.992322 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.093278 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stkkn\" (UniqueName: \"kubernetes.io/projected/8c0dd782-7d55-432a-a4c4-72eab3a342f0-kube-api-access-stkkn\") pod \"8c0dd782-7d55-432a-a4c4-72eab3a342f0\" (UID: \"8c0dd782-7d55-432a-a4c4-72eab3a342f0\") " Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.099995 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c0dd782-7d55-432a-a4c4-72eab3a342f0-kube-api-access-stkkn" (OuterVolumeSpecName: "kube-api-access-stkkn") pod "8c0dd782-7d55-432a-a4c4-72eab3a342f0" (UID: "8c0dd782-7d55-432a-a4c4-72eab3a342f0"). InnerVolumeSpecName "kube-api-access-stkkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.195190 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stkkn\" (UniqueName: \"kubernetes.io/projected/8c0dd782-7d55-432a-a4c4-72eab3a342f0-kube-api-access-stkkn\") on node \"crc\" DevicePath \"\"" Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.649791 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" event={"ID":"8c0dd782-7d55-432a-a4c4-72eab3a342f0","Type":"ContainerDied","Data":"7de121d44fef7b71077fe52cabbc8997f38c115f104f7b0cdeba56eff3a43c7b"} Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.650178 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7de121d44fef7b71077fe52cabbc8997f38c115f104f7b0cdeba56eff3a43c7b" Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.649915 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550012-l9ql6" Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.722265 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550006-6vnqt"] Mar 08 20:12:05 crc kubenswrapper[4885]: I0308 20:12:05.734224 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550006-6vnqt"] Mar 08 20:12:07 crc kubenswrapper[4885]: I0308 20:12:07.401860 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2563b40-3861-46cb-b313-1c221b526aa7" path="/var/lib/kubelet/pods/f2563b40-3861-46cb-b313-1c221b526aa7/volumes" Mar 08 20:12:32 crc kubenswrapper[4885]: I0308 20:12:32.818172 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:12:32 crc kubenswrapper[4885]: I0308 20:12:32.818997 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:13:00 crc kubenswrapper[4885]: I0308 20:13:00.536764 4885 scope.go:117] "RemoveContainer" containerID="221a3e9bb54bc45b1a9a4a543aadeb429393adbc0fe46f6f79ad34e45269413a" Mar 08 20:13:02 crc kubenswrapper[4885]: I0308 20:13:02.818358 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:13:02 crc kubenswrapper[4885]: I0308 20:13:02.818778 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:13:02 crc kubenswrapper[4885]: I0308 20:13:02.818825 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:13:02 crc kubenswrapper[4885]: I0308 20:13:02.819452 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:13:02 crc kubenswrapper[4885]: I0308 20:13:02.819509 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" gracePeriod=600 Mar 08 20:13:02 crc kubenswrapper[4885]: E0308 20:13:02.952834 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:13:03 crc kubenswrapper[4885]: I0308 20:13:03.184460 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" exitCode=0 Mar 08 20:13:03 crc kubenswrapper[4885]: I0308 20:13:03.184503 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675"} Mar 08 20:13:03 crc kubenswrapper[4885]: I0308 20:13:03.184538 4885 scope.go:117] "RemoveContainer" containerID="b1b5b99f3cf4b3dd1c44244000f04eee91b4f616c32d2bab38cc39ac028ab97a" Mar 08 20:13:03 crc kubenswrapper[4885]: I0308 20:13:03.184947 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:13:03 crc kubenswrapper[4885]: E0308 20:13:03.185152 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:13:17 crc kubenswrapper[4885]: I0308 20:13:17.407956 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:13:17 crc kubenswrapper[4885]: E0308 20:13:17.408702 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:13:31 crc kubenswrapper[4885]: I0308 20:13:31.368736 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:13:31 crc kubenswrapper[4885]: E0308 20:13:31.370152 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:13:45 crc kubenswrapper[4885]: I0308 20:13:45.368548 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:13:45 crc kubenswrapper[4885]: E0308 20:13:45.369708 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:13:56 crc kubenswrapper[4885]: I0308 20:13:56.368758 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:13:56 crc kubenswrapper[4885]: E0308 20:13:56.369616 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.167477 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550014-9hsx2"] Mar 08 20:14:00 crc kubenswrapper[4885]: E0308 20:14:00.167889 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c0dd782-7d55-432a-a4c4-72eab3a342f0" containerName="oc" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.167910 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c0dd782-7d55-432a-a4c4-72eab3a342f0" containerName="oc" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.168214 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c0dd782-7d55-432a-a4c4-72eab3a342f0" containerName="oc" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.169442 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550014-9hsx2" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.173542 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.173730 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.173730 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.193647 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550014-9hsx2"] Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.321428 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-576td\" (UniqueName: \"kubernetes.io/projected/fba06a47-da5c-44a3-9184-d2d92d14ce91-kube-api-access-576td\") pod \"auto-csr-approver-29550014-9hsx2\" (UID: \"fba06a47-da5c-44a3-9184-d2d92d14ce91\") " pod="openshift-infra/auto-csr-approver-29550014-9hsx2" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.423821 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-576td\" (UniqueName: \"kubernetes.io/projected/fba06a47-da5c-44a3-9184-d2d92d14ce91-kube-api-access-576td\") pod \"auto-csr-approver-29550014-9hsx2\" (UID: \"fba06a47-da5c-44a3-9184-d2d92d14ce91\") " pod="openshift-infra/auto-csr-approver-29550014-9hsx2" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.445420 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-576td\" (UniqueName: \"kubernetes.io/projected/fba06a47-da5c-44a3-9184-d2d92d14ce91-kube-api-access-576td\") pod \"auto-csr-approver-29550014-9hsx2\" (UID: \"fba06a47-da5c-44a3-9184-d2d92d14ce91\") " pod="openshift-infra/auto-csr-approver-29550014-9hsx2" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.515112 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550014-9hsx2" Mar 08 20:14:00 crc kubenswrapper[4885]: I0308 20:14:00.999254 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550014-9hsx2"] Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.521752 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tnwk6"] Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.523933 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.539468 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tnwk6"] Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.540573 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-catalog-content\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.540723 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhlgv\" (UniqueName: \"kubernetes.io/projected/aa818349-b932-4e77-a8c6-6d200c15e61f-kube-api-access-mhlgv\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.540776 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-utilities\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.642603 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhlgv\" (UniqueName: \"kubernetes.io/projected/aa818349-b932-4e77-a8c6-6d200c15e61f-kube-api-access-mhlgv\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.642941 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-utilities\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.643073 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-catalog-content\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " 
pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.643428 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-utilities\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.643552 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-catalog-content\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.669956 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhlgv\" (UniqueName: \"kubernetes.io/projected/aa818349-b932-4e77-a8c6-6d200c15e61f-kube-api-access-mhlgv\") pod \"redhat-operators-tnwk6\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.760885 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550014-9hsx2" event={"ID":"fba06a47-da5c-44a3-9184-d2d92d14ce91","Type":"ContainerStarted","Data":"8fa71bcc9cadb5737db6a912639dac0f75701bde33b2020f436a23a83e9430b0"} Mar 08 20:14:01 crc kubenswrapper[4885]: I0308 20:14:01.855062 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:02 crc kubenswrapper[4885]: I0308 20:14:02.094029 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tnwk6"] Mar 08 20:14:02 crc kubenswrapper[4885]: W0308 20:14:02.101195 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa818349_b932_4e77_a8c6_6d200c15e61f.slice/crio-d14dc0da59b4b2e35dfc751eb386a45c2ac52286cde582275df925bc4b25a000 WatchSource:0}: Error finding container d14dc0da59b4b2e35dfc751eb386a45c2ac52286cde582275df925bc4b25a000: Status 404 returned error can't find the container with id d14dc0da59b4b2e35dfc751eb386a45c2ac52286cde582275df925bc4b25a000 Mar 08 20:14:02 crc kubenswrapper[4885]: I0308 20:14:02.773900 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerID="9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7" exitCode=0 Mar 08 20:14:02 crc kubenswrapper[4885]: I0308 20:14:02.773962 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnwk6" event={"ID":"aa818349-b932-4e77-a8c6-6d200c15e61f","Type":"ContainerDied","Data":"9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7"} Mar 08 20:14:02 crc kubenswrapper[4885]: I0308 20:14:02.773989 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnwk6" event={"ID":"aa818349-b932-4e77-a8c6-6d200c15e61f","Type":"ContainerStarted","Data":"d14dc0da59b4b2e35dfc751eb386a45c2ac52286cde582275df925bc4b25a000"} Mar 08 20:14:03 crc kubenswrapper[4885]: I0308 20:14:03.789678 4885 generic.go:334] "Generic (PLEG): container finished" podID="fba06a47-da5c-44a3-9184-d2d92d14ce91" 
containerID="c1bd976fbf85046e9452c23b99e63722a390fc224cc0805a726ba4a52e322c78" exitCode=0 Mar 08 20:14:03 crc kubenswrapper[4885]: I0308 20:14:03.789762 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550014-9hsx2" event={"ID":"fba06a47-da5c-44a3-9184-d2d92d14ce91","Type":"ContainerDied","Data":"c1bd976fbf85046e9452c23b99e63722a390fc224cc0805a726ba4a52e322c78"} Mar 08 20:14:04 crc kubenswrapper[4885]: I0308 20:14:04.805232 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerID="aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f" exitCode=0 Mar 08 20:14:04 crc kubenswrapper[4885]: I0308 20:14:04.805304 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnwk6" event={"ID":"aa818349-b932-4e77-a8c6-6d200c15e61f","Type":"ContainerDied","Data":"aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f"} Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.221006 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550014-9hsx2" Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.399415 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-576td\" (UniqueName: \"kubernetes.io/projected/fba06a47-da5c-44a3-9184-d2d92d14ce91-kube-api-access-576td\") pod \"fba06a47-da5c-44a3-9184-d2d92d14ce91\" (UID: \"fba06a47-da5c-44a3-9184-d2d92d14ce91\") " Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.420029 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fba06a47-da5c-44a3-9184-d2d92d14ce91-kube-api-access-576td" (OuterVolumeSpecName: "kube-api-access-576td") pod "fba06a47-da5c-44a3-9184-d2d92d14ce91" (UID: "fba06a47-da5c-44a3-9184-d2d92d14ce91"). InnerVolumeSpecName "kube-api-access-576td". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.502235 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-576td\" (UniqueName: \"kubernetes.io/projected/fba06a47-da5c-44a3-9184-d2d92d14ce91-kube-api-access-576td\") on node \"crc\" DevicePath \"\"" Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.816808 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550014-9hsx2" event={"ID":"fba06a47-da5c-44a3-9184-d2d92d14ce91","Type":"ContainerDied","Data":"8fa71bcc9cadb5737db6a912639dac0f75701bde33b2020f436a23a83e9430b0"} Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.817199 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fa71bcc9cadb5737db6a912639dac0f75701bde33b2020f436a23a83e9430b0" Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.816834 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550014-9hsx2" Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.820947 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnwk6" event={"ID":"aa818349-b932-4e77-a8c6-6d200c15e61f","Type":"ContainerStarted","Data":"c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c"} Mar 08 20:14:05 crc kubenswrapper[4885]: I0308 20:14:05.857529 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tnwk6" podStartSLOduration=2.405304759 podStartE2EDuration="4.857500226s" podCreationTimestamp="2026-03-08 20:14:01 +0000 UTC" firstStartedPulling="2026-03-08 20:14:02.776412303 +0000 UTC m=+2544.172466326" lastFinishedPulling="2026-03-08 20:14:05.22860773 +0000 UTC m=+2546.624661793" observedRunningTime="2026-03-08 20:14:05.844652914 +0000 UTC m=+2547.240706977" watchObservedRunningTime="2026-03-08 20:14:05.857500226 +0000 UTC m=+2547.253554289" Mar 08 20:14:06 crc kubenswrapper[4885]: I0308 20:14:06.347765 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550008-v776t"] Mar 08 20:14:06 crc kubenswrapper[4885]: I0308 20:14:06.360160 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550008-v776t"] Mar 08 20:14:07 crc kubenswrapper[4885]: I0308 20:14:07.384480 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a693b4d7-ae29-482d-8b4d-8025be4ce19f" path="/var/lib/kubelet/pods/a693b4d7-ae29-482d-8b4d-8025be4ce19f/volumes" Mar 08 20:14:08 crc kubenswrapper[4885]: I0308 20:14:08.369225 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:14:08 crc kubenswrapper[4885]: E0308 20:14:08.369603 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:14:11 crc kubenswrapper[4885]: I0308 20:14:11.855868 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:11 crc kubenswrapper[4885]: I0308 20:14:11.856374 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:12 crc kubenswrapper[4885]: I0308 20:14:12.929352 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tnwk6" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="registry-server" probeResult="failure" output=< Mar 08 20:14:12 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 20:14:12 crc kubenswrapper[4885]: > Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.693340 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4qwhh"] Mar 08 20:14:13 crc kubenswrapper[4885]: E0308 20:14:13.693796 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fba06a47-da5c-44a3-9184-d2d92d14ce91" containerName="oc" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.693825 4885 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="fba06a47-da5c-44a3-9184-d2d92d14ce91" containerName="oc" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.694099 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="fba06a47-da5c-44a3-9184-d2d92d14ce91" containerName="oc" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.695777 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.728123 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4qwhh"] Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.743611 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgwq8\" (UniqueName: \"kubernetes.io/projected/6258a7a2-e590-4201-a36b-26f438c35b4f-kube-api-access-xgwq8\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.743700 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-utilities\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.743734 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-catalog-content\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.844456 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgwq8\" (UniqueName: \"kubernetes.io/projected/6258a7a2-e590-4201-a36b-26f438c35b4f-kube-api-access-xgwq8\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.844517 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-utilities\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.844538 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-catalog-content\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.845171 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-utilities\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.845190 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-catalog-content\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:13 crc kubenswrapper[4885]: I0308 20:14:13.862872 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgwq8\" (UniqueName: \"kubernetes.io/projected/6258a7a2-e590-4201-a36b-26f438c35b4f-kube-api-access-xgwq8\") pod \"certified-operators-4qwhh\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:14 crc kubenswrapper[4885]: I0308 20:14:14.079843 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:14 crc kubenswrapper[4885]: I0308 20:14:14.573335 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4qwhh"] Mar 08 20:14:14 crc kubenswrapper[4885]: I0308 20:14:14.896101 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerStarted","Data":"d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7"} Mar 08 20:14:14 crc kubenswrapper[4885]: I0308 20:14:14.896461 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerStarted","Data":"7b14c100c2776453655789b80e53f3f167965cd1849e65873687ab218eeb5a83"} Mar 08 20:14:15 crc kubenswrapper[4885]: I0308 20:14:15.910966 4885 generic.go:334] "Generic (PLEG): container finished" podID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerID="d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7" exitCode=0 Mar 08 20:14:15 crc kubenswrapper[4885]: I0308 20:14:15.911053 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerDied","Data":"d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7"} Mar 08 20:14:16 crc kubenswrapper[4885]: I0308 20:14:16.923401 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerStarted","Data":"a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01"} Mar 08 20:14:17 crc kubenswrapper[4885]: I0308 20:14:17.937152 4885 generic.go:334] "Generic (PLEG): container finished" podID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerID="a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01" exitCode=0 Mar 08 20:14:17 crc kubenswrapper[4885]: I0308 20:14:17.937205 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerDied","Data":"a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01"} Mar 08 20:14:18 crc kubenswrapper[4885]: I0308 20:14:18.948403 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerStarted","Data":"a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6"} Mar 08 20:14:18 crc kubenswrapper[4885]: 
I0308 20:14:18.993482 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4qwhh" podStartSLOduration=3.563928367 podStartE2EDuration="5.993446121s" podCreationTimestamp="2026-03-08 20:14:13 +0000 UTC" firstStartedPulling="2026-03-08 20:14:15.914505744 +0000 UTC m=+2557.310559807" lastFinishedPulling="2026-03-08 20:14:18.344023528 +0000 UTC m=+2559.740077561" observedRunningTime="2026-03-08 20:14:18.975777231 +0000 UTC m=+2560.371831314" watchObservedRunningTime="2026-03-08 20:14:18.993446121 +0000 UTC m=+2560.389500184" Mar 08 20:14:21 crc kubenswrapper[4885]: I0308 20:14:21.370117 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:14:21 crc kubenswrapper[4885]: E0308 20:14:21.371382 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:14:21 crc kubenswrapper[4885]: I0308 20:14:21.914784 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:21 crc kubenswrapper[4885]: I0308 20:14:21.975782 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:22 crc kubenswrapper[4885]: I0308 20:14:22.880325 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tnwk6"] Mar 08 20:14:22 crc kubenswrapper[4885]: I0308 20:14:22.981280 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tnwk6" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="registry-server" containerID="cri-o://c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c" gracePeriod=2 Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.441865 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.623694 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-catalog-content\") pod \"aa818349-b932-4e77-a8c6-6d200c15e61f\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.623791 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-utilities\") pod \"aa818349-b932-4e77-a8c6-6d200c15e61f\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.623942 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhlgv\" (UniqueName: \"kubernetes.io/projected/aa818349-b932-4e77-a8c6-6d200c15e61f-kube-api-access-mhlgv\") pod \"aa818349-b932-4e77-a8c6-6d200c15e61f\" (UID: \"aa818349-b932-4e77-a8c6-6d200c15e61f\") " Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.624977 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-utilities" (OuterVolumeSpecName: "utilities") pod "aa818349-b932-4e77-a8c6-6d200c15e61f" (UID: "aa818349-b932-4e77-a8c6-6d200c15e61f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.635385 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa818349-b932-4e77-a8c6-6d200c15e61f-kube-api-access-mhlgv" (OuterVolumeSpecName: "kube-api-access-mhlgv") pod "aa818349-b932-4e77-a8c6-6d200c15e61f" (UID: "aa818349-b932-4e77-a8c6-6d200c15e61f"). InnerVolumeSpecName "kube-api-access-mhlgv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.730345 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhlgv\" (UniqueName: \"kubernetes.io/projected/aa818349-b932-4e77-a8c6-6d200c15e61f-kube-api-access-mhlgv\") on node \"crc\" DevicePath \"\"" Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.730399 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.797956 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa818349-b932-4e77-a8c6-6d200c15e61f" (UID: "aa818349-b932-4e77-a8c6-6d200c15e61f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:14:23 crc kubenswrapper[4885]: I0308 20:14:23.832320 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa818349-b932-4e77-a8c6-6d200c15e61f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.002886 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerID="c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c" exitCode=0 Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.002971 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnwk6" event={"ID":"aa818349-b932-4e77-a8c6-6d200c15e61f","Type":"ContainerDied","Data":"c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c"} Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.003021 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnwk6" event={"ID":"aa818349-b932-4e77-a8c6-6d200c15e61f","Type":"ContainerDied","Data":"d14dc0da59b4b2e35dfc751eb386a45c2ac52286cde582275df925bc4b25a000"} Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.003051 4885 scope.go:117] "RemoveContainer" containerID="c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c" Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.003042 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnwk6" Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.031070 4885 scope.go:117] "RemoveContainer" containerID="aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f" Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.069534 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tnwk6"] Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.075967 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tnwk6"] Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.081176 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.081258 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.081959 4885 scope.go:117] "RemoveContainer" containerID="9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7" Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.123429 4885 scope.go:117] "RemoveContainer" containerID="c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c" Mar 08 20:14:24 crc kubenswrapper[4885]: E0308 20:14:24.124000 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c\": container with ID starting with c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c not found: ID does not exist" containerID="c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c" Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.124045 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c"} 
err="failed to get container status \"c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c\": rpc error: code = NotFound desc = could not find container \"c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c\": container with ID starting with c90ad971312d85d7c5d308dce24478848a51e8aa2bcb9e10925c31b9b650d91c not found: ID does not exist" Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.124079 4885 scope.go:117] "RemoveContainer" containerID="aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f" Mar 08 20:14:24 crc kubenswrapper[4885]: E0308 20:14:24.124507 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f\": container with ID starting with aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f not found: ID does not exist" containerID="aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f" Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.124534 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f"} err="failed to get container status \"aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f\": rpc error: code = NotFound desc = could not find container \"aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f\": container with ID starting with aff6b201354fc1767f9eae647679d69ab3ecb1890676bbbf8a325c25559bd92f not found: ID does not exist" Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.124550 4885 scope.go:117] "RemoveContainer" containerID="9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7" Mar 08 20:14:24 crc kubenswrapper[4885]: E0308 20:14:24.124785 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7\": container with ID starting with 9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7 not found: ID does not exist" containerID="9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7" Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.124855 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7"} err="failed to get container status \"9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7\": rpc error: code = NotFound desc = could not find container \"9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7\": container with ID starting with 9a7a9ffe7e437b0b6b20d626d3ce31602776e09c02881b47d9c4be69d4d8a3d7 not found: ID does not exist" Mar 08 20:14:24 crc kubenswrapper[4885]: I0308 20:14:24.132334 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:25 crc kubenswrapper[4885]: I0308 20:14:25.095595 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:25 crc kubenswrapper[4885]: I0308 20:14:25.385810 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" path="/var/lib/kubelet/pods/aa818349-b932-4e77-a8c6-6d200c15e61f/volumes" Mar 08 20:14:26 crc kubenswrapper[4885]: I0308 20:14:26.476671 4885 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4qwhh"] Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.034375 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4qwhh" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="registry-server" containerID="cri-o://a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6" gracePeriod=2 Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.556548 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.694982 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-catalog-content\") pod \"6258a7a2-e590-4201-a36b-26f438c35b4f\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.695122 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-utilities\") pod \"6258a7a2-e590-4201-a36b-26f438c35b4f\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.695163 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgwq8\" (UniqueName: \"kubernetes.io/projected/6258a7a2-e590-4201-a36b-26f438c35b4f-kube-api-access-xgwq8\") pod \"6258a7a2-e590-4201-a36b-26f438c35b4f\" (UID: \"6258a7a2-e590-4201-a36b-26f438c35b4f\") " Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.696701 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-utilities" (OuterVolumeSpecName: "utilities") pod "6258a7a2-e590-4201-a36b-26f438c35b4f" (UID: "6258a7a2-e590-4201-a36b-26f438c35b4f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.704513 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6258a7a2-e590-4201-a36b-26f438c35b4f-kube-api-access-xgwq8" (OuterVolumeSpecName: "kube-api-access-xgwq8") pod "6258a7a2-e590-4201-a36b-26f438c35b4f" (UID: "6258a7a2-e590-4201-a36b-26f438c35b4f"). InnerVolumeSpecName "kube-api-access-xgwq8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.796988 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:14:27 crc kubenswrapper[4885]: I0308 20:14:27.797044 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgwq8\" (UniqueName: \"kubernetes.io/projected/6258a7a2-e590-4201-a36b-26f438c35b4f-kube-api-access-xgwq8\") on node \"crc\" DevicePath \"\"" Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.049271 4885 generic.go:334] "Generic (PLEG): container finished" podID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerID="a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6" exitCode=0 Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.049332 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerDied","Data":"a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6"} Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.049364 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4qwhh" Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.049422 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4qwhh" event={"ID":"6258a7a2-e590-4201-a36b-26f438c35b4f","Type":"ContainerDied","Data":"7b14c100c2776453655789b80e53f3f167965cd1849e65873687ab218eeb5a83"} Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.049457 4885 scope.go:117] "RemoveContainer" containerID="a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6" Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.080993 4885 scope.go:117] "RemoveContainer" containerID="a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01" Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.086955 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6258a7a2-e590-4201-a36b-26f438c35b4f" (UID: "6258a7a2-e590-4201-a36b-26f438c35b4f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.105605 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6258a7a2-e590-4201-a36b-26f438c35b4f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.110787 4885 scope.go:117] "RemoveContainer" containerID="d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7" Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.139253 4885 scope.go:117] "RemoveContainer" containerID="a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6" Mar 08 20:14:28 crc kubenswrapper[4885]: E0308 20:14:28.139775 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6\": container with ID starting with a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6 not found: ID does not exist" containerID="a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6" Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.139896 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6"} err="failed to get container status \"a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6\": rpc error: code = NotFound desc = could not find container \"a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6\": container with ID starting with a0e32d27d56a2671a88afc891e9d0fb93972fa67755cfd42f6501132f15a64c6 not found: ID does not exist" Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.140063 4885 scope.go:117] "RemoveContainer" containerID="a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01" Mar 08 20:14:28 crc kubenswrapper[4885]: E0308 20:14:28.140670 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01\": container with ID starting with a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01 not found: ID does not exist" containerID="a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01" Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.140721 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01"} err="failed to get container status \"a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01\": rpc error: code = NotFound desc = could not find container \"a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01\": container with ID starting with a77fadefeaba0a4d63331bf12b6dcfb9502203891d3b90291ae5315a260b5a01 not found: ID does not exist" Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.140748 4885 scope.go:117] "RemoveContainer" containerID="d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7" Mar 08 20:14:28 crc kubenswrapper[4885]: E0308 20:14:28.141487 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7\": container with ID starting with d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7 not found: ID does not exist" 
containerID="d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7" Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.141529 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7"} err="failed to get container status \"d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7\": rpc error: code = NotFound desc = could not find container \"d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7\": container with ID starting with d2fd2cff1181b150c7344d486eec340ebe4048a7d214c516d1dc51fbdc239ea7 not found: ID does not exist" Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.410269 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4qwhh"] Mar 08 20:14:28 crc kubenswrapper[4885]: I0308 20:14:28.432011 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4qwhh"] Mar 08 20:14:29 crc kubenswrapper[4885]: I0308 20:14:29.404167 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" path="/var/lib/kubelet/pods/6258a7a2-e590-4201-a36b-26f438c35b4f/volumes" Mar 08 20:14:32 crc kubenswrapper[4885]: I0308 20:14:32.368572 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:14:32 crc kubenswrapper[4885]: E0308 20:14:32.369359 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815145 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mhpk5"] Mar 08 20:14:36 crc kubenswrapper[4885]: E0308 20:14:36.815694 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="extract-content" Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815706 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="extract-content" Mar 08 20:14:36 crc kubenswrapper[4885]: E0308 20:14:36.815714 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="registry-server" Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815722 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="registry-server" Mar 08 20:14:36 crc kubenswrapper[4885]: E0308 20:14:36.815740 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="extract-content" Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815751 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="extract-content" Mar 08 20:14:36 crc kubenswrapper[4885]: E0308 20:14:36.815761 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="extract-utilities" Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 
20:14:36.815767 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="extract-utilities" Mar 08 20:14:36 crc kubenswrapper[4885]: E0308 20:14:36.815779 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="extract-utilities" Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815785 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="extract-utilities" Mar 08 20:14:36 crc kubenswrapper[4885]: E0308 20:14:36.815794 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="registry-server" Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815800 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="registry-server" Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815966 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa818349-b932-4e77-a8c6-6d200c15e61f" containerName="registry-server" Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.815979 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6258a7a2-e590-4201-a36b-26f438c35b4f" containerName="registry-server" Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.816997 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.830400 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhpk5"] Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.953760 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kzlp\" (UniqueName: \"kubernetes.io/projected/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-kube-api-access-6kzlp\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.953812 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-utilities\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:36 crc kubenswrapper[4885]: I0308 20:14:36.953865 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-catalog-content\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.055870 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kzlp\" (UniqueName: \"kubernetes.io/projected/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-kube-api-access-6kzlp\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.055999 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-utilities\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.056105 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-catalog-content\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.056616 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-utilities\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.056869 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-catalog-content\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.077354 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kzlp\" (UniqueName: \"kubernetes.io/projected/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-kube-api-access-6kzlp\") pod \"redhat-marketplace-mhpk5\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.153040 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:37 crc kubenswrapper[4885]: I0308 20:14:37.615518 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhpk5"] Mar 08 20:14:37 crc kubenswrapper[4885]: W0308 20:14:37.630549 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef92c765_f2e8_4808_a0dc_f9ccdb20509e.slice/crio-bca972bad3453ff797681d426c40f0f63ffb7ec8d160b90afca8544044109b5e WatchSource:0}: Error finding container bca972bad3453ff797681d426c40f0f63ffb7ec8d160b90afca8544044109b5e: Status 404 returned error can't find the container with id bca972bad3453ff797681d426c40f0f63ffb7ec8d160b90afca8544044109b5e Mar 08 20:14:38 crc kubenswrapper[4885]: I0308 20:14:38.157551 4885 generic.go:334] "Generic (PLEG): container finished" podID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerID="9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f" exitCode=0 Mar 08 20:14:38 crc kubenswrapper[4885]: I0308 20:14:38.157607 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhpk5" event={"ID":"ef92c765-f2e8-4808-a0dc-f9ccdb20509e","Type":"ContainerDied","Data":"9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f"} Mar 08 20:14:38 crc kubenswrapper[4885]: I0308 20:14:38.157641 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhpk5" event={"ID":"ef92c765-f2e8-4808-a0dc-f9ccdb20509e","Type":"ContainerStarted","Data":"bca972bad3453ff797681d426c40f0f63ffb7ec8d160b90afca8544044109b5e"} Mar 08 20:14:39 crc kubenswrapper[4885]: I0308 20:14:39.168409 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhpk5" event={"ID":"ef92c765-f2e8-4808-a0dc-f9ccdb20509e","Type":"ContainerStarted","Data":"0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88"} Mar 08 20:14:40 crc kubenswrapper[4885]: I0308 20:14:40.182971 4885 generic.go:334] "Generic (PLEG): container finished" podID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerID="0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88" exitCode=0 Mar 08 20:14:40 crc kubenswrapper[4885]: I0308 20:14:40.183040 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhpk5" event={"ID":"ef92c765-f2e8-4808-a0dc-f9ccdb20509e","Type":"ContainerDied","Data":"0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88"} Mar 08 20:14:41 crc kubenswrapper[4885]: I0308 20:14:41.201112 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhpk5" event={"ID":"ef92c765-f2e8-4808-a0dc-f9ccdb20509e","Type":"ContainerStarted","Data":"c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786"} Mar 08 20:14:41 crc kubenswrapper[4885]: I0308 20:14:41.240694 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mhpk5" podStartSLOduration=2.784750011 podStartE2EDuration="5.240634217s" podCreationTimestamp="2026-03-08 20:14:36 +0000 UTC" firstStartedPulling="2026-03-08 20:14:38.159755168 +0000 UTC m=+2579.555809231" lastFinishedPulling="2026-03-08 20:14:40.615639374 +0000 UTC m=+2582.011693437" observedRunningTime="2026-03-08 20:14:41.230959689 +0000 UTC m=+2582.627013742" watchObservedRunningTime="2026-03-08 20:14:41.240634217 +0000 UTC m=+2582.636688280" Mar 08 
20:14:45 crc kubenswrapper[4885]: I0308 20:14:45.369196 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:14:45 crc kubenswrapper[4885]: E0308 20:14:45.370583 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:14:47 crc kubenswrapper[4885]: I0308 20:14:47.153901 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:47 crc kubenswrapper[4885]: I0308 20:14:47.155865 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:47 crc kubenswrapper[4885]: I0308 20:14:47.243798 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:47 crc kubenswrapper[4885]: I0308 20:14:47.329709 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:47 crc kubenswrapper[4885]: I0308 20:14:47.492438 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhpk5"] Mar 08 20:14:49 crc kubenswrapper[4885]: I0308 20:14:49.274156 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mhpk5" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="registry-server" containerID="cri-o://c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786" gracePeriod=2 Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.267183 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.290555 4885 generic.go:334] "Generic (PLEG): container finished" podID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerID="c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786" exitCode=0 Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.290624 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhpk5" event={"ID":"ef92c765-f2e8-4808-a0dc-f9ccdb20509e","Type":"ContainerDied","Data":"c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786"} Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.290657 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mhpk5" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.290678 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mhpk5" event={"ID":"ef92c765-f2e8-4808-a0dc-f9ccdb20509e","Type":"ContainerDied","Data":"bca972bad3453ff797681d426c40f0f63ffb7ec8d160b90afca8544044109b5e"} Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.290710 4885 scope.go:117] "RemoveContainer" containerID="c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.316845 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-catalog-content\") pod \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.316992 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kzlp\" (UniqueName: \"kubernetes.io/projected/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-kube-api-access-6kzlp\") pod \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.317124 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-utilities\") pod \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\" (UID: \"ef92c765-f2e8-4808-a0dc-f9ccdb20509e\") " Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.319469 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-utilities" (OuterVolumeSpecName: "utilities") pod "ef92c765-f2e8-4808-a0dc-f9ccdb20509e" (UID: "ef92c765-f2e8-4808-a0dc-f9ccdb20509e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.326054 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-kube-api-access-6kzlp" (OuterVolumeSpecName: "kube-api-access-6kzlp") pod "ef92c765-f2e8-4808-a0dc-f9ccdb20509e" (UID: "ef92c765-f2e8-4808-a0dc-f9ccdb20509e"). InnerVolumeSpecName "kube-api-access-6kzlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.331401 4885 scope.go:117] "RemoveContainer" containerID="0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.357028 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef92c765-f2e8-4808-a0dc-f9ccdb20509e" (UID: "ef92c765-f2e8-4808-a0dc-f9ccdb20509e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.359306 4885 scope.go:117] "RemoveContainer" containerID="9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.414209 4885 scope.go:117] "RemoveContainer" containerID="c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786" Mar 08 20:14:50 crc kubenswrapper[4885]: E0308 20:14:50.415153 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786\": container with ID starting with c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786 not found: ID does not exist" containerID="c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.415283 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786"} err="failed to get container status \"c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786\": rpc error: code = NotFound desc = could not find container \"c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786\": container with ID starting with c4ed0b5760ce46b7c86047de07cc5dd173fff1766df90eea50c8d60c09376786 not found: ID does not exist" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.415341 4885 scope.go:117] "RemoveContainer" containerID="0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88" Mar 08 20:14:50 crc kubenswrapper[4885]: E0308 20:14:50.416117 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88\": container with ID starting with 0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88 not found: ID does not exist" containerID="0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.416198 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88"} err="failed to get container status \"0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88\": rpc error: code = NotFound desc = could not find container \"0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88\": container with ID starting with 0d3dcb7f1b7c62d002445c94a87ee63f4ef2c979c886717910e8d5f096d68d88 not found: ID does not exist" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.416260 4885 scope.go:117] "RemoveContainer" containerID="9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f" Mar 08 20:14:50 crc kubenswrapper[4885]: E0308 20:14:50.416770 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f\": container with ID starting with 9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f not found: ID does not exist" containerID="9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.416811 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f"} err="failed to get container status \"9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f\": rpc error: code = NotFound desc = could not find container \"9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f\": container with ID starting with 9ac818f49ae96a1e8df3f1d6fa7ac869c314da4c5829cbe63b6d0dd4f84d018f not found: ID does not exist" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.421012 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.421042 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.421055 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kzlp\" (UniqueName: \"kubernetes.io/projected/ef92c765-f2e8-4808-a0dc-f9ccdb20509e-kube-api-access-6kzlp\") on node \"crc\" DevicePath \"\"" Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.649317 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhpk5"] Mar 08 20:14:50 crc kubenswrapper[4885]: I0308 20:14:50.658944 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mhpk5"] Mar 08 20:14:51 crc kubenswrapper[4885]: I0308 20:14:51.386415 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" path="/var/lib/kubelet/pods/ef92c765-f2e8-4808-a0dc-f9ccdb20509e/volumes" Mar 08 20:14:57 crc kubenswrapper[4885]: I0308 20:14:57.368591 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:14:57 crc kubenswrapper[4885]: E0308 20:14:57.369620 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.160887 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb"] Mar 08 20:15:00 crc kubenswrapper[4885]: E0308 20:15:00.161220 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="registry-server" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.161234 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="registry-server" Mar 08 20:15:00 crc kubenswrapper[4885]: E0308 20:15:00.161251 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="extract-content" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.161259 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="extract-content" Mar 08 20:15:00 crc 
kubenswrapper[4885]: E0308 20:15:00.161276 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="extract-utilities" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.161286 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="extract-utilities" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.161462 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef92c765-f2e8-4808-a0dc-f9ccdb20509e" containerName="registry-server" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.161998 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.165832 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.166256 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.173404 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb"] Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.320517 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19d8979b-517e-4b02-8f5a-ead2361596ea-config-volume\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.320585 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9zzx\" (UniqueName: \"kubernetes.io/projected/19d8979b-517e-4b02-8f5a-ead2361596ea-kube-api-access-z9zzx\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.320822 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19d8979b-517e-4b02-8f5a-ead2361596ea-secret-volume\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.422455 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19d8979b-517e-4b02-8f5a-ead2361596ea-config-volume\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.422533 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9zzx\" (UniqueName: \"kubernetes.io/projected/19d8979b-517e-4b02-8f5a-ead2361596ea-kube-api-access-z9zzx\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.422650 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19d8979b-517e-4b02-8f5a-ead2361596ea-secret-volume\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.424326 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19d8979b-517e-4b02-8f5a-ead2361596ea-config-volume\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.443390 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19d8979b-517e-4b02-8f5a-ead2361596ea-secret-volume\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.454105 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9zzx\" (UniqueName: \"kubernetes.io/projected/19d8979b-517e-4b02-8f5a-ead2361596ea-kube-api-access-z9zzx\") pod \"collect-profiles-29550015-5rkqb\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.484228 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.645244 4885 scope.go:117] "RemoveContainer" containerID="e3ca62c7c9954975ab02aef71fa3fa4daa75c76bdca0ee86cf37dfeed6b0d2ef" Mar 08 20:15:00 crc kubenswrapper[4885]: I0308 20:15:00.793335 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb"] Mar 08 20:15:01 crc kubenswrapper[4885]: I0308 20:15:01.390821 4885 generic.go:334] "Generic (PLEG): container finished" podID="19d8979b-517e-4b02-8f5a-ead2361596ea" containerID="26e42e6d76089e28dc85056b673d6bddbefe770761a003ab85ecc351d49b7771" exitCode=0 Mar 08 20:15:01 crc kubenswrapper[4885]: I0308 20:15:01.391076 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" event={"ID":"19d8979b-517e-4b02-8f5a-ead2361596ea","Type":"ContainerDied","Data":"26e42e6d76089e28dc85056b673d6bddbefe770761a003ab85ecc351d49b7771"} Mar 08 20:15:01 crc kubenswrapper[4885]: I0308 20:15:01.391258 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" event={"ID":"19d8979b-517e-4b02-8f5a-ead2361596ea","Type":"ContainerStarted","Data":"c59fcb1c1493c2243458e4817fc6c053c4398a0cdac69285394e895d25eea35e"} Mar 08 20:15:02 crc kubenswrapper[4885]: I0308 20:15:02.829271 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:02 crc kubenswrapper[4885]: I0308 20:15:02.959534 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9zzx\" (UniqueName: \"kubernetes.io/projected/19d8979b-517e-4b02-8f5a-ead2361596ea-kube-api-access-z9zzx\") pod \"19d8979b-517e-4b02-8f5a-ead2361596ea\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " Mar 08 20:15:02 crc kubenswrapper[4885]: I0308 20:15:02.959699 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19d8979b-517e-4b02-8f5a-ead2361596ea-config-volume\") pod \"19d8979b-517e-4b02-8f5a-ead2361596ea\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " Mar 08 20:15:02 crc kubenswrapper[4885]: I0308 20:15:02.959898 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19d8979b-517e-4b02-8f5a-ead2361596ea-secret-volume\") pod \"19d8979b-517e-4b02-8f5a-ead2361596ea\" (UID: \"19d8979b-517e-4b02-8f5a-ead2361596ea\") " Mar 08 20:15:02 crc kubenswrapper[4885]: I0308 20:15:02.960910 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19d8979b-517e-4b02-8f5a-ead2361596ea-config-volume" (OuterVolumeSpecName: "config-volume") pod "19d8979b-517e-4b02-8f5a-ead2361596ea" (UID: "19d8979b-517e-4b02-8f5a-ead2361596ea"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:15:02 crc kubenswrapper[4885]: I0308 20:15:02.967781 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19d8979b-517e-4b02-8f5a-ead2361596ea-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "19d8979b-517e-4b02-8f5a-ead2361596ea" (UID: "19d8979b-517e-4b02-8f5a-ead2361596ea"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:15:02 crc kubenswrapper[4885]: I0308 20:15:02.967791 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19d8979b-517e-4b02-8f5a-ead2361596ea-kube-api-access-z9zzx" (OuterVolumeSpecName: "kube-api-access-z9zzx") pod "19d8979b-517e-4b02-8f5a-ead2361596ea" (UID: "19d8979b-517e-4b02-8f5a-ead2361596ea"). InnerVolumeSpecName "kube-api-access-z9zzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.061723 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/19d8979b-517e-4b02-8f5a-ead2361596ea-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.061790 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9zzx\" (UniqueName: \"kubernetes.io/projected/19d8979b-517e-4b02-8f5a-ead2361596ea-kube-api-access-z9zzx\") on node \"crc\" DevicePath \"\"" Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.061807 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19d8979b-517e-4b02-8f5a-ead2361596ea-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.416582 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.416478 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb" event={"ID":"19d8979b-517e-4b02-8f5a-ead2361596ea","Type":"ContainerDied","Data":"c59fcb1c1493c2243458e4817fc6c053c4398a0cdac69285394e895d25eea35e"} Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.417547 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c59fcb1c1493c2243458e4817fc6c053c4398a0cdac69285394e895d25eea35e" Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.925743 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv"] Mar 08 20:15:03 crc kubenswrapper[4885]: I0308 20:15:03.932179 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549970-lfczv"] Mar 08 20:15:05 crc kubenswrapper[4885]: I0308 20:15:05.385062 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1329795d-a8f9-4896-adba-23c2c0da9261" path="/var/lib/kubelet/pods/1329795d-a8f9-4896-adba-23c2c0da9261/volumes" Mar 08 20:15:09 crc kubenswrapper[4885]: I0308 20:15:09.376449 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:15:09 crc kubenswrapper[4885]: E0308 20:15:09.377562 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.172576 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5bcbs"] Mar 08 20:15:10 crc kubenswrapper[4885]: E0308 20:15:10.173351 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19d8979b-517e-4b02-8f5a-ead2361596ea" containerName="collect-profiles" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.173380 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="19d8979b-517e-4b02-8f5a-ead2361596ea" containerName="collect-profiles" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.173627 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="19d8979b-517e-4b02-8f5a-ead2361596ea" containerName="collect-profiles" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.175304 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.198728 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5bcbs"] Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.375593 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-utilities\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.375633 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dc6r\" (UniqueName: \"kubernetes.io/projected/c91aad57-6c3a-41b7-9844-b93c89a30127-kube-api-access-2dc6r\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.375908 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-catalog-content\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.480794 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-catalog-content\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.480999 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-utilities\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.481032 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dc6r\" (UniqueName: \"kubernetes.io/projected/c91aad57-6c3a-41b7-9844-b93c89a30127-kube-api-access-2dc6r\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.481533 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-catalog-content\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.481812 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-utilities\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.505813 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2dc6r\" (UniqueName: \"kubernetes.io/projected/c91aad57-6c3a-41b7-9844-b93c89a30127-kube-api-access-2dc6r\") pod \"community-operators-5bcbs\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:10 crc kubenswrapper[4885]: I0308 20:15:10.539993 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:11 crc kubenswrapper[4885]: I0308 20:15:11.065466 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5bcbs"] Mar 08 20:15:11 crc kubenswrapper[4885]: I0308 20:15:11.501653 4885 generic.go:334] "Generic (PLEG): container finished" podID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerID="65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f" exitCode=0 Mar 08 20:15:11 crc kubenswrapper[4885]: I0308 20:15:11.501712 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bcbs" event={"ID":"c91aad57-6c3a-41b7-9844-b93c89a30127","Type":"ContainerDied","Data":"65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f"} Mar 08 20:15:11 crc kubenswrapper[4885]: I0308 20:15:11.501751 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bcbs" event={"ID":"c91aad57-6c3a-41b7-9844-b93c89a30127","Type":"ContainerStarted","Data":"b7621f432752e14ef81ea4741592219bed91c1151a2d13496f884aab8c9e039e"} Mar 08 20:15:11 crc kubenswrapper[4885]: I0308 20:15:11.505309 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:15:13 crc kubenswrapper[4885]: I0308 20:15:13.524274 4885 generic.go:334] "Generic (PLEG): container finished" podID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerID="2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80" exitCode=0 Mar 08 20:15:13 crc kubenswrapper[4885]: I0308 20:15:13.524438 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bcbs" event={"ID":"c91aad57-6c3a-41b7-9844-b93c89a30127","Type":"ContainerDied","Data":"2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80"} Mar 08 20:15:14 crc kubenswrapper[4885]: I0308 20:15:14.536486 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bcbs" event={"ID":"c91aad57-6c3a-41b7-9844-b93c89a30127","Type":"ContainerStarted","Data":"308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14"} Mar 08 20:15:14 crc kubenswrapper[4885]: I0308 20:15:14.561979 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5bcbs" podStartSLOduration=2.147329028 podStartE2EDuration="4.561961177s" podCreationTimestamp="2026-03-08 20:15:10 +0000 UTC" firstStartedPulling="2026-03-08 20:15:11.50485714 +0000 UTC m=+2612.900911203" lastFinishedPulling="2026-03-08 20:15:13.919489289 +0000 UTC m=+2615.315543352" observedRunningTime="2026-03-08 20:15:14.557422916 +0000 UTC m=+2615.953476939" watchObservedRunningTime="2026-03-08 20:15:14.561961177 +0000 UTC m=+2615.958015190" Mar 08 20:15:20 crc kubenswrapper[4885]: I0308 20:15:20.541853 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:20 crc kubenswrapper[4885]: I0308 20:15:20.542682 4885 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:20 crc kubenswrapper[4885]: I0308 20:15:20.607955 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:20 crc kubenswrapper[4885]: I0308 20:15:20.701856 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:20 crc kubenswrapper[4885]: I0308 20:15:20.891188 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5bcbs"] Mar 08 20:15:21 crc kubenswrapper[4885]: I0308 20:15:21.368495 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:15:21 crc kubenswrapper[4885]: E0308 20:15:21.369430 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:15:22 crc kubenswrapper[4885]: I0308 20:15:22.609583 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5bcbs" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="registry-server" containerID="cri-o://308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14" gracePeriod=2 Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.169348 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.248340 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-catalog-content\") pod \"c91aad57-6c3a-41b7-9844-b93c89a30127\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.251171 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-utilities\") pod \"c91aad57-6c3a-41b7-9844-b93c89a30127\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.251355 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dc6r\" (UniqueName: \"kubernetes.io/projected/c91aad57-6c3a-41b7-9844-b93c89a30127-kube-api-access-2dc6r\") pod \"c91aad57-6c3a-41b7-9844-b93c89a30127\" (UID: \"c91aad57-6c3a-41b7-9844-b93c89a30127\") " Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.252333 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-utilities" (OuterVolumeSpecName: "utilities") pod "c91aad57-6c3a-41b7-9844-b93c89a30127" (UID: "c91aad57-6c3a-41b7-9844-b93c89a30127"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.257792 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c91aad57-6c3a-41b7-9844-b93c89a30127-kube-api-access-2dc6r" (OuterVolumeSpecName: "kube-api-access-2dc6r") pod "c91aad57-6c3a-41b7-9844-b93c89a30127" (UID: "c91aad57-6c3a-41b7-9844-b93c89a30127"). InnerVolumeSpecName "kube-api-access-2dc6r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.298512 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c91aad57-6c3a-41b7-9844-b93c89a30127" (UID: "c91aad57-6c3a-41b7-9844-b93c89a30127"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.352765 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dc6r\" (UniqueName: \"kubernetes.io/projected/c91aad57-6c3a-41b7-9844-b93c89a30127-kube-api-access-2dc6r\") on node \"crc\" DevicePath \"\"" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.352793 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.352803 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c91aad57-6c3a-41b7-9844-b93c89a30127-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.628185 4885 generic.go:334] "Generic (PLEG): container finished" podID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerID="308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14" exitCode=0 Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.628286 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bcbs" event={"ID":"c91aad57-6c3a-41b7-9844-b93c89a30127","Type":"ContainerDied","Data":"308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14"} Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.628354 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5bcbs" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.628377 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5bcbs" event={"ID":"c91aad57-6c3a-41b7-9844-b93c89a30127","Type":"ContainerDied","Data":"b7621f432752e14ef81ea4741592219bed91c1151a2d13496f884aab8c9e039e"} Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.628419 4885 scope.go:117] "RemoveContainer" containerID="308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.672610 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5bcbs"] Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.679360 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5bcbs"] Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.679439 4885 scope.go:117] "RemoveContainer" containerID="2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.711940 4885 scope.go:117] "RemoveContainer" containerID="65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.752022 4885 scope.go:117] "RemoveContainer" containerID="308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14" Mar 08 20:15:23 crc kubenswrapper[4885]: E0308 20:15:23.752755 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14\": container with ID starting with 308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14 not found: ID does not exist" containerID="308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.752818 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14"} err="failed to get container status \"308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14\": rpc error: code = NotFound desc = could not find container \"308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14\": container with ID starting with 308b6b7490f81e82a226ef8016b30cd962cd4bddbe79b7df82d0279e41594a14 not found: ID does not exist" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.752848 4885 scope.go:117] "RemoveContainer" containerID="2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80" Mar 08 20:15:23 crc kubenswrapper[4885]: E0308 20:15:23.753460 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80\": container with ID starting with 2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80 not found: ID does not exist" containerID="2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.753502 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80"} err="failed to get container status \"2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80\": rpc error: code = NotFound desc = could not find 
container \"2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80\": container with ID starting with 2646945dfb670d297eceb31d5b33385a35c9dfb6a878a05296a4432448f80a80 not found: ID does not exist" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.753528 4885 scope.go:117] "RemoveContainer" containerID="65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f" Mar 08 20:15:23 crc kubenswrapper[4885]: E0308 20:15:23.753968 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f\": container with ID starting with 65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f not found: ID does not exist" containerID="65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f" Mar 08 20:15:23 crc kubenswrapper[4885]: I0308 20:15:23.754049 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f"} err="failed to get container status \"65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f\": rpc error: code = NotFound desc = could not find container \"65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f\": container with ID starting with 65f295ffc969eab23aab0aa963b6f5ce8b93f4a71b80a5202448342a74cc3e6f not found: ID does not exist" Mar 08 20:15:25 crc kubenswrapper[4885]: I0308 20:15:25.384543 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" path="/var/lib/kubelet/pods/c91aad57-6c3a-41b7-9844-b93c89a30127/volumes" Mar 08 20:15:32 crc kubenswrapper[4885]: I0308 20:15:32.368415 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:15:32 crc kubenswrapper[4885]: E0308 20:15:32.370259 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:15:47 crc kubenswrapper[4885]: I0308 20:15:47.369334 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:15:47 crc kubenswrapper[4885]: E0308 20:15:47.370297 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.162579 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550016-dnrqv"] Mar 08 20:16:00 crc kubenswrapper[4885]: E0308 20:16:00.163332 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="extract-content" Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.163350 4885 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="extract-content" Mar 08 20:16:00 crc kubenswrapper[4885]: E0308 20:16:00.163365 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="registry-server" Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.163373 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="registry-server" Mar 08 20:16:00 crc kubenswrapper[4885]: E0308 20:16:00.163385 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="extract-utilities" Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.163421 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="extract-utilities" Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.163581 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c91aad57-6c3a-41b7-9844-b93c89a30127" containerName="registry-server" Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.164148 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550016-dnrqv" Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.169582 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.169798 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.170222 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.185503 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550016-dnrqv"] Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.287908 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m58qt\" (UniqueName: \"kubernetes.io/projected/9d41fb78-094e-49ae-b3d1-4bd0dca17fc0-kube-api-access-m58qt\") pod \"auto-csr-approver-29550016-dnrqv\" (UID: \"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0\") " pod="openshift-infra/auto-csr-approver-29550016-dnrqv" Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.368382 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:16:00 crc kubenswrapper[4885]: E0308 20:16:00.368681 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.389850 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m58qt\" (UniqueName: \"kubernetes.io/projected/9d41fb78-094e-49ae-b3d1-4bd0dca17fc0-kube-api-access-m58qt\") pod \"auto-csr-approver-29550016-dnrqv\" (UID: \"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0\") " pod="openshift-infra/auto-csr-approver-29550016-dnrqv" Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 
20:16:00.410592 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m58qt\" (UniqueName: \"kubernetes.io/projected/9d41fb78-094e-49ae-b3d1-4bd0dca17fc0-kube-api-access-m58qt\") pod \"auto-csr-approver-29550016-dnrqv\" (UID: \"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0\") " pod="openshift-infra/auto-csr-approver-29550016-dnrqv" Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.487374 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550016-dnrqv" Mar 08 20:16:00 crc kubenswrapper[4885]: I0308 20:16:00.785813 4885 scope.go:117] "RemoveContainer" containerID="2dc1a91346ca1a3f953a586589c1cef9384b7780b322dbba277349a4d5f8d041" Mar 08 20:16:01 crc kubenswrapper[4885]: I0308 20:16:01.027436 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550016-dnrqv"] Mar 08 20:16:01 crc kubenswrapper[4885]: I0308 20:16:01.090462 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550016-dnrqv" event={"ID":"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0","Type":"ContainerStarted","Data":"1b99350a5def51ffd7403138deecf2ddaeb74b10865b87374e723960e9827387"} Mar 08 20:16:03 crc kubenswrapper[4885]: I0308 20:16:03.108483 4885 generic.go:334] "Generic (PLEG): container finished" podID="9d41fb78-094e-49ae-b3d1-4bd0dca17fc0" containerID="a43d61c4ab66d4467b0260a033d95a916347542edf45ef6f0bb611f886b4a5b2" exitCode=0 Mar 08 20:16:03 crc kubenswrapper[4885]: I0308 20:16:03.108625 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550016-dnrqv" event={"ID":"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0","Type":"ContainerDied","Data":"a43d61c4ab66d4467b0260a033d95a916347542edf45ef6f0bb611f886b4a5b2"} Mar 08 20:16:04 crc kubenswrapper[4885]: I0308 20:16:04.468384 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550016-dnrqv" Mar 08 20:16:04 crc kubenswrapper[4885]: I0308 20:16:04.561000 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m58qt\" (UniqueName: \"kubernetes.io/projected/9d41fb78-094e-49ae-b3d1-4bd0dca17fc0-kube-api-access-m58qt\") pod \"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0\" (UID: \"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0\") " Mar 08 20:16:04 crc kubenswrapper[4885]: I0308 20:16:04.569377 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d41fb78-094e-49ae-b3d1-4bd0dca17fc0-kube-api-access-m58qt" (OuterVolumeSpecName: "kube-api-access-m58qt") pod "9d41fb78-094e-49ae-b3d1-4bd0dca17fc0" (UID: "9d41fb78-094e-49ae-b3d1-4bd0dca17fc0"). InnerVolumeSpecName "kube-api-access-m58qt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:16:04 crc kubenswrapper[4885]: I0308 20:16:04.662353 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m58qt\" (UniqueName: \"kubernetes.io/projected/9d41fb78-094e-49ae-b3d1-4bd0dca17fc0-kube-api-access-m58qt\") on node \"crc\" DevicePath \"\"" Mar 08 20:16:05 crc kubenswrapper[4885]: I0308 20:16:05.133552 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550016-dnrqv" event={"ID":"9d41fb78-094e-49ae-b3d1-4bd0dca17fc0","Type":"ContainerDied","Data":"1b99350a5def51ffd7403138deecf2ddaeb74b10865b87374e723960e9827387"} Mar 08 20:16:05 crc kubenswrapper[4885]: I0308 20:16:05.134013 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b99350a5def51ffd7403138deecf2ddaeb74b10865b87374e723960e9827387" Mar 08 20:16:05 crc kubenswrapper[4885]: I0308 20:16:05.134090 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550016-dnrqv" Mar 08 20:16:05 crc kubenswrapper[4885]: I0308 20:16:05.547519 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550010-xflbc"] Mar 08 20:16:05 crc kubenswrapper[4885]: I0308 20:16:05.555238 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550010-xflbc"] Mar 08 20:16:07 crc kubenswrapper[4885]: I0308 20:16:07.382755 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14bc9568-8018-41b0-9d6f-1b71feaa1021" path="/var/lib/kubelet/pods/14bc9568-8018-41b0-9d6f-1b71feaa1021/volumes" Mar 08 20:16:14 crc kubenswrapper[4885]: I0308 20:16:14.368117 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:16:14 crc kubenswrapper[4885]: E0308 20:16:14.369263 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:16:29 crc kubenswrapper[4885]: I0308 20:16:29.376265 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:16:29 crc kubenswrapper[4885]: E0308 20:16:29.377285 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:16:42 crc kubenswrapper[4885]: I0308 20:16:42.368221 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:16:42 crc kubenswrapper[4885]: E0308 20:16:42.369777 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:16:56 crc kubenswrapper[4885]: I0308 20:16:56.368003 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:16:56 crc kubenswrapper[4885]: E0308 20:16:56.368767 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:17:00 crc kubenswrapper[4885]: I0308 20:17:00.869051 4885 scope.go:117] "RemoveContainer" containerID="a94ec38b15c7d4aceca8f088f44ff876dec7a0f0f94b13e1c65ece3d594f4ff4" Mar 08 20:17:09 crc kubenswrapper[4885]: I0308 20:17:09.375102 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:17:09 crc kubenswrapper[4885]: E0308 20:17:09.376184 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:17:23 crc kubenswrapper[4885]: I0308 20:17:23.369051 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:17:23 crc kubenswrapper[4885]: E0308 20:17:23.369658 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:17:36 crc kubenswrapper[4885]: I0308 20:17:36.369797 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:17:36 crc kubenswrapper[4885]: E0308 20:17:36.370831 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:17:47 crc kubenswrapper[4885]: I0308 20:17:47.369570 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:17:47 crc kubenswrapper[4885]: E0308 20:17:47.370747 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.183330 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550018-kfvl8"] Mar 08 20:18:00 crc kubenswrapper[4885]: E0308 20:18:00.186107 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d41fb78-094e-49ae-b3d1-4bd0dca17fc0" containerName="oc" Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.186358 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d41fb78-094e-49ae-b3d1-4bd0dca17fc0" containerName="oc" Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.186734 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d41fb78-094e-49ae-b3d1-4bd0dca17fc0" containerName="oc" Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.187881 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550018-kfvl8" Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.191049 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.191796 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.192261 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.229883 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550018-kfvl8"] Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.304717 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7hfc\" (UniqueName: \"kubernetes.io/projected/059c0df3-29c7-4def-970d-ba52e1884b8f-kube-api-access-r7hfc\") pod \"auto-csr-approver-29550018-kfvl8\" (UID: \"059c0df3-29c7-4def-970d-ba52e1884b8f\") " pod="openshift-infra/auto-csr-approver-29550018-kfvl8" Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.369519 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:18:00 crc kubenswrapper[4885]: E0308 20:18:00.369724 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.406262 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7hfc\" (UniqueName: \"kubernetes.io/projected/059c0df3-29c7-4def-970d-ba52e1884b8f-kube-api-access-r7hfc\") pod \"auto-csr-approver-29550018-kfvl8\" (UID: \"059c0df3-29c7-4def-970d-ba52e1884b8f\") " pod="openshift-infra/auto-csr-approver-29550018-kfvl8" Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.426362 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7hfc\" (UniqueName: 
\"kubernetes.io/projected/059c0df3-29c7-4def-970d-ba52e1884b8f-kube-api-access-r7hfc\") pod \"auto-csr-approver-29550018-kfvl8\" (UID: \"059c0df3-29c7-4def-970d-ba52e1884b8f\") " pod="openshift-infra/auto-csr-approver-29550018-kfvl8" Mar 08 20:18:00 crc kubenswrapper[4885]: I0308 20:18:00.515722 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550018-kfvl8" Mar 08 20:18:01 crc kubenswrapper[4885]: I0308 20:18:01.005590 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550018-kfvl8"] Mar 08 20:18:01 crc kubenswrapper[4885]: I0308 20:18:01.213001 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550018-kfvl8" event={"ID":"059c0df3-29c7-4def-970d-ba52e1884b8f","Type":"ContainerStarted","Data":"b8e3d15551152e4d35cc2deb4275283b006136a51cb606129885cb483cb3ec1a"} Mar 08 20:18:03 crc kubenswrapper[4885]: I0308 20:18:03.235000 4885 generic.go:334] "Generic (PLEG): container finished" podID="059c0df3-29c7-4def-970d-ba52e1884b8f" containerID="bf1fcbdcbc45401a37fee6b7776fac0a669dfcd972b2838d8063a648b4ecf370" exitCode=0 Mar 08 20:18:03 crc kubenswrapper[4885]: I0308 20:18:03.235146 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550018-kfvl8" event={"ID":"059c0df3-29c7-4def-970d-ba52e1884b8f","Type":"ContainerDied","Data":"bf1fcbdcbc45401a37fee6b7776fac0a669dfcd972b2838d8063a648b4ecf370"} Mar 08 20:18:04 crc kubenswrapper[4885]: I0308 20:18:04.505832 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550018-kfvl8" Mar 08 20:18:04 crc kubenswrapper[4885]: I0308 20:18:04.593603 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7hfc\" (UniqueName: \"kubernetes.io/projected/059c0df3-29c7-4def-970d-ba52e1884b8f-kube-api-access-r7hfc\") pod \"059c0df3-29c7-4def-970d-ba52e1884b8f\" (UID: \"059c0df3-29c7-4def-970d-ba52e1884b8f\") " Mar 08 20:18:04 crc kubenswrapper[4885]: I0308 20:18:04.623152 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/059c0df3-29c7-4def-970d-ba52e1884b8f-kube-api-access-r7hfc" (OuterVolumeSpecName: "kube-api-access-r7hfc") pod "059c0df3-29c7-4def-970d-ba52e1884b8f" (UID: "059c0df3-29c7-4def-970d-ba52e1884b8f"). InnerVolumeSpecName "kube-api-access-r7hfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:18:04 crc kubenswrapper[4885]: I0308 20:18:04.695525 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7hfc\" (UniqueName: \"kubernetes.io/projected/059c0df3-29c7-4def-970d-ba52e1884b8f-kube-api-access-r7hfc\") on node \"crc\" DevicePath \"\"" Mar 08 20:18:05 crc kubenswrapper[4885]: I0308 20:18:05.252703 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550018-kfvl8" event={"ID":"059c0df3-29c7-4def-970d-ba52e1884b8f","Type":"ContainerDied","Data":"b8e3d15551152e4d35cc2deb4275283b006136a51cb606129885cb483cb3ec1a"} Mar 08 20:18:05 crc kubenswrapper[4885]: I0308 20:18:05.252794 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8e3d15551152e4d35cc2deb4275283b006136a51cb606129885cb483cb3ec1a" Mar 08 20:18:05 crc kubenswrapper[4885]: I0308 20:18:05.252750 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550018-kfvl8" Mar 08 20:18:05 crc kubenswrapper[4885]: I0308 20:18:05.593050 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550012-l9ql6"] Mar 08 20:18:05 crc kubenswrapper[4885]: I0308 20:18:05.598048 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550012-l9ql6"] Mar 08 20:18:07 crc kubenswrapper[4885]: I0308 20:18:07.386968 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c0dd782-7d55-432a-a4c4-72eab3a342f0" path="/var/lib/kubelet/pods/8c0dd782-7d55-432a-a4c4-72eab3a342f0/volumes" Mar 08 20:18:11 crc kubenswrapper[4885]: I0308 20:18:11.368352 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:18:12 crc kubenswrapper[4885]: I0308 20:18:12.318570 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"91908c0f002af6ada70bcd1ada5cc456d680033640f7f942b2ba88a86e668929"} Mar 08 20:19:00 crc kubenswrapper[4885]: I0308 20:19:00.993615 4885 scope.go:117] "RemoveContainer" containerID="88a04df5e845bc6b15ebd93cf0a05c83311989ab76aa928e4be86bfc18c02a82" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.163748 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550020-vm6nl"] Mar 08 20:20:00 crc kubenswrapper[4885]: E0308 20:20:00.164891 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="059c0df3-29c7-4def-970d-ba52e1884b8f" containerName="oc" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.164915 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="059c0df3-29c7-4def-970d-ba52e1884b8f" containerName="oc" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.165364 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="059c0df3-29c7-4def-970d-ba52e1884b8f" containerName="oc" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.166308 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550020-vm6nl" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.170024 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.170747 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.170812 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.178665 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550020-vm6nl"] Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.315352 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwfx4\" (UniqueName: \"kubernetes.io/projected/c98e098b-03c5-49cd-8009-4d1dde33cd6d-kube-api-access-nwfx4\") pod \"auto-csr-approver-29550020-vm6nl\" (UID: \"c98e098b-03c5-49cd-8009-4d1dde33cd6d\") " pod="openshift-infra/auto-csr-approver-29550020-vm6nl" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.417366 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwfx4\" (UniqueName: \"kubernetes.io/projected/c98e098b-03c5-49cd-8009-4d1dde33cd6d-kube-api-access-nwfx4\") pod \"auto-csr-approver-29550020-vm6nl\" (UID: \"c98e098b-03c5-49cd-8009-4d1dde33cd6d\") " pod="openshift-infra/auto-csr-approver-29550020-vm6nl" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.453207 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwfx4\" (UniqueName: \"kubernetes.io/projected/c98e098b-03c5-49cd-8009-4d1dde33cd6d-kube-api-access-nwfx4\") pod \"auto-csr-approver-29550020-vm6nl\" (UID: \"c98e098b-03c5-49cd-8009-4d1dde33cd6d\") " pod="openshift-infra/auto-csr-approver-29550020-vm6nl" Mar 08 20:20:00 crc kubenswrapper[4885]: I0308 20:20:00.501145 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550020-vm6nl" Mar 08 20:20:01 crc kubenswrapper[4885]: I0308 20:20:01.012566 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550020-vm6nl"] Mar 08 20:20:01 crc kubenswrapper[4885]: I0308 20:20:01.360254 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550020-vm6nl" event={"ID":"c98e098b-03c5-49cd-8009-4d1dde33cd6d","Type":"ContainerStarted","Data":"209ee4e8df421b11b91168058f4625287aa1bd9e162fc26f5ae45913d4df07ec"} Mar 08 20:20:03 crc kubenswrapper[4885]: I0308 20:20:03.387179 4885 generic.go:334] "Generic (PLEG): container finished" podID="c98e098b-03c5-49cd-8009-4d1dde33cd6d" containerID="c67ebacbdb0212a697eaa79c423debddfd753cb26e765c85de268bbfc9ff9476" exitCode=0 Mar 08 20:20:03 crc kubenswrapper[4885]: I0308 20:20:03.387261 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550020-vm6nl" event={"ID":"c98e098b-03c5-49cd-8009-4d1dde33cd6d","Type":"ContainerDied","Data":"c67ebacbdb0212a697eaa79c423debddfd753cb26e765c85de268bbfc9ff9476"} Mar 08 20:20:04 crc kubenswrapper[4885]: I0308 20:20:04.736248 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550020-vm6nl" Mar 08 20:20:04 crc kubenswrapper[4885]: I0308 20:20:04.885799 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwfx4\" (UniqueName: \"kubernetes.io/projected/c98e098b-03c5-49cd-8009-4d1dde33cd6d-kube-api-access-nwfx4\") pod \"c98e098b-03c5-49cd-8009-4d1dde33cd6d\" (UID: \"c98e098b-03c5-49cd-8009-4d1dde33cd6d\") " Mar 08 20:20:04 crc kubenswrapper[4885]: I0308 20:20:04.897174 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c98e098b-03c5-49cd-8009-4d1dde33cd6d-kube-api-access-nwfx4" (OuterVolumeSpecName: "kube-api-access-nwfx4") pod "c98e098b-03c5-49cd-8009-4d1dde33cd6d" (UID: "c98e098b-03c5-49cd-8009-4d1dde33cd6d"). InnerVolumeSpecName "kube-api-access-nwfx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:20:04 crc kubenswrapper[4885]: I0308 20:20:04.987973 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwfx4\" (UniqueName: \"kubernetes.io/projected/c98e098b-03c5-49cd-8009-4d1dde33cd6d-kube-api-access-nwfx4\") on node \"crc\" DevicePath \"\"" Mar 08 20:20:05 crc kubenswrapper[4885]: I0308 20:20:05.417196 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550020-vm6nl" event={"ID":"c98e098b-03c5-49cd-8009-4d1dde33cd6d","Type":"ContainerDied","Data":"209ee4e8df421b11b91168058f4625287aa1bd9e162fc26f5ae45913d4df07ec"} Mar 08 20:20:05 crc kubenswrapper[4885]: I0308 20:20:05.417253 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="209ee4e8df421b11b91168058f4625287aa1bd9e162fc26f5ae45913d4df07ec" Mar 08 20:20:05 crc kubenswrapper[4885]: I0308 20:20:05.417306 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550020-vm6nl" Mar 08 20:20:05 crc kubenswrapper[4885]: I0308 20:20:05.840856 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550014-9hsx2"] Mar 08 20:20:05 crc kubenswrapper[4885]: I0308 20:20:05.851341 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550014-9hsx2"] Mar 08 20:20:07 crc kubenswrapper[4885]: I0308 20:20:07.385400 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fba06a47-da5c-44a3-9184-d2d92d14ce91" path="/var/lib/kubelet/pods/fba06a47-da5c-44a3-9184-d2d92d14ce91/volumes" Mar 08 20:20:32 crc kubenswrapper[4885]: I0308 20:20:32.818052 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:20:32 crc kubenswrapper[4885]: I0308 20:20:32.818846 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:21:01 crc kubenswrapper[4885]: I0308 20:21:01.107559 4885 scope.go:117] "RemoveContainer" containerID="c1bd976fbf85046e9452c23b99e63722a390fc224cc0805a726ba4a52e322c78" Mar 08 20:21:02 crc kubenswrapper[4885]: I0308 20:21:02.819031 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:21:02 crc kubenswrapper[4885]: I0308 20:21:02.819458 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:21:32 crc kubenswrapper[4885]: I0308 20:21:32.818802 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:21:32 crc kubenswrapper[4885]: I0308 20:21:32.819611 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:21:32 crc kubenswrapper[4885]: I0308 20:21:32.819671 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:21:32 crc kubenswrapper[4885]: I0308 20:21:32.820845 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"91908c0f002af6ada70bcd1ada5cc456d680033640f7f942b2ba88a86e668929"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:21:32 crc kubenswrapper[4885]: I0308 20:21:32.820993 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://91908c0f002af6ada70bcd1ada5cc456d680033640f7f942b2ba88a86e668929" gracePeriod=600 Mar 08 20:21:33 crc kubenswrapper[4885]: I0308 20:21:33.304358 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="91908c0f002af6ada70bcd1ada5cc456d680033640f7f942b2ba88a86e668929" exitCode=0 Mar 08 20:21:33 crc kubenswrapper[4885]: I0308 20:21:33.304579 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"91908c0f002af6ada70bcd1ada5cc456d680033640f7f942b2ba88a86e668929"} Mar 08 20:21:33 crc kubenswrapper[4885]: I0308 20:21:33.305067 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36"} Mar 08 20:21:33 crc kubenswrapper[4885]: I0308 20:21:33.305179 4885 scope.go:117] "RemoveContainer" containerID="52c5ab1e50d12b98ec006544aa290d2382d2eb389b9307d8514b9964e5b26675" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.154464 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550022-cwspw"] Mar 08 20:22:00 crc kubenswrapper[4885]: E0308 20:22:00.155602 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c98e098b-03c5-49cd-8009-4d1dde33cd6d" containerName="oc" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.155645 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c98e098b-03c5-49cd-8009-4d1dde33cd6d" containerName="oc" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.155940 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c98e098b-03c5-49cd-8009-4d1dde33cd6d" containerName="oc" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.156450 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550022-cwspw" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.162763 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.162869 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.163335 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.169980 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550022-cwspw"] Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.276000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccmqp\" (UniqueName: \"kubernetes.io/projected/9c1236a4-ac5f-4b66-8064-a0877ea3eb13-kube-api-access-ccmqp\") pod \"auto-csr-approver-29550022-cwspw\" (UID: \"9c1236a4-ac5f-4b66-8064-a0877ea3eb13\") " pod="openshift-infra/auto-csr-approver-29550022-cwspw" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.377679 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccmqp\" (UniqueName: \"kubernetes.io/projected/9c1236a4-ac5f-4b66-8064-a0877ea3eb13-kube-api-access-ccmqp\") pod \"auto-csr-approver-29550022-cwspw\" (UID: \"9c1236a4-ac5f-4b66-8064-a0877ea3eb13\") " pod="openshift-infra/auto-csr-approver-29550022-cwspw" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.407814 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccmqp\" (UniqueName: \"kubernetes.io/projected/9c1236a4-ac5f-4b66-8064-a0877ea3eb13-kube-api-access-ccmqp\") pod \"auto-csr-approver-29550022-cwspw\" (UID: \"9c1236a4-ac5f-4b66-8064-a0877ea3eb13\") " pod="openshift-infra/auto-csr-approver-29550022-cwspw" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.519326 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550022-cwspw" Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.786285 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550022-cwspw"] Mar 08 20:22:00 crc kubenswrapper[4885]: I0308 20:22:00.802542 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:22:01 crc kubenswrapper[4885]: I0308 20:22:01.575632 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550022-cwspw" event={"ID":"9c1236a4-ac5f-4b66-8064-a0877ea3eb13","Type":"ContainerStarted","Data":"ca296d5c26e34896d77d029616e76eb2c8c7bb8af0a56bf88bc8e7b3b5612378"} Mar 08 20:22:02 crc kubenswrapper[4885]: I0308 20:22:02.585883 4885 generic.go:334] "Generic (PLEG): container finished" podID="9c1236a4-ac5f-4b66-8064-a0877ea3eb13" containerID="02239c6874bdb24f3dcd1bad4a5e3559f6f779758a7821e5dae46f6c1d9294ea" exitCode=0 Mar 08 20:22:02 crc kubenswrapper[4885]: I0308 20:22:02.586006 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550022-cwspw" event={"ID":"9c1236a4-ac5f-4b66-8064-a0877ea3eb13","Type":"ContainerDied","Data":"02239c6874bdb24f3dcd1bad4a5e3559f6f779758a7821e5dae46f6c1d9294ea"} Mar 08 20:22:04 crc kubenswrapper[4885]: I0308 20:22:04.038736 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550022-cwspw" Mar 08 20:22:04 crc kubenswrapper[4885]: I0308 20:22:04.135175 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccmqp\" (UniqueName: \"kubernetes.io/projected/9c1236a4-ac5f-4b66-8064-a0877ea3eb13-kube-api-access-ccmqp\") pod \"9c1236a4-ac5f-4b66-8064-a0877ea3eb13\" (UID: \"9c1236a4-ac5f-4b66-8064-a0877ea3eb13\") " Mar 08 20:22:04 crc kubenswrapper[4885]: I0308 20:22:04.140881 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c1236a4-ac5f-4b66-8064-a0877ea3eb13-kube-api-access-ccmqp" (OuterVolumeSpecName: "kube-api-access-ccmqp") pod "9c1236a4-ac5f-4b66-8064-a0877ea3eb13" (UID: "9c1236a4-ac5f-4b66-8064-a0877ea3eb13"). InnerVolumeSpecName "kube-api-access-ccmqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:22:04 crc kubenswrapper[4885]: I0308 20:22:04.236402 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccmqp\" (UniqueName: \"kubernetes.io/projected/9c1236a4-ac5f-4b66-8064-a0877ea3eb13-kube-api-access-ccmqp\") on node \"crc\" DevicePath \"\"" Mar 08 20:22:04 crc kubenswrapper[4885]: I0308 20:22:04.617911 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550022-cwspw" event={"ID":"9c1236a4-ac5f-4b66-8064-a0877ea3eb13","Type":"ContainerDied","Data":"ca296d5c26e34896d77d029616e76eb2c8c7bb8af0a56bf88bc8e7b3b5612378"} Mar 08 20:22:04 crc kubenswrapper[4885]: I0308 20:22:04.618002 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca296d5c26e34896d77d029616e76eb2c8c7bb8af0a56bf88bc8e7b3b5612378" Mar 08 20:22:04 crc kubenswrapper[4885]: I0308 20:22:04.618083 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550022-cwspw" Mar 08 20:22:05 crc kubenswrapper[4885]: I0308 20:22:05.129133 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550016-dnrqv"] Mar 08 20:22:05 crc kubenswrapper[4885]: I0308 20:22:05.135647 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550016-dnrqv"] Mar 08 20:22:05 crc kubenswrapper[4885]: I0308 20:22:05.382403 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d41fb78-094e-49ae-b3d1-4bd0dca17fc0" path="/var/lib/kubelet/pods/9d41fb78-094e-49ae-b3d1-4bd0dca17fc0/volumes" Mar 08 20:23:01 crc kubenswrapper[4885]: I0308 20:23:01.226580 4885 scope.go:117] "RemoveContainer" containerID="a43d61c4ab66d4467b0260a033d95a916347542edf45ef6f0bb611f886b4a5b2" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.160518 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550024-9zz2p"] Mar 08 20:24:00 crc kubenswrapper[4885]: E0308 20:24:00.161622 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1236a4-ac5f-4b66-8064-a0877ea3eb13" containerName="oc" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.161646 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1236a4-ac5f-4b66-8064-a0877ea3eb13" containerName="oc" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.161900 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c1236a4-ac5f-4b66-8064-a0877ea3eb13" containerName="oc" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.162663 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550024-9zz2p" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.166330 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.166774 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.166683 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.179713 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550024-9zz2p"] Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.323482 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xctl5\" (UniqueName: \"kubernetes.io/projected/1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104-kube-api-access-xctl5\") pod \"auto-csr-approver-29550024-9zz2p\" (UID: \"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104\") " pod="openshift-infra/auto-csr-approver-29550024-9zz2p" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.425184 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xctl5\" (UniqueName: \"kubernetes.io/projected/1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104-kube-api-access-xctl5\") pod \"auto-csr-approver-29550024-9zz2p\" (UID: \"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104\") " pod="openshift-infra/auto-csr-approver-29550024-9zz2p" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.459618 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xctl5\" (UniqueName: 
\"kubernetes.io/projected/1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104-kube-api-access-xctl5\") pod \"auto-csr-approver-29550024-9zz2p\" (UID: \"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104\") " pod="openshift-infra/auto-csr-approver-29550024-9zz2p" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.496226 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550024-9zz2p" Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.803815 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550024-9zz2p"] Mar 08 20:24:00 crc kubenswrapper[4885]: I0308 20:24:00.822399 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550024-9zz2p" event={"ID":"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104","Type":"ContainerStarted","Data":"ad70ce182b1fb55e4e39317e12a5c25b3c761fefbf247fb843baee1de63015d0"} Mar 08 20:24:02 crc kubenswrapper[4885]: I0308 20:24:02.819829 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:24:02 crc kubenswrapper[4885]: I0308 20:24:02.820327 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:24:02 crc kubenswrapper[4885]: I0308 20:24:02.844404 4885 generic.go:334] "Generic (PLEG): container finished" podID="1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104" containerID="43f525f1dbdfe3e17033373036412190ee626493c56e86fcd87bd80de646fc57" exitCode=0 Mar 08 20:24:02 crc kubenswrapper[4885]: I0308 20:24:02.844488 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550024-9zz2p" event={"ID":"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104","Type":"ContainerDied","Data":"43f525f1dbdfe3e17033373036412190ee626493c56e86fcd87bd80de646fc57"} Mar 08 20:24:04 crc kubenswrapper[4885]: I0308 20:24:04.240883 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550024-9zz2p" Mar 08 20:24:04 crc kubenswrapper[4885]: I0308 20:24:04.394719 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xctl5\" (UniqueName: \"kubernetes.io/projected/1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104-kube-api-access-xctl5\") pod \"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104\" (UID: \"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104\") " Mar 08 20:24:04 crc kubenswrapper[4885]: I0308 20:24:04.405225 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104-kube-api-access-xctl5" (OuterVolumeSpecName: "kube-api-access-xctl5") pod "1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104" (UID: "1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104"). InnerVolumeSpecName "kube-api-access-xctl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:24:04 crc kubenswrapper[4885]: I0308 20:24:04.497229 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xctl5\" (UniqueName: \"kubernetes.io/projected/1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104-kube-api-access-xctl5\") on node \"crc\" DevicePath \"\"" Mar 08 20:24:04 crc kubenswrapper[4885]: I0308 20:24:04.865907 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550024-9zz2p" event={"ID":"1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104","Type":"ContainerDied","Data":"ad70ce182b1fb55e4e39317e12a5c25b3c761fefbf247fb843baee1de63015d0"} Mar 08 20:24:04 crc kubenswrapper[4885]: I0308 20:24:04.866291 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad70ce182b1fb55e4e39317e12a5c25b3c761fefbf247fb843baee1de63015d0" Mar 08 20:24:04 crc kubenswrapper[4885]: I0308 20:24:04.866019 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550024-9zz2p" Mar 08 20:24:05 crc kubenswrapper[4885]: I0308 20:24:05.330965 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550018-kfvl8"] Mar 08 20:24:05 crc kubenswrapper[4885]: I0308 20:24:05.340223 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550018-kfvl8"] Mar 08 20:24:05 crc kubenswrapper[4885]: I0308 20:24:05.384646 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="059c0df3-29c7-4def-970d-ba52e1884b8f" path="/var/lib/kubelet/pods/059c0df3-29c7-4def-970d-ba52e1884b8f/volumes" Mar 08 20:24:32 crc kubenswrapper[4885]: I0308 20:24:32.818064 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:24:32 crc kubenswrapper[4885]: I0308 20:24:32.819690 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.227844 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6zr2n"] Mar 08 20:24:36 crc kubenswrapper[4885]: E0308 20:24:36.231143 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104" containerName="oc" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.231178 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104" containerName="oc" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.231595 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104" containerName="oc" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.235645 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zr2n"] Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.236089 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.368572 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-utilities\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.368625 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-catalog-content\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.368739 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpzr4\" (UniqueName: \"kubernetes.io/projected/9dc24a73-d641-47de-9542-5898804547cf-kube-api-access-dpzr4\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.470480 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpzr4\" (UniqueName: \"kubernetes.io/projected/9dc24a73-d641-47de-9542-5898804547cf-kube-api-access-dpzr4\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.470568 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-utilities\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.470609 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-catalog-content\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.471051 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-catalog-content\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.471506 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-utilities\") pod \"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.512720 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpzr4\" (UniqueName: \"kubernetes.io/projected/9dc24a73-d641-47de-9542-5898804547cf-kube-api-access-dpzr4\") pod 
\"redhat-marketplace-6zr2n\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:36 crc kubenswrapper[4885]: I0308 20:24:36.564734 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.049124 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zr2n"] Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.213151 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5wgq8"] Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.216629 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.232447 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zr2n" event={"ID":"9dc24a73-d641-47de-9542-5898804547cf","Type":"ContainerStarted","Data":"d8abcb92c837d0e997506f5dff9884df30ccd03f8039a5bdc556e91073778235"} Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.259463 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5wgq8"] Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.386905 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-695js\" (UniqueName: \"kubernetes.io/projected/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-kube-api-access-695js\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.387002 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-catalog-content\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.387045 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-utilities\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.488066 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-695js\" (UniqueName: \"kubernetes.io/projected/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-kube-api-access-695js\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.488120 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-catalog-content\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.488164 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-utilities\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.488769 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-catalog-content\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.488844 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-utilities\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.508148 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-695js\" (UniqueName: \"kubernetes.io/projected/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-kube-api-access-695js\") pod \"redhat-operators-5wgq8\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.552116 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:37 crc kubenswrapper[4885]: I0308 20:24:37.976291 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5wgq8"] Mar 08 20:24:37 crc kubenswrapper[4885]: W0308 20:24:37.986606 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37ef9b83_5e44_4dd4_917b_0cc2b22994a7.slice/crio-a1d5a5740ffa83d735afd40a7eaa37bf1f2399163d49ebd9aa6df35eb7248970 WatchSource:0}: Error finding container a1d5a5740ffa83d735afd40a7eaa37bf1f2399163d49ebd9aa6df35eb7248970: Status 404 returned error can't find the container with id a1d5a5740ffa83d735afd40a7eaa37bf1f2399163d49ebd9aa6df35eb7248970 Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.242454 4885 generic.go:334] "Generic (PLEG): container finished" podID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerID="76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755" exitCode=0 Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.242561 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wgq8" event={"ID":"37ef9b83-5e44-4dd4-917b-0cc2b22994a7","Type":"ContainerDied","Data":"76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755"} Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.242593 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wgq8" event={"ID":"37ef9b83-5e44-4dd4-917b-0cc2b22994a7","Type":"ContainerStarted","Data":"a1d5a5740ffa83d735afd40a7eaa37bf1f2399163d49ebd9aa6df35eb7248970"} Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.245820 4885 generic.go:334] "Generic (PLEG): container finished" podID="9dc24a73-d641-47de-9542-5898804547cf" containerID="e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d" exitCode=0 Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.245865 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-6zr2n" event={"ID":"9dc24a73-d641-47de-9542-5898804547cf","Type":"ContainerDied","Data":"e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d"} Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.608415 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5v28s"] Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.609771 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.626456 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5v28s"] Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.701867 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-utilities\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.701973 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b2r4\" (UniqueName: \"kubernetes.io/projected/0f86ef9b-5ca5-425b-aec8-17efc661afb8-kube-api-access-8b2r4\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.702001 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-catalog-content\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.804063 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8b2r4\" (UniqueName: \"kubernetes.io/projected/0f86ef9b-5ca5-425b-aec8-17efc661afb8-kube-api-access-8b2r4\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.804140 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-catalog-content\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.804320 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-utilities\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.804728 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-catalog-content\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " 
pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.804791 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-utilities\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.835357 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b2r4\" (UniqueName: \"kubernetes.io/projected/0f86ef9b-5ca5-425b-aec8-17efc661afb8-kube-api-access-8b2r4\") pod \"certified-operators-5v28s\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:38 crc kubenswrapper[4885]: I0308 20:24:38.934099 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:39 crc kubenswrapper[4885]: I0308 20:24:39.254397 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zr2n" event={"ID":"9dc24a73-d641-47de-9542-5898804547cf","Type":"ContainerStarted","Data":"cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a"} Mar 08 20:24:39 crc kubenswrapper[4885]: I0308 20:24:39.260374 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wgq8" event={"ID":"37ef9b83-5e44-4dd4-917b-0cc2b22994a7","Type":"ContainerStarted","Data":"cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d"} Mar 08 20:24:39 crc kubenswrapper[4885]: I0308 20:24:39.421640 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5v28s"] Mar 08 20:24:39 crc kubenswrapper[4885]: W0308 20:24:39.451627 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f86ef9b_5ca5_425b_aec8_17efc661afb8.slice/crio-c6c42b763278541056b77e1140376fc7d055c7c270f1b8a2f7313a7c8600e3fe WatchSource:0}: Error finding container c6c42b763278541056b77e1140376fc7d055c7c270f1b8a2f7313a7c8600e3fe: Status 404 returned error can't find the container with id c6c42b763278541056b77e1140376fc7d055c7c270f1b8a2f7313a7c8600e3fe Mar 08 20:24:40 crc kubenswrapper[4885]: I0308 20:24:40.277740 4885 generic.go:334] "Generic (PLEG): container finished" podID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerID="8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b" exitCode=0 Mar 08 20:24:40 crc kubenswrapper[4885]: I0308 20:24:40.277882 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v28s" event={"ID":"0f86ef9b-5ca5-425b-aec8-17efc661afb8","Type":"ContainerDied","Data":"8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b"} Mar 08 20:24:40 crc kubenswrapper[4885]: I0308 20:24:40.278421 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v28s" event={"ID":"0f86ef9b-5ca5-425b-aec8-17efc661afb8","Type":"ContainerStarted","Data":"c6c42b763278541056b77e1140376fc7d055c7c270f1b8a2f7313a7c8600e3fe"} Mar 08 20:24:40 crc kubenswrapper[4885]: I0308 20:24:40.283066 4885 generic.go:334] "Generic (PLEG): container finished" podID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerID="cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d" exitCode=0 Mar 08 20:24:40 
crc kubenswrapper[4885]: I0308 20:24:40.283152 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wgq8" event={"ID":"37ef9b83-5e44-4dd4-917b-0cc2b22994a7","Type":"ContainerDied","Data":"cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d"} Mar 08 20:24:40 crc kubenswrapper[4885]: I0308 20:24:40.299326 4885 generic.go:334] "Generic (PLEG): container finished" podID="9dc24a73-d641-47de-9542-5898804547cf" containerID="cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a" exitCode=0 Mar 08 20:24:40 crc kubenswrapper[4885]: I0308 20:24:40.299390 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zr2n" event={"ID":"9dc24a73-d641-47de-9542-5898804547cf","Type":"ContainerDied","Data":"cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a"} Mar 08 20:24:41 crc kubenswrapper[4885]: I0308 20:24:41.312113 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zr2n" event={"ID":"9dc24a73-d641-47de-9542-5898804547cf","Type":"ContainerStarted","Data":"ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef"} Mar 08 20:24:41 crc kubenswrapper[4885]: I0308 20:24:41.315168 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v28s" event={"ID":"0f86ef9b-5ca5-425b-aec8-17efc661afb8","Type":"ContainerStarted","Data":"a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714"} Mar 08 20:24:41 crc kubenswrapper[4885]: I0308 20:24:41.318441 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wgq8" event={"ID":"37ef9b83-5e44-4dd4-917b-0cc2b22994a7","Type":"ContainerStarted","Data":"e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8"} Mar 08 20:24:41 crc kubenswrapper[4885]: I0308 20:24:41.343099 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6zr2n" podStartSLOduration=2.886295057 podStartE2EDuration="5.343068231s" podCreationTimestamp="2026-03-08 20:24:36 +0000 UTC" firstStartedPulling="2026-03-08 20:24:38.247579665 +0000 UTC m=+3179.643633688" lastFinishedPulling="2026-03-08 20:24:40.704352799 +0000 UTC m=+3182.100406862" observedRunningTime="2026-03-08 20:24:41.338211571 +0000 UTC m=+3182.734265644" watchObservedRunningTime="2026-03-08 20:24:41.343068231 +0000 UTC m=+3182.739122294" Mar 08 20:24:41 crc kubenswrapper[4885]: I0308 20:24:41.367558 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5wgq8" podStartSLOduration=1.8996834219999998 podStartE2EDuration="4.367541962s" podCreationTimestamp="2026-03-08 20:24:37 +0000 UTC" firstStartedPulling="2026-03-08 20:24:38.24402052 +0000 UTC m=+3179.640074553" lastFinishedPulling="2026-03-08 20:24:40.71187904 +0000 UTC m=+3182.107933093" observedRunningTime="2026-03-08 20:24:41.364863561 +0000 UTC m=+3182.760917654" watchObservedRunningTime="2026-03-08 20:24:41.367541962 +0000 UTC m=+3182.763595985" Mar 08 20:24:42 crc kubenswrapper[4885]: I0308 20:24:42.336025 4885 generic.go:334] "Generic (PLEG): container finished" podID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerID="a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714" exitCode=0 Mar 08 20:24:42 crc kubenswrapper[4885]: I0308 20:24:42.336119 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v28s" 
event={"ID":"0f86ef9b-5ca5-425b-aec8-17efc661afb8","Type":"ContainerDied","Data":"a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714"} Mar 08 20:24:43 crc kubenswrapper[4885]: I0308 20:24:43.343673 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v28s" event={"ID":"0f86ef9b-5ca5-425b-aec8-17efc661afb8","Type":"ContainerStarted","Data":"c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0"} Mar 08 20:24:43 crc kubenswrapper[4885]: I0308 20:24:43.367993 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5v28s" podStartSLOduration=2.892734866 podStartE2EDuration="5.367975272s" podCreationTimestamp="2026-03-08 20:24:38 +0000 UTC" firstStartedPulling="2026-03-08 20:24:40.280545551 +0000 UTC m=+3181.676599584" lastFinishedPulling="2026-03-08 20:24:42.755785957 +0000 UTC m=+3184.151839990" observedRunningTime="2026-03-08 20:24:43.366068762 +0000 UTC m=+3184.762122795" watchObservedRunningTime="2026-03-08 20:24:43.367975272 +0000 UTC m=+3184.764029295" Mar 08 20:24:46 crc kubenswrapper[4885]: I0308 20:24:46.565613 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:46 crc kubenswrapper[4885]: I0308 20:24:46.565906 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:46 crc kubenswrapper[4885]: I0308 20:24:46.634007 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:47 crc kubenswrapper[4885]: I0308 20:24:47.457053 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:47 crc kubenswrapper[4885]: I0308 20:24:47.552294 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:47 crc kubenswrapper[4885]: I0308 20:24:47.552698 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:48 crc kubenswrapper[4885]: I0308 20:24:48.626045 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5wgq8" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="registry-server" probeResult="failure" output=< Mar 08 20:24:48 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 20:24:48 crc kubenswrapper[4885]: > Mar 08 20:24:48 crc kubenswrapper[4885]: I0308 20:24:48.934689 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:48 crc kubenswrapper[4885]: I0308 20:24:48.934790 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:49 crc kubenswrapper[4885]: I0308 20:24:49.007702 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:49 crc kubenswrapper[4885]: I0308 20:24:49.205456 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zr2n"] Mar 08 20:24:49 crc kubenswrapper[4885]: I0308 20:24:49.398588 4885 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-6zr2n" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="registry-server" containerID="cri-o://ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef" gracePeriod=2 Mar 08 20:24:49 crc kubenswrapper[4885]: I0308 20:24:49.484288 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:49 crc kubenswrapper[4885]: I0308 20:24:49.924357 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.086279 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpzr4\" (UniqueName: \"kubernetes.io/projected/9dc24a73-d641-47de-9542-5898804547cf-kube-api-access-dpzr4\") pod \"9dc24a73-d641-47de-9542-5898804547cf\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.086387 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-utilities\") pod \"9dc24a73-d641-47de-9542-5898804547cf\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.086429 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-catalog-content\") pod \"9dc24a73-d641-47de-9542-5898804547cf\" (UID: \"9dc24a73-d641-47de-9542-5898804547cf\") " Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.087608 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-utilities" (OuterVolumeSpecName: "utilities") pod "9dc24a73-d641-47de-9542-5898804547cf" (UID: "9dc24a73-d641-47de-9542-5898804547cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.095485 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9dc24a73-d641-47de-9542-5898804547cf-kube-api-access-dpzr4" (OuterVolumeSpecName: "kube-api-access-dpzr4") pod "9dc24a73-d641-47de-9542-5898804547cf" (UID: "9dc24a73-d641-47de-9542-5898804547cf"). InnerVolumeSpecName "kube-api-access-dpzr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.111097 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9dc24a73-d641-47de-9542-5898804547cf" (UID: "9dc24a73-d641-47de-9542-5898804547cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.188108 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.188677 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9dc24a73-d641-47de-9542-5898804547cf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.188745 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpzr4\" (UniqueName: \"kubernetes.io/projected/9dc24a73-d641-47de-9542-5898804547cf-kube-api-access-dpzr4\") on node \"crc\" DevicePath \"\"" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.414063 4885 generic.go:334] "Generic (PLEG): container finished" podID="9dc24a73-d641-47de-9542-5898804547cf" containerID="ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef" exitCode=0 Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.414158 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6zr2n" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.414262 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zr2n" event={"ID":"9dc24a73-d641-47de-9542-5898804547cf","Type":"ContainerDied","Data":"ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef"} Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.414405 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6zr2n" event={"ID":"9dc24a73-d641-47de-9542-5898804547cf","Type":"ContainerDied","Data":"d8abcb92c837d0e997506f5dff9884df30ccd03f8039a5bdc556e91073778235"} Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.414445 4885 scope.go:117] "RemoveContainer" containerID="ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.444519 4885 scope.go:117] "RemoveContainer" containerID="cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.477134 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zr2n"] Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.488674 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6zr2n"] Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.500257 4885 scope.go:117] "RemoveContainer" containerID="e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.525006 4885 scope.go:117] "RemoveContainer" containerID="ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef" Mar 08 20:24:50 crc kubenswrapper[4885]: E0308 20:24:50.538344 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef\": container with ID starting with ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef not found: ID does not exist" containerID="ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.538409 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef"} err="failed to get container status \"ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef\": rpc error: code = NotFound desc = could not find container \"ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef\": container with ID starting with ffff81cd87d04d5304284fbe05a26ca18a07639d8f13efbfa72ea9469abe10ef not found: ID does not exist" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.538449 4885 scope.go:117] "RemoveContainer" containerID="cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a" Mar 08 20:24:50 crc kubenswrapper[4885]: E0308 20:24:50.539257 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a\": container with ID starting with cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a not found: ID does not exist" containerID="cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.539320 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a"} err="failed to get container status \"cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a\": rpc error: code = NotFound desc = could not find container \"cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a\": container with ID starting with cbb6eab9f142be019b82610aa8e86c513cfe185cc89b623edff16f8b089d7a2a not found: ID does not exist" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.539361 4885 scope.go:117] "RemoveContainer" containerID="e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d" Mar 08 20:24:50 crc kubenswrapper[4885]: E0308 20:24:50.540247 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d\": container with ID starting with e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d not found: ID does not exist" containerID="e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d" Mar 08 20:24:50 crc kubenswrapper[4885]: I0308 20:24:50.540312 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d"} err="failed to get container status \"e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d\": rpc error: code = NotFound desc = could not find container \"e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d\": container with ID starting with e6a63870597fff021faffcbf1c123dee5613fd7cce99a8c8ee0877023109da7d not found: ID does not exist" Mar 08 20:24:51 crc kubenswrapper[4885]: I0308 20:24:51.383025 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9dc24a73-d641-47de-9542-5898804547cf" path="/var/lib/kubelet/pods/9dc24a73-d641-47de-9542-5898804547cf/volumes" Mar 08 20:24:51 crc kubenswrapper[4885]: I0308 20:24:51.402640 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5v28s"] Mar 08 20:24:51 crc kubenswrapper[4885]: I0308 20:24:51.425126 4885 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/certified-operators-5v28s" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="registry-server" containerID="cri-o://c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0" gracePeriod=2 Mar 08 20:24:51 crc kubenswrapper[4885]: I0308 20:24:51.919833 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.026094 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-catalog-content\") pod \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.026165 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b2r4\" (UniqueName: \"kubernetes.io/projected/0f86ef9b-5ca5-425b-aec8-17efc661afb8-kube-api-access-8b2r4\") pod \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.026260 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-utilities\") pod \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\" (UID: \"0f86ef9b-5ca5-425b-aec8-17efc661afb8\") " Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.027945 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-utilities" (OuterVolumeSpecName: "utilities") pod "0f86ef9b-5ca5-425b-aec8-17efc661afb8" (UID: "0f86ef9b-5ca5-425b-aec8-17efc661afb8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.035977 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f86ef9b-5ca5-425b-aec8-17efc661afb8-kube-api-access-8b2r4" (OuterVolumeSpecName: "kube-api-access-8b2r4") pod "0f86ef9b-5ca5-425b-aec8-17efc661afb8" (UID: "0f86ef9b-5ca5-425b-aec8-17efc661afb8"). InnerVolumeSpecName "kube-api-access-8b2r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.108590 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0f86ef9b-5ca5-425b-aec8-17efc661afb8" (UID: "0f86ef9b-5ca5-425b-aec8-17efc661afb8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.128272 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.128321 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8b2r4\" (UniqueName: \"kubernetes.io/projected/0f86ef9b-5ca5-425b-aec8-17efc661afb8-kube-api-access-8b2r4\") on node \"crc\" DevicePath \"\"" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.128343 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0f86ef9b-5ca5-425b-aec8-17efc661afb8-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.438600 4885 generic.go:334] "Generic (PLEG): container finished" podID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerID="c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0" exitCode=0 Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.438665 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5v28s" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.438675 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v28s" event={"ID":"0f86ef9b-5ca5-425b-aec8-17efc661afb8","Type":"ContainerDied","Data":"c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0"} Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.438740 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v28s" event={"ID":"0f86ef9b-5ca5-425b-aec8-17efc661afb8","Type":"ContainerDied","Data":"c6c42b763278541056b77e1140376fc7d055c7c270f1b8a2f7313a7c8600e3fe"} Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.438782 4885 scope.go:117] "RemoveContainer" containerID="c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.468238 4885 scope.go:117] "RemoveContainer" containerID="a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.498098 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5v28s"] Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.507378 4885 scope.go:117] "RemoveContainer" containerID="8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.508468 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5v28s"] Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.542381 4885 scope.go:117] "RemoveContainer" containerID="c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0" Mar 08 20:24:52 crc kubenswrapper[4885]: E0308 20:24:52.543042 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0\": container with ID starting with c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0 not found: ID does not exist" containerID="c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.543099 
4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0"} err="failed to get container status \"c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0\": rpc error: code = NotFound desc = could not find container \"c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0\": container with ID starting with c7076888dff2a50bc66b83706c741e2000850fbb031a0f4ba78843e2257b89b0 not found: ID does not exist" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.543141 4885 scope.go:117] "RemoveContainer" containerID="a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714" Mar 08 20:24:52 crc kubenswrapper[4885]: E0308 20:24:52.543668 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714\": container with ID starting with a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714 not found: ID does not exist" containerID="a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.543718 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714"} err="failed to get container status \"a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714\": rpc error: code = NotFound desc = could not find container \"a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714\": container with ID starting with a388c221cd53318d4cbe32de505ce7e7c823b159d79a8720604f5797bfd77714 not found: ID does not exist" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.543757 4885 scope.go:117] "RemoveContainer" containerID="8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b" Mar 08 20:24:52 crc kubenswrapper[4885]: E0308 20:24:52.544445 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b\": container with ID starting with 8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b not found: ID does not exist" containerID="8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b" Mar 08 20:24:52 crc kubenswrapper[4885]: I0308 20:24:52.544538 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b"} err="failed to get container status \"8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b\": rpc error: code = NotFound desc = could not find container \"8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b\": container with ID starting with 8e3d72e584d45fcfd94bc5bc9102ccd556e9b99a570b4c088e62faa68cbd681b not found: ID does not exist" Mar 08 20:24:53 crc kubenswrapper[4885]: I0308 20:24:53.381350 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" path="/var/lib/kubelet/pods/0f86ef9b-5ca5-425b-aec8-17efc661afb8/volumes" Mar 08 20:24:57 crc kubenswrapper[4885]: I0308 20:24:57.627562 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:57 crc kubenswrapper[4885]: I0308 20:24:57.692701 4885 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:24:57 crc kubenswrapper[4885]: I0308 20:24:57.912610 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5wgq8"] Mar 08 20:24:59 crc kubenswrapper[4885]: I0308 20:24:59.502500 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5wgq8" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="registry-server" containerID="cri-o://e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8" gracePeriod=2 Mar 08 20:24:59 crc kubenswrapper[4885]: I0308 20:24:59.972263 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.152625 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-catalog-content\") pod \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.152692 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-695js\" (UniqueName: \"kubernetes.io/projected/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-kube-api-access-695js\") pod \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.152759 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-utilities\") pod \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\" (UID: \"37ef9b83-5e44-4dd4-917b-0cc2b22994a7\") " Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.153973 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-utilities" (OuterVolumeSpecName: "utilities") pod "37ef9b83-5e44-4dd4-917b-0cc2b22994a7" (UID: "37ef9b83-5e44-4dd4-917b-0cc2b22994a7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.162798 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-kube-api-access-695js" (OuterVolumeSpecName: "kube-api-access-695js") pod "37ef9b83-5e44-4dd4-917b-0cc2b22994a7" (UID: "37ef9b83-5e44-4dd4-917b-0cc2b22994a7"). InnerVolumeSpecName "kube-api-access-695js". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.267493 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-695js\" (UniqueName: \"kubernetes.io/projected/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-kube-api-access-695js\") on node \"crc\" DevicePath \"\"" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.267557 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.328386 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "37ef9b83-5e44-4dd4-917b-0cc2b22994a7" (UID: "37ef9b83-5e44-4dd4-917b-0cc2b22994a7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.368689 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/37ef9b83-5e44-4dd4-917b-0cc2b22994a7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.515718 4885 generic.go:334] "Generic (PLEG): container finished" podID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerID="e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8" exitCode=0 Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.515810 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5wgq8" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.515816 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wgq8" event={"ID":"37ef9b83-5e44-4dd4-917b-0cc2b22994a7","Type":"ContainerDied","Data":"e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8"} Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.517821 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5wgq8" event={"ID":"37ef9b83-5e44-4dd4-917b-0cc2b22994a7","Type":"ContainerDied","Data":"a1d5a5740ffa83d735afd40a7eaa37bf1f2399163d49ebd9aa6df35eb7248970"} Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.517865 4885 scope.go:117] "RemoveContainer" containerID="e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.549302 4885 scope.go:117] "RemoveContainer" containerID="cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.571468 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5wgq8"] Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.578648 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5wgq8"] Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.596117 4885 scope.go:117] "RemoveContainer" containerID="76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.625050 4885 scope.go:117] "RemoveContainer" containerID="e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8" Mar 08 20:25:00 crc kubenswrapper[4885]: E0308 20:25:00.625697 4885 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8\": container with ID starting with e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8 not found: ID does not exist" containerID="e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.625766 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8"} err="failed to get container status \"e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8\": rpc error: code = NotFound desc = could not find container \"e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8\": container with ID starting with e59dbb69ec78b251806e042e871209fcb5a54815bd0a492eff51764a9d5e4ba8 not found: ID does not exist" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.625809 4885 scope.go:117] "RemoveContainer" containerID="cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d" Mar 08 20:25:00 crc kubenswrapper[4885]: E0308 20:25:00.626457 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d\": container with ID starting with cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d not found: ID does not exist" containerID="cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.626512 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d"} err="failed to get container status \"cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d\": rpc error: code = NotFound desc = could not find container \"cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d\": container with ID starting with cd964652cd3beea4f4dd59994643f9427a43a99411e205aaa6252912fcd8d86d not found: ID does not exist" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.626550 4885 scope.go:117] "RemoveContainer" containerID="76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755" Mar 08 20:25:00 crc kubenswrapper[4885]: E0308 20:25:00.626981 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755\": container with ID starting with 76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755 not found: ID does not exist" containerID="76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755" Mar 08 20:25:00 crc kubenswrapper[4885]: I0308 20:25:00.627022 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755"} err="failed to get container status \"76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755\": rpc error: code = NotFound desc = could not find container \"76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755\": container with ID starting with 76010d64dcda012b0a566b3fe83308998edb3a9382a2741a707d07cd09b49755 not found: ID does not exist" Mar 08 20:25:01 crc kubenswrapper[4885]: I0308 20:25:01.328914 4885 scope.go:117] "RemoveContainer" 
containerID="bf1fcbdcbc45401a37fee6b7776fac0a669dfcd972b2838d8063a648b4ecf370" Mar 08 20:25:01 crc kubenswrapper[4885]: I0308 20:25:01.386150 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" path="/var/lib/kubelet/pods/37ef9b83-5e44-4dd4-917b-0cc2b22994a7/volumes" Mar 08 20:25:02 crc kubenswrapper[4885]: I0308 20:25:02.818596 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:25:02 crc kubenswrapper[4885]: I0308 20:25:02.818997 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:25:02 crc kubenswrapper[4885]: I0308 20:25:02.819068 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:25:02 crc kubenswrapper[4885]: I0308 20:25:02.820008 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:25:02 crc kubenswrapper[4885]: I0308 20:25:02.820106 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" gracePeriod=600 Mar 08 20:25:02 crc kubenswrapper[4885]: E0308 20:25:02.957006 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:25:03 crc kubenswrapper[4885]: I0308 20:25:03.547125 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" exitCode=0 Mar 08 20:25:03 crc kubenswrapper[4885]: I0308 20:25:03.547186 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36"} Mar 08 20:25:03 crc kubenswrapper[4885]: I0308 20:25:03.547230 4885 scope.go:117] "RemoveContainer" containerID="91908c0f002af6ada70bcd1ada5cc456d680033640f7f942b2ba88a86e668929" Mar 08 20:25:03 crc kubenswrapper[4885]: I0308 20:25:03.548095 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" 
Mar 08 20:25:03 crc kubenswrapper[4885]: E0308 20:25:03.548576 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:25:18 crc kubenswrapper[4885]: I0308 20:25:18.369934 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:25:18 crc kubenswrapper[4885]: E0308 20:25:18.371148 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:25:29 crc kubenswrapper[4885]: I0308 20:25:29.376397 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:25:29 crc kubenswrapper[4885]: E0308 20:25:29.377097 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:25:41 crc kubenswrapper[4885]: I0308 20:25:41.369005 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:25:41 crc kubenswrapper[4885]: E0308 20:25:41.370190 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:25:56 crc kubenswrapper[4885]: I0308 20:25:56.368915 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:25:56 crc kubenswrapper[4885]: E0308 20:25:56.369951 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.165778 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550026-vw7xs"] Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.166842 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" 
containerName="extract-content" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.166873 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="extract-content" Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.166905 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="extract-content" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.166961 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="extract-content" Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.166990 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="extract-utilities" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.167006 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="extract-utilities" Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.167031 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="extract-utilities" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.167047 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="extract-utilities" Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.167086 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="extract-utilities" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.167103 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="extract-utilities" Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.167120 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="registry-server" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.167135 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="registry-server" Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.167155 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="extract-content" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.167171 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="extract-content" Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.167204 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="registry-server" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.167219 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="registry-server" Mar 08 20:26:00 crc kubenswrapper[4885]: E0308 20:26:00.168596 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="registry-server" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.168682 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="registry-server" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.169085 4885 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="37ef9b83-5e44-4dd4-917b-0cc2b22994a7" containerName="registry-server" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.169134 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9dc24a73-d641-47de-9542-5898804547cf" containerName="registry-server" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.169219 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f86ef9b-5ca5-425b-aec8-17efc661afb8" containerName="registry-server" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.170197 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550026-vw7xs" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.176640 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.176884 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.177287 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.181183 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550026-vw7xs"] Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.195830 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2jnx\" (UniqueName: \"kubernetes.io/projected/99bd2190-0abe-4434-b2f6-3707852e2d43-kube-api-access-t2jnx\") pod \"auto-csr-approver-29550026-vw7xs\" (UID: \"99bd2190-0abe-4434-b2f6-3707852e2d43\") " pod="openshift-infra/auto-csr-approver-29550026-vw7xs" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.296871 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2jnx\" (UniqueName: \"kubernetes.io/projected/99bd2190-0abe-4434-b2f6-3707852e2d43-kube-api-access-t2jnx\") pod \"auto-csr-approver-29550026-vw7xs\" (UID: \"99bd2190-0abe-4434-b2f6-3707852e2d43\") " pod="openshift-infra/auto-csr-approver-29550026-vw7xs" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.326741 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2jnx\" (UniqueName: \"kubernetes.io/projected/99bd2190-0abe-4434-b2f6-3707852e2d43-kube-api-access-t2jnx\") pod \"auto-csr-approver-29550026-vw7xs\" (UID: \"99bd2190-0abe-4434-b2f6-3707852e2d43\") " pod="openshift-infra/auto-csr-approver-29550026-vw7xs" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.503256 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550026-vw7xs" Mar 08 20:26:00 crc kubenswrapper[4885]: I0308 20:26:00.803638 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550026-vw7xs"] Mar 08 20:26:01 crc kubenswrapper[4885]: I0308 20:26:01.069716 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550026-vw7xs" event={"ID":"99bd2190-0abe-4434-b2f6-3707852e2d43","Type":"ContainerStarted","Data":"d3e33d2362c9e189e4fc6a457183a0cf126fa32fd3f165d2c89b58532cc1383a"} Mar 08 20:26:02 crc kubenswrapper[4885]: I0308 20:26:02.081087 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550026-vw7xs" event={"ID":"99bd2190-0abe-4434-b2f6-3707852e2d43","Type":"ContainerStarted","Data":"d37e225980dc2c4d236b82dd40a6a7c22fa7155e659179661d10d8de1d9aabfd"} Mar 08 20:26:02 crc kubenswrapper[4885]: I0308 20:26:02.106595 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550026-vw7xs" podStartSLOduration=1.294364064 podStartE2EDuration="2.106565446s" podCreationTimestamp="2026-03-08 20:26:00 +0000 UTC" firstStartedPulling="2026-03-08 20:26:00.810878497 +0000 UTC m=+3262.206932560" lastFinishedPulling="2026-03-08 20:26:01.623079879 +0000 UTC m=+3263.019133942" observedRunningTime="2026-03-08 20:26:02.099320303 +0000 UTC m=+3263.495374326" watchObservedRunningTime="2026-03-08 20:26:02.106565446 +0000 UTC m=+3263.502619519" Mar 08 20:26:03 crc kubenswrapper[4885]: I0308 20:26:03.094609 4885 generic.go:334] "Generic (PLEG): container finished" podID="99bd2190-0abe-4434-b2f6-3707852e2d43" containerID="d37e225980dc2c4d236b82dd40a6a7c22fa7155e659179661d10d8de1d9aabfd" exitCode=0 Mar 08 20:26:03 crc kubenswrapper[4885]: I0308 20:26:03.094739 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550026-vw7xs" event={"ID":"99bd2190-0abe-4434-b2f6-3707852e2d43","Type":"ContainerDied","Data":"d37e225980dc2c4d236b82dd40a6a7c22fa7155e659179661d10d8de1d9aabfd"} Mar 08 20:26:04 crc kubenswrapper[4885]: I0308 20:26:04.449354 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550026-vw7xs" Mar 08 20:26:04 crc kubenswrapper[4885]: I0308 20:26:04.471355 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2jnx\" (UniqueName: \"kubernetes.io/projected/99bd2190-0abe-4434-b2f6-3707852e2d43-kube-api-access-t2jnx\") pod \"99bd2190-0abe-4434-b2f6-3707852e2d43\" (UID: \"99bd2190-0abe-4434-b2f6-3707852e2d43\") " Mar 08 20:26:04 crc kubenswrapper[4885]: I0308 20:26:04.478282 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99bd2190-0abe-4434-b2f6-3707852e2d43-kube-api-access-t2jnx" (OuterVolumeSpecName: "kube-api-access-t2jnx") pod "99bd2190-0abe-4434-b2f6-3707852e2d43" (UID: "99bd2190-0abe-4434-b2f6-3707852e2d43"). InnerVolumeSpecName "kube-api-access-t2jnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:26:04 crc kubenswrapper[4885]: I0308 20:26:04.573249 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2jnx\" (UniqueName: \"kubernetes.io/projected/99bd2190-0abe-4434-b2f6-3707852e2d43-kube-api-access-t2jnx\") on node \"crc\" DevicePath \"\"" Mar 08 20:26:05 crc kubenswrapper[4885]: I0308 20:26:05.121725 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550026-vw7xs" event={"ID":"99bd2190-0abe-4434-b2f6-3707852e2d43","Type":"ContainerDied","Data":"d3e33d2362c9e189e4fc6a457183a0cf126fa32fd3f165d2c89b58532cc1383a"} Mar 08 20:26:05 crc kubenswrapper[4885]: I0308 20:26:05.121767 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550026-vw7xs" Mar 08 20:26:05 crc kubenswrapper[4885]: I0308 20:26:05.121797 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3e33d2362c9e189e4fc6a457183a0cf126fa32fd3f165d2c89b58532cc1383a" Mar 08 20:26:05 crc kubenswrapper[4885]: I0308 20:26:05.257901 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550020-vm6nl"] Mar 08 20:26:05 crc kubenswrapper[4885]: I0308 20:26:05.267277 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550020-vm6nl"] Mar 08 20:26:05 crc kubenswrapper[4885]: I0308 20:26:05.384290 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c98e098b-03c5-49cd-8009-4d1dde33cd6d" path="/var/lib/kubelet/pods/c98e098b-03c5-49cd-8009-4d1dde33cd6d/volumes" Mar 08 20:26:11 crc kubenswrapper[4885]: I0308 20:26:11.368987 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:26:11 crc kubenswrapper[4885]: E0308 20:26:11.370243 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:26:26 crc kubenswrapper[4885]: I0308 20:26:26.368594 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:26:26 crc kubenswrapper[4885]: E0308 20:26:26.370552 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:26:39 crc kubenswrapper[4885]: I0308 20:26:39.375167 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:26:39 crc kubenswrapper[4885]: E0308 20:26:39.377545 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:26:52 crc kubenswrapper[4885]: I0308 20:26:52.369189 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:26:52 crc kubenswrapper[4885]: E0308 20:26:52.370268 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:27:01 crc kubenswrapper[4885]: I0308 20:27:01.482801 4885 scope.go:117] "RemoveContainer" containerID="c67ebacbdb0212a697eaa79c423debddfd753cb26e765c85de268bbfc9ff9476" Mar 08 20:27:06 crc kubenswrapper[4885]: I0308 20:27:06.369247 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:27:06 crc kubenswrapper[4885]: E0308 20:27:06.370319 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:27:19 crc kubenswrapper[4885]: I0308 20:27:19.377123 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:27:19 crc kubenswrapper[4885]: E0308 20:27:19.378066 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:27:31 crc kubenswrapper[4885]: I0308 20:27:31.369028 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:27:31 crc kubenswrapper[4885]: E0308 20:27:31.370034 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:27:45 crc kubenswrapper[4885]: I0308 20:27:45.368563 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:27:45 crc kubenswrapper[4885]: E0308 20:27:45.371795 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:27:57 crc kubenswrapper[4885]: I0308 20:27:57.369068 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:27:57 crc kubenswrapper[4885]: E0308 20:27:57.370289 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.160607 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550028-gfnbr"] Mar 08 20:28:00 crc kubenswrapper[4885]: E0308 20:28:00.161125 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99bd2190-0abe-4434-b2f6-3707852e2d43" containerName="oc" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.161154 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="99bd2190-0abe-4434-b2f6-3707852e2d43" containerName="oc" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.161484 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="99bd2190-0abe-4434-b2f6-3707852e2d43" containerName="oc" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.162422 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550028-gfnbr" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.165297 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.165356 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.165445 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.180036 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550028-gfnbr"] Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.293366 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldn5c\" (UniqueName: \"kubernetes.io/projected/67320c5c-bbf3-4828-8a46-effb28e4d9a1-kube-api-access-ldn5c\") pod \"auto-csr-approver-29550028-gfnbr\" (UID: \"67320c5c-bbf3-4828-8a46-effb28e4d9a1\") " pod="openshift-infra/auto-csr-approver-29550028-gfnbr" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.395622 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldn5c\" (UniqueName: \"kubernetes.io/projected/67320c5c-bbf3-4828-8a46-effb28e4d9a1-kube-api-access-ldn5c\") pod \"auto-csr-approver-29550028-gfnbr\" (UID: \"67320c5c-bbf3-4828-8a46-effb28e4d9a1\") " pod="openshift-infra/auto-csr-approver-29550028-gfnbr" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.429962 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldn5c\" (UniqueName: 
\"kubernetes.io/projected/67320c5c-bbf3-4828-8a46-effb28e4d9a1-kube-api-access-ldn5c\") pod \"auto-csr-approver-29550028-gfnbr\" (UID: \"67320c5c-bbf3-4828-8a46-effb28e4d9a1\") " pod="openshift-infra/auto-csr-approver-29550028-gfnbr" Mar 08 20:28:00 crc kubenswrapper[4885]: I0308 20:28:00.498553 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550028-gfnbr" Mar 08 20:28:01 crc kubenswrapper[4885]: I0308 20:28:01.016265 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550028-gfnbr"] Mar 08 20:28:01 crc kubenswrapper[4885]: I0308 20:28:01.028321 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:28:01 crc kubenswrapper[4885]: I0308 20:28:01.162236 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550028-gfnbr" event={"ID":"67320c5c-bbf3-4828-8a46-effb28e4d9a1","Type":"ContainerStarted","Data":"6fe9f4df87611064b1b67d0129cce1c0d7902b26837437b2c3d2c39f543ffce7"} Mar 08 20:28:03 crc kubenswrapper[4885]: I0308 20:28:03.180903 4885 generic.go:334] "Generic (PLEG): container finished" podID="67320c5c-bbf3-4828-8a46-effb28e4d9a1" containerID="ef07ccde27b0d39793669ff65783adc2f054456be78c379a19e61fa8a99b06b3" exitCode=0 Mar 08 20:28:03 crc kubenswrapper[4885]: I0308 20:28:03.181008 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550028-gfnbr" event={"ID":"67320c5c-bbf3-4828-8a46-effb28e4d9a1","Type":"ContainerDied","Data":"ef07ccde27b0d39793669ff65783adc2f054456be78c379a19e61fa8a99b06b3"} Mar 08 20:28:04 crc kubenswrapper[4885]: I0308 20:28:04.555460 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550028-gfnbr" Mar 08 20:28:04 crc kubenswrapper[4885]: I0308 20:28:04.662459 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldn5c\" (UniqueName: \"kubernetes.io/projected/67320c5c-bbf3-4828-8a46-effb28e4d9a1-kube-api-access-ldn5c\") pod \"67320c5c-bbf3-4828-8a46-effb28e4d9a1\" (UID: \"67320c5c-bbf3-4828-8a46-effb28e4d9a1\") " Mar 08 20:28:04 crc kubenswrapper[4885]: I0308 20:28:04.680256 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67320c5c-bbf3-4828-8a46-effb28e4d9a1-kube-api-access-ldn5c" (OuterVolumeSpecName: "kube-api-access-ldn5c") pod "67320c5c-bbf3-4828-8a46-effb28e4d9a1" (UID: "67320c5c-bbf3-4828-8a46-effb28e4d9a1"). InnerVolumeSpecName "kube-api-access-ldn5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:28:04 crc kubenswrapper[4885]: I0308 20:28:04.764414 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldn5c\" (UniqueName: \"kubernetes.io/projected/67320c5c-bbf3-4828-8a46-effb28e4d9a1-kube-api-access-ldn5c\") on node \"crc\" DevicePath \"\"" Mar 08 20:28:05 crc kubenswrapper[4885]: I0308 20:28:05.201777 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550028-gfnbr" event={"ID":"67320c5c-bbf3-4828-8a46-effb28e4d9a1","Type":"ContainerDied","Data":"6fe9f4df87611064b1b67d0129cce1c0d7902b26837437b2c3d2c39f543ffce7"} Mar 08 20:28:05 crc kubenswrapper[4885]: I0308 20:28:05.202170 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fe9f4df87611064b1b67d0129cce1c0d7902b26837437b2c3d2c39f543ffce7" Mar 08 20:28:05 crc kubenswrapper[4885]: I0308 20:28:05.201864 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550028-gfnbr" Mar 08 20:28:05 crc kubenswrapper[4885]: I0308 20:28:05.641794 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550022-cwspw"] Mar 08 20:28:05 crc kubenswrapper[4885]: I0308 20:28:05.651811 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550022-cwspw"] Mar 08 20:28:07 crc kubenswrapper[4885]: I0308 20:28:07.380857 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c1236a4-ac5f-4b66-8064-a0877ea3eb13" path="/var/lib/kubelet/pods/9c1236a4-ac5f-4b66-8064-a0877ea3eb13/volumes" Mar 08 20:28:12 crc kubenswrapper[4885]: I0308 20:28:12.368094 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:28:12 crc kubenswrapper[4885]: E0308 20:28:12.368871 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:28:24 crc kubenswrapper[4885]: I0308 20:28:24.367891 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:28:24 crc kubenswrapper[4885]: E0308 20:28:24.368611 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:28:35 crc kubenswrapper[4885]: I0308 20:28:35.368669 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:28:35 crc kubenswrapper[4885]: E0308 20:28:35.369836 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.869510 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-82wff"] Mar 08 20:28:37 crc kubenswrapper[4885]: E0308 20:28:37.870608 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67320c5c-bbf3-4828-8a46-effb28e4d9a1" containerName="oc" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.870638 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="67320c5c-bbf3-4828-8a46-effb28e4d9a1" containerName="oc" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.871026 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="67320c5c-bbf3-4828-8a46-effb28e4d9a1" containerName="oc" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.880606 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.887475 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s44hc\" (UniqueName: \"kubernetes.io/projected/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-kube-api-access-s44hc\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.887562 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-catalog-content\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.887619 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-utilities\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.913343 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-82wff"] Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.989242 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s44hc\" (UniqueName: \"kubernetes.io/projected/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-kube-api-access-s44hc\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.989331 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-catalog-content\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.989390 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-utilities\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.990069 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-utilities\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:37 crc kubenswrapper[4885]: I0308 20:28:37.990683 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-catalog-content\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:38 crc kubenswrapper[4885]: I0308 20:28:38.014407 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s44hc\" (UniqueName: \"kubernetes.io/projected/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-kube-api-access-s44hc\") pod \"community-operators-82wff\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:38 crc kubenswrapper[4885]: I0308 20:28:38.203094 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:38 crc kubenswrapper[4885]: I0308 20:28:38.709353 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-82wff"] Mar 08 20:28:39 crc kubenswrapper[4885]: I0308 20:28:39.506011 4885 generic.go:334] "Generic (PLEG): container finished" podID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerID="faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba" exitCode=0 Mar 08 20:28:39 crc kubenswrapper[4885]: I0308 20:28:39.506170 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82wff" event={"ID":"21cc25eb-5e12-4a42-ae79-a8ed17b3a437","Type":"ContainerDied","Data":"faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba"} Mar 08 20:28:39 crc kubenswrapper[4885]: I0308 20:28:39.506465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82wff" event={"ID":"21cc25eb-5e12-4a42-ae79-a8ed17b3a437","Type":"ContainerStarted","Data":"1866ccbe71f36db1e42b7cbb8c3b67be8a85a9ae35ab075668dd6b8f7edcac68"} Mar 08 20:28:40 crc kubenswrapper[4885]: I0308 20:28:40.518251 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82wff" event={"ID":"21cc25eb-5e12-4a42-ae79-a8ed17b3a437","Type":"ContainerStarted","Data":"1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132"} Mar 08 20:28:41 crc kubenswrapper[4885]: I0308 20:28:41.531874 4885 generic.go:334] "Generic (PLEG): container finished" podID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerID="1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132" exitCode=0 Mar 08 20:28:41 crc kubenswrapper[4885]: I0308 20:28:41.532131 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82wff" event={"ID":"21cc25eb-5e12-4a42-ae79-a8ed17b3a437","Type":"ContainerDied","Data":"1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132"} Mar 
08 20:28:42 crc kubenswrapper[4885]: I0308 20:28:42.542413 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82wff" event={"ID":"21cc25eb-5e12-4a42-ae79-a8ed17b3a437","Type":"ContainerStarted","Data":"3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c"} Mar 08 20:28:42 crc kubenswrapper[4885]: I0308 20:28:42.577209 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-82wff" podStartSLOduration=3.139257967 podStartE2EDuration="5.577192189s" podCreationTimestamp="2026-03-08 20:28:37 +0000 UTC" firstStartedPulling="2026-03-08 20:28:39.508749894 +0000 UTC m=+3420.904803947" lastFinishedPulling="2026-03-08 20:28:41.946684106 +0000 UTC m=+3423.342738169" observedRunningTime="2026-03-08 20:28:42.562821897 +0000 UTC m=+3423.958875930" watchObservedRunningTime="2026-03-08 20:28:42.577192189 +0000 UTC m=+3423.973246222" Mar 08 20:28:48 crc kubenswrapper[4885]: I0308 20:28:48.204334 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:48 crc kubenswrapper[4885]: I0308 20:28:48.204721 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:48 crc kubenswrapper[4885]: I0308 20:28:48.261307 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:48 crc kubenswrapper[4885]: I0308 20:28:48.671205 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:49 crc kubenswrapper[4885]: I0308 20:28:49.377810 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:28:49 crc kubenswrapper[4885]: E0308 20:28:49.378322 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:28:49 crc kubenswrapper[4885]: I0308 20:28:49.712074 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-82wff"] Mar 08 20:28:50 crc kubenswrapper[4885]: I0308 20:28:50.619424 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-82wff" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="registry-server" containerID="cri-o://3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c" gracePeriod=2 Mar 08 20:28:50 crc kubenswrapper[4885]: E0308 20:28:50.836215 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21cc25eb_5e12_4a42_ae79_a8ed17b3a437.slice/crio-conmon-3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c.scope\": RecentStats: unable to find data in memory cache]" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.172000 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.221072 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-catalog-content\") pod \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.221381 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-utilities\") pod \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.221508 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s44hc\" (UniqueName: \"kubernetes.io/projected/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-kube-api-access-s44hc\") pod \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\" (UID: \"21cc25eb-5e12-4a42-ae79-a8ed17b3a437\") " Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.223367 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-utilities" (OuterVolumeSpecName: "utilities") pod "21cc25eb-5e12-4a42-ae79-a8ed17b3a437" (UID: "21cc25eb-5e12-4a42-ae79-a8ed17b3a437"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.229835 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-kube-api-access-s44hc" (OuterVolumeSpecName: "kube-api-access-s44hc") pod "21cc25eb-5e12-4a42-ae79-a8ed17b3a437" (UID: "21cc25eb-5e12-4a42-ae79-a8ed17b3a437"). InnerVolumeSpecName "kube-api-access-s44hc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.322527 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "21cc25eb-5e12-4a42-ae79-a8ed17b3a437" (UID: "21cc25eb-5e12-4a42-ae79-a8ed17b3a437"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.323015 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s44hc\" (UniqueName: \"kubernetes.io/projected/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-kube-api-access-s44hc\") on node \"crc\" DevicePath \"\"" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.323067 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.323087 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/21cc25eb-5e12-4a42-ae79-a8ed17b3a437-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.631701 4885 generic.go:334] "Generic (PLEG): container finished" podID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerID="3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c" exitCode=0 Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.631765 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82wff" event={"ID":"21cc25eb-5e12-4a42-ae79-a8ed17b3a437","Type":"ContainerDied","Data":"3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c"} Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.631814 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-82wff" event={"ID":"21cc25eb-5e12-4a42-ae79-a8ed17b3a437","Type":"ContainerDied","Data":"1866ccbe71f36db1e42b7cbb8c3b67be8a85a9ae35ab075668dd6b8f7edcac68"} Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.631825 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-82wff" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.631843 4885 scope.go:117] "RemoveContainer" containerID="3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.668634 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-82wff"] Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.675593 4885 scope.go:117] "RemoveContainer" containerID="1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.681776 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-82wff"] Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.707356 4885 scope.go:117] "RemoveContainer" containerID="faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.758429 4885 scope.go:117] "RemoveContainer" containerID="3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c" Mar 08 20:28:51 crc kubenswrapper[4885]: E0308 20:28:51.759994 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c\": container with ID starting with 3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c not found: ID does not exist" containerID="3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.760127 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c"} err="failed to get container status \"3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c\": rpc error: code = NotFound desc = could not find container \"3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c\": container with ID starting with 3df8f428ea8031848d0bcde1e3655c8d20ce36c19a44628ba03bf7ec2fe23c9c not found: ID does not exist" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.760339 4885 scope.go:117] "RemoveContainer" containerID="1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132" Mar 08 20:28:51 crc kubenswrapper[4885]: E0308 20:28:51.760943 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132\": container with ID starting with 1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132 not found: ID does not exist" containerID="1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.760975 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132"} err="failed to get container status \"1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132\": rpc error: code = NotFound desc = could not find container \"1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132\": container with ID starting with 1521ccb1eae125b0a9ed97a7975aa304ca5b953ab3f6f2f95dd860fe19e2d132 not found: ID does not exist" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.760994 4885 scope.go:117] "RemoveContainer" 
containerID="faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba" Mar 08 20:28:51 crc kubenswrapper[4885]: E0308 20:28:51.761460 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba\": container with ID starting with faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba not found: ID does not exist" containerID="faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba" Mar 08 20:28:51 crc kubenswrapper[4885]: I0308 20:28:51.761512 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba"} err="failed to get container status \"faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba\": rpc error: code = NotFound desc = could not find container \"faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba\": container with ID starting with faeec83ad71248b64c7473307c75e8f4408d036a335c5891612ee809cefa77ba not found: ID does not exist" Mar 08 20:28:53 crc kubenswrapper[4885]: I0308 20:28:53.386701 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" path="/var/lib/kubelet/pods/21cc25eb-5e12-4a42-ae79-a8ed17b3a437/volumes" Mar 08 20:29:01 crc kubenswrapper[4885]: I0308 20:29:01.368897 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:29:01 crc kubenswrapper[4885]: E0308 20:29:01.370081 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:29:01 crc kubenswrapper[4885]: I0308 20:29:01.580924 4885 scope.go:117] "RemoveContainer" containerID="02239c6874bdb24f3dcd1bad4a5e3559f6f779758a7821e5dae46f6c1d9294ea" Mar 08 20:29:15 crc kubenswrapper[4885]: I0308 20:29:15.368065 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:29:15 crc kubenswrapper[4885]: E0308 20:29:15.369013 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:29:27 crc kubenswrapper[4885]: I0308 20:29:27.369239 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:29:27 crc kubenswrapper[4885]: E0308 20:29:27.369959 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:29:39 crc kubenswrapper[4885]: I0308 20:29:39.376304 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:29:39 crc kubenswrapper[4885]: E0308 20:29:39.376980 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:29:51 crc kubenswrapper[4885]: I0308 20:29:51.368451 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:29:51 crc kubenswrapper[4885]: E0308 20:29:51.369411 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.142677 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550030-trf4w"] Mar 08 20:30:00 crc kubenswrapper[4885]: E0308 20:30:00.143590 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="extract-content" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.143607 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="extract-content" Mar 08 20:30:00 crc kubenswrapper[4885]: E0308 20:30:00.143637 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="registry-server" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.143644 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="registry-server" Mar 08 20:30:00 crc kubenswrapper[4885]: E0308 20:30:00.143658 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="extract-utilities" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.143666 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="extract-utilities" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.143830 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="21cc25eb-5e12-4a42-ae79-a8ed17b3a437" containerName="registry-server" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.144412 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550030-trf4w" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.147855 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.148231 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.150532 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.158936 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b"] Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.160257 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.163248 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.167914 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.171306 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550030-trf4w"] Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.180417 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b"] Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.326486 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c873212d-4c8c-4d2c-ad89-be5ff96db764-config-volume\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.326659 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c873212d-4c8c-4d2c-ad89-be5ff96db764-secret-volume\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.326735 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fttm9\" (UniqueName: \"kubernetes.io/projected/c873212d-4c8c-4d2c-ad89-be5ff96db764-kube-api-access-fttm9\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.327056 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q96rd\" (UniqueName: \"kubernetes.io/projected/b137f1c0-32d5-44b8-b0e3-a7ae07052e53-kube-api-access-q96rd\") pod \"auto-csr-approver-29550030-trf4w\" (UID: \"b137f1c0-32d5-44b8-b0e3-a7ae07052e53\") " pod="openshift-infra/auto-csr-approver-29550030-trf4w" Mar 
08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.429282 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q96rd\" (UniqueName: \"kubernetes.io/projected/b137f1c0-32d5-44b8-b0e3-a7ae07052e53-kube-api-access-q96rd\") pod \"auto-csr-approver-29550030-trf4w\" (UID: \"b137f1c0-32d5-44b8-b0e3-a7ae07052e53\") " pod="openshift-infra/auto-csr-approver-29550030-trf4w" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.429401 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c873212d-4c8c-4d2c-ad89-be5ff96db764-config-volume\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.429488 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c873212d-4c8c-4d2c-ad89-be5ff96db764-secret-volume\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.429543 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fttm9\" (UniqueName: \"kubernetes.io/projected/c873212d-4c8c-4d2c-ad89-be5ff96db764-kube-api-access-fttm9\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.431916 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c873212d-4c8c-4d2c-ad89-be5ff96db764-config-volume\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.445598 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c873212d-4c8c-4d2c-ad89-be5ff96db764-secret-volume\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.451613 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fttm9\" (UniqueName: \"kubernetes.io/projected/c873212d-4c8c-4d2c-ad89-be5ff96db764-kube-api-access-fttm9\") pod \"collect-profiles-29550030-n2t6b\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.467149 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q96rd\" (UniqueName: \"kubernetes.io/projected/b137f1c0-32d5-44b8-b0e3-a7ae07052e53-kube-api-access-q96rd\") pod \"auto-csr-approver-29550030-trf4w\" (UID: \"b137f1c0-32d5-44b8-b0e3-a7ae07052e53\") " pod="openshift-infra/auto-csr-approver-29550030-trf4w" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.474325 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550030-trf4w" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.486812 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:00 crc kubenswrapper[4885]: I0308 20:30:00.789743 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b"] Mar 08 20:30:00 crc kubenswrapper[4885]: W0308 20:30:00.999559 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb137f1c0_32d5_44b8_b0e3_a7ae07052e53.slice/crio-d6e17f7ef1179062915aa5879859b30f9c0d9c45962beb77f4bbb7539e1beaa0 WatchSource:0}: Error finding container d6e17f7ef1179062915aa5879859b30f9c0d9c45962beb77f4bbb7539e1beaa0: Status 404 returned error can't find the container with id d6e17f7ef1179062915aa5879859b30f9c0d9c45962beb77f4bbb7539e1beaa0 Mar 08 20:30:01 crc kubenswrapper[4885]: I0308 20:30:01.000679 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550030-trf4w"] Mar 08 20:30:01 crc kubenswrapper[4885]: I0308 20:30:01.260114 4885 generic.go:334] "Generic (PLEG): container finished" podID="c873212d-4c8c-4d2c-ad89-be5ff96db764" containerID="ee3f4f74f30598ca5d1ebc5c4a12e553e2064229a545cf14384e548c26e071ad" exitCode=0 Mar 08 20:30:01 crc kubenswrapper[4885]: I0308 20:30:01.260754 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" event={"ID":"c873212d-4c8c-4d2c-ad89-be5ff96db764","Type":"ContainerDied","Data":"ee3f4f74f30598ca5d1ebc5c4a12e553e2064229a545cf14384e548c26e071ad"} Mar 08 20:30:01 crc kubenswrapper[4885]: I0308 20:30:01.260796 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" event={"ID":"c873212d-4c8c-4d2c-ad89-be5ff96db764","Type":"ContainerStarted","Data":"109a0dc3b12545c1b7867317e71c20cf01662f060e19c3067bb03d039c4e0570"} Mar 08 20:30:01 crc kubenswrapper[4885]: I0308 20:30:01.262206 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550030-trf4w" event={"ID":"b137f1c0-32d5-44b8-b0e3-a7ae07052e53","Type":"ContainerStarted","Data":"d6e17f7ef1179062915aa5879859b30f9c0d9c45962beb77f4bbb7539e1beaa0"} Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.633785 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.684351 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fttm9\" (UniqueName: \"kubernetes.io/projected/c873212d-4c8c-4d2c-ad89-be5ff96db764-kube-api-access-fttm9\") pod \"c873212d-4c8c-4d2c-ad89-be5ff96db764\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.684482 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c873212d-4c8c-4d2c-ad89-be5ff96db764-secret-volume\") pod \"c873212d-4c8c-4d2c-ad89-be5ff96db764\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.685075 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c873212d-4c8c-4d2c-ad89-be5ff96db764-config-volume\") pod \"c873212d-4c8c-4d2c-ad89-be5ff96db764\" (UID: \"c873212d-4c8c-4d2c-ad89-be5ff96db764\") " Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.685709 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c873212d-4c8c-4d2c-ad89-be5ff96db764-config-volume" (OuterVolumeSpecName: "config-volume") pod "c873212d-4c8c-4d2c-ad89-be5ff96db764" (UID: "c873212d-4c8c-4d2c-ad89-be5ff96db764"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.691762 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c873212d-4c8c-4d2c-ad89-be5ff96db764-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c873212d-4c8c-4d2c-ad89-be5ff96db764" (UID: "c873212d-4c8c-4d2c-ad89-be5ff96db764"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.693036 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c873212d-4c8c-4d2c-ad89-be5ff96db764-kube-api-access-fttm9" (OuterVolumeSpecName: "kube-api-access-fttm9") pod "c873212d-4c8c-4d2c-ad89-be5ff96db764" (UID: "c873212d-4c8c-4d2c-ad89-be5ff96db764"). InnerVolumeSpecName "kube-api-access-fttm9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.785961 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c873212d-4c8c-4d2c-ad89-be5ff96db764-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.786000 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c873212d-4c8c-4d2c-ad89-be5ff96db764-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 20:30:02 crc kubenswrapper[4885]: I0308 20:30:02.786013 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fttm9\" (UniqueName: \"kubernetes.io/projected/c873212d-4c8c-4d2c-ad89-be5ff96db764-kube-api-access-fttm9\") on node \"crc\" DevicePath \"\"" Mar 08 20:30:03 crc kubenswrapper[4885]: I0308 20:30:03.283852 4885 generic.go:334] "Generic (PLEG): container finished" podID="b137f1c0-32d5-44b8-b0e3-a7ae07052e53" containerID="f14e28da607b9cdf53f4fec9037b95180b5cd2506e58c13ecacb85cc348f41e4" exitCode=0 Mar 08 20:30:03 crc kubenswrapper[4885]: I0308 20:30:03.284007 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550030-trf4w" event={"ID":"b137f1c0-32d5-44b8-b0e3-a7ae07052e53","Type":"ContainerDied","Data":"f14e28da607b9cdf53f4fec9037b95180b5cd2506e58c13ecacb85cc348f41e4"} Mar 08 20:30:03 crc kubenswrapper[4885]: I0308 20:30:03.290979 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" event={"ID":"c873212d-4c8c-4d2c-ad89-be5ff96db764","Type":"ContainerDied","Data":"109a0dc3b12545c1b7867317e71c20cf01662f060e19c3067bb03d039c4e0570"} Mar 08 20:30:03 crc kubenswrapper[4885]: I0308 20:30:03.291048 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="109a0dc3b12545c1b7867317e71c20cf01662f060e19c3067bb03d039c4e0570" Mar 08 20:30:03 crc kubenswrapper[4885]: I0308 20:30:03.291151 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b" Mar 08 20:30:03 crc kubenswrapper[4885]: I0308 20:30:03.728569 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn"] Mar 08 20:30:03 crc kubenswrapper[4885]: I0308 20:30:03.735958 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29549985-v28gn"] Mar 08 20:30:04 crc kubenswrapper[4885]: I0308 20:30:04.628212 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550030-trf4w" Mar 08 20:30:04 crc kubenswrapper[4885]: I0308 20:30:04.722986 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q96rd\" (UniqueName: \"kubernetes.io/projected/b137f1c0-32d5-44b8-b0e3-a7ae07052e53-kube-api-access-q96rd\") pod \"b137f1c0-32d5-44b8-b0e3-a7ae07052e53\" (UID: \"b137f1c0-32d5-44b8-b0e3-a7ae07052e53\") " Mar 08 20:30:04 crc kubenswrapper[4885]: I0308 20:30:04.732991 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b137f1c0-32d5-44b8-b0e3-a7ae07052e53-kube-api-access-q96rd" (OuterVolumeSpecName: "kube-api-access-q96rd") pod "b137f1c0-32d5-44b8-b0e3-a7ae07052e53" (UID: "b137f1c0-32d5-44b8-b0e3-a7ae07052e53"). 
InnerVolumeSpecName "kube-api-access-q96rd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:30:04 crc kubenswrapper[4885]: I0308 20:30:04.824468 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q96rd\" (UniqueName: \"kubernetes.io/projected/b137f1c0-32d5-44b8-b0e3-a7ae07052e53-kube-api-access-q96rd\") on node \"crc\" DevicePath \"\"" Mar 08 20:30:05 crc kubenswrapper[4885]: I0308 20:30:05.311060 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550030-trf4w" event={"ID":"b137f1c0-32d5-44b8-b0e3-a7ae07052e53","Type":"ContainerDied","Data":"d6e17f7ef1179062915aa5879859b30f9c0d9c45962beb77f4bbb7539e1beaa0"} Mar 08 20:30:05 crc kubenswrapper[4885]: I0308 20:30:05.311117 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6e17f7ef1179062915aa5879859b30f9c0d9c45962beb77f4bbb7539e1beaa0" Mar 08 20:30:05 crc kubenswrapper[4885]: I0308 20:30:05.311171 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550030-trf4w" Mar 08 20:30:05 crc kubenswrapper[4885]: I0308 20:30:05.368812 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:30:05 crc kubenswrapper[4885]: I0308 20:30:05.385136 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8673a65-b7c8-4c06-9713-a095b399358a" path="/var/lib/kubelet/pods/f8673a65-b7c8-4c06-9713-a095b399358a/volumes" Mar 08 20:30:05 crc kubenswrapper[4885]: I0308 20:30:05.691380 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550024-9zz2p"] Mar 08 20:30:05 crc kubenswrapper[4885]: I0308 20:30:05.701501 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550024-9zz2p"] Mar 08 20:30:06 crc kubenswrapper[4885]: I0308 20:30:06.321955 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"d25a5af4779cf4a9087bc38f1595551232392f2b22b234a85d5d3906024eb796"} Mar 08 20:30:07 crc kubenswrapper[4885]: I0308 20:30:07.385584 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104" path="/var/lib/kubelet/pods/1c6f3f1b-eeb4-487f-b4b8-e50f19ecb104/volumes" Mar 08 20:31:01 crc kubenswrapper[4885]: I0308 20:31:01.699197 4885 scope.go:117] "RemoveContainer" containerID="dc7b1fe292df06f58ac62305ed639526799d6857e418c3744dffefa96ddd2209" Mar 08 20:31:01 crc kubenswrapper[4885]: I0308 20:31:01.733810 4885 scope.go:117] "RemoveContainer" containerID="43f525f1dbdfe3e17033373036412190ee626493c56e86fcd87bd80de646fc57" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.158621 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550032-rvpv6"] Mar 08 20:32:00 crc kubenswrapper[4885]: E0308 20:32:00.159739 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c873212d-4c8c-4d2c-ad89-be5ff96db764" containerName="collect-profiles" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.159764 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c873212d-4c8c-4d2c-ad89-be5ff96db764" containerName="collect-profiles" Mar 08 20:32:00 crc kubenswrapper[4885]: E0308 20:32:00.159788 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b137f1c0-32d5-44b8-b0e3-a7ae07052e53" containerName="oc" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.159802 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b137f1c0-32d5-44b8-b0e3-a7ae07052e53" containerName="oc" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.160059 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c873212d-4c8c-4d2c-ad89-be5ff96db764" containerName="collect-profiles" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.160098 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b137f1c0-32d5-44b8-b0e3-a7ae07052e53" containerName="oc" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.160817 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550032-rvpv6" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.165073 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.165570 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.167283 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.174210 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550032-rvpv6"] Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.361136 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flgv7\" (UniqueName: \"kubernetes.io/projected/d9ba305f-b091-419d-ba98-701437bceab1-kube-api-access-flgv7\") pod \"auto-csr-approver-29550032-rvpv6\" (UID: \"d9ba305f-b091-419d-ba98-701437bceab1\") " pod="openshift-infra/auto-csr-approver-29550032-rvpv6" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.463389 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flgv7\" (UniqueName: \"kubernetes.io/projected/d9ba305f-b091-419d-ba98-701437bceab1-kube-api-access-flgv7\") pod \"auto-csr-approver-29550032-rvpv6\" (UID: \"d9ba305f-b091-419d-ba98-701437bceab1\") " pod="openshift-infra/auto-csr-approver-29550032-rvpv6" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.495881 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flgv7\" (UniqueName: \"kubernetes.io/projected/d9ba305f-b091-419d-ba98-701437bceab1-kube-api-access-flgv7\") pod \"auto-csr-approver-29550032-rvpv6\" (UID: \"d9ba305f-b091-419d-ba98-701437bceab1\") " pod="openshift-infra/auto-csr-approver-29550032-rvpv6" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.521941 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550032-rvpv6" Mar 08 20:32:00 crc kubenswrapper[4885]: I0308 20:32:00.774350 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550032-rvpv6"] Mar 08 20:32:01 crc kubenswrapper[4885]: I0308 20:32:01.419442 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550032-rvpv6" event={"ID":"d9ba305f-b091-419d-ba98-701437bceab1","Type":"ContainerStarted","Data":"ae05435fd41fbf09dd6c4b2d5283f6b690a30cd8c1b3ed75a554b2d6e3b3794b"} Mar 08 20:32:02 crc kubenswrapper[4885]: I0308 20:32:02.432195 4885 generic.go:334] "Generic (PLEG): container finished" podID="d9ba305f-b091-419d-ba98-701437bceab1" containerID="bdf74962d126ba2f43f277a948cdef5d47d9a79c0f03133bc1218e4128ca8e51" exitCode=0 Mar 08 20:32:02 crc kubenswrapper[4885]: I0308 20:32:02.432493 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550032-rvpv6" event={"ID":"d9ba305f-b091-419d-ba98-701437bceab1","Type":"ContainerDied","Data":"bdf74962d126ba2f43f277a948cdef5d47d9a79c0f03133bc1218e4128ca8e51"} Mar 08 20:32:03 crc kubenswrapper[4885]: I0308 20:32:03.767137 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550032-rvpv6" Mar 08 20:32:03 crc kubenswrapper[4885]: I0308 20:32:03.920636 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flgv7\" (UniqueName: \"kubernetes.io/projected/d9ba305f-b091-419d-ba98-701437bceab1-kube-api-access-flgv7\") pod \"d9ba305f-b091-419d-ba98-701437bceab1\" (UID: \"d9ba305f-b091-419d-ba98-701437bceab1\") " Mar 08 20:32:03 crc kubenswrapper[4885]: I0308 20:32:03.928129 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9ba305f-b091-419d-ba98-701437bceab1-kube-api-access-flgv7" (OuterVolumeSpecName: "kube-api-access-flgv7") pod "d9ba305f-b091-419d-ba98-701437bceab1" (UID: "d9ba305f-b091-419d-ba98-701437bceab1"). InnerVolumeSpecName "kube-api-access-flgv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:32:04 crc kubenswrapper[4885]: I0308 20:32:04.022870 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flgv7\" (UniqueName: \"kubernetes.io/projected/d9ba305f-b091-419d-ba98-701437bceab1-kube-api-access-flgv7\") on node \"crc\" DevicePath \"\"" Mar 08 20:32:04 crc kubenswrapper[4885]: I0308 20:32:04.451680 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550032-rvpv6" event={"ID":"d9ba305f-b091-419d-ba98-701437bceab1","Type":"ContainerDied","Data":"ae05435fd41fbf09dd6c4b2d5283f6b690a30cd8c1b3ed75a554b2d6e3b3794b"} Mar 08 20:32:04 crc kubenswrapper[4885]: I0308 20:32:04.451788 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae05435fd41fbf09dd6c4b2d5283f6b690a30cd8c1b3ed75a554b2d6e3b3794b" Mar 08 20:32:04 crc kubenswrapper[4885]: I0308 20:32:04.451832 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550032-rvpv6" Mar 08 20:32:04 crc kubenswrapper[4885]: I0308 20:32:04.860684 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550026-vw7xs"] Mar 08 20:32:04 crc kubenswrapper[4885]: I0308 20:32:04.867552 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550026-vw7xs"] Mar 08 20:32:05 crc kubenswrapper[4885]: I0308 20:32:05.384891 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99bd2190-0abe-4434-b2f6-3707852e2d43" path="/var/lib/kubelet/pods/99bd2190-0abe-4434-b2f6-3707852e2d43/volumes" Mar 08 20:32:32 crc kubenswrapper[4885]: I0308 20:32:32.818713 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:32:32 crc kubenswrapper[4885]: I0308 20:32:32.819604 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:33:01 crc kubenswrapper[4885]: I0308 20:33:01.885716 4885 scope.go:117] "RemoveContainer" containerID="d37e225980dc2c4d236b82dd40a6a7c22fa7155e659179661d10d8de1d9aabfd" Mar 08 20:33:02 crc kubenswrapper[4885]: I0308 20:33:02.818381 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:33:02 crc kubenswrapper[4885]: I0308 20:33:02.818791 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:33:32 crc kubenswrapper[4885]: I0308 20:33:32.819128 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:33:32 crc kubenswrapper[4885]: I0308 20:33:32.819889 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:33:32 crc kubenswrapper[4885]: I0308 20:33:32.820006 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:33:32 crc kubenswrapper[4885]: I0308 20:33:32.821458 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d25a5af4779cf4a9087bc38f1595551232392f2b22b234a85d5d3906024eb796"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:33:32 crc kubenswrapper[4885]: I0308 20:33:32.821576 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://d25a5af4779cf4a9087bc38f1595551232392f2b22b234a85d5d3906024eb796" gracePeriod=600 Mar 08 20:33:33 crc kubenswrapper[4885]: I0308 20:33:33.283157 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="d25a5af4779cf4a9087bc38f1595551232392f2b22b234a85d5d3906024eb796" exitCode=0 Mar 08 20:33:33 crc kubenswrapper[4885]: I0308 20:33:33.283205 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"d25a5af4779cf4a9087bc38f1595551232392f2b22b234a85d5d3906024eb796"} Mar 08 20:33:33 crc kubenswrapper[4885]: I0308 20:33:33.283273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"} Mar 08 20:33:33 crc kubenswrapper[4885]: I0308 20:33:33.283301 4885 scope.go:117] "RemoveContainer" containerID="757e909a7b053ccd60009f591d9cf82708b30b20b8063ac70b759ef7cad8ca36" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.158487 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550034-jl6bg"] Mar 08 20:34:00 crc kubenswrapper[4885]: E0308 20:34:00.159530 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9ba305f-b091-419d-ba98-701437bceab1" containerName="oc" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.159554 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9ba305f-b091-419d-ba98-701437bceab1" containerName="oc" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.159801 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9ba305f-b091-419d-ba98-701437bceab1" containerName="oc" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.160573 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.163626 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.164052 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.164443 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.177955 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550034-jl6bg"] Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.267204 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqxt2\" (UniqueName: \"kubernetes.io/projected/d02d14a7-00be-4808-9f97-ac3c16ae727a-kube-api-access-jqxt2\") pod \"auto-csr-approver-29550034-jl6bg\" (UID: \"d02d14a7-00be-4808-9f97-ac3c16ae727a\") " pod="openshift-infra/auto-csr-approver-29550034-jl6bg" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.368857 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqxt2\" (UniqueName: \"kubernetes.io/projected/d02d14a7-00be-4808-9f97-ac3c16ae727a-kube-api-access-jqxt2\") pod \"auto-csr-approver-29550034-jl6bg\" (UID: \"d02d14a7-00be-4808-9f97-ac3c16ae727a\") " pod="openshift-infra/auto-csr-approver-29550034-jl6bg" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.401275 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqxt2\" (UniqueName: \"kubernetes.io/projected/d02d14a7-00be-4808-9f97-ac3c16ae727a-kube-api-access-jqxt2\") pod \"auto-csr-approver-29550034-jl6bg\" (UID: \"d02d14a7-00be-4808-9f97-ac3c16ae727a\") " pod="openshift-infra/auto-csr-approver-29550034-jl6bg" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.499300 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.779437 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550034-jl6bg"] Mar 08 20:34:00 crc kubenswrapper[4885]: I0308 20:34:00.785642 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:34:01 crc kubenswrapper[4885]: I0308 20:34:01.558780 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" event={"ID":"d02d14a7-00be-4808-9f97-ac3c16ae727a","Type":"ContainerStarted","Data":"8735bc76cb7f7581185453813c78ba75fd8707fc04a1cec4e153ee10ab492db1"} Mar 08 20:34:02 crc kubenswrapper[4885]: I0308 20:34:02.571601 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" event={"ID":"d02d14a7-00be-4808-9f97-ac3c16ae727a","Type":"ContainerStarted","Data":"4ed950f44e01488b5e84b2e6cb1b702242d1797af5d3e1be27eec2846e142c48"} Mar 08 20:34:02 crc kubenswrapper[4885]: I0308 20:34:02.592747 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" podStartSLOduration=1.276775999 podStartE2EDuration="2.592714429s" podCreationTimestamp="2026-03-08 20:34:00 +0000 UTC" firstStartedPulling="2026-03-08 20:34:00.785293325 +0000 UTC m=+3742.181347358" lastFinishedPulling="2026-03-08 20:34:02.101231735 +0000 UTC m=+3743.497285788" observedRunningTime="2026-03-08 20:34:02.589215116 +0000 UTC m=+3743.985269199" watchObservedRunningTime="2026-03-08 20:34:02.592714429 +0000 UTC m=+3743.988768492" Mar 08 20:34:03 crc kubenswrapper[4885]: I0308 20:34:03.582796 4885 generic.go:334] "Generic (PLEG): container finished" podID="d02d14a7-00be-4808-9f97-ac3c16ae727a" containerID="4ed950f44e01488b5e84b2e6cb1b702242d1797af5d3e1be27eec2846e142c48" exitCode=0 Mar 08 20:34:03 crc kubenswrapper[4885]: I0308 20:34:03.583006 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" event={"ID":"d02d14a7-00be-4808-9f97-ac3c16ae727a","Type":"ContainerDied","Data":"4ed950f44e01488b5e84b2e6cb1b702242d1797af5d3e1be27eec2846e142c48"} Mar 08 20:34:04 crc kubenswrapper[4885]: I0308 20:34:04.957666 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.051690 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqxt2\" (UniqueName: \"kubernetes.io/projected/d02d14a7-00be-4808-9f97-ac3c16ae727a-kube-api-access-jqxt2\") pod \"d02d14a7-00be-4808-9f97-ac3c16ae727a\" (UID: \"d02d14a7-00be-4808-9f97-ac3c16ae727a\") " Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.060901 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02d14a7-00be-4808-9f97-ac3c16ae727a-kube-api-access-jqxt2" (OuterVolumeSpecName: "kube-api-access-jqxt2") pod "d02d14a7-00be-4808-9f97-ac3c16ae727a" (UID: "d02d14a7-00be-4808-9f97-ac3c16ae727a"). InnerVolumeSpecName "kube-api-access-jqxt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.153996 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqxt2\" (UniqueName: \"kubernetes.io/projected/d02d14a7-00be-4808-9f97-ac3c16ae727a-kube-api-access-jqxt2\") on node \"crc\" DevicePath \"\"" Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.604794 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" event={"ID":"d02d14a7-00be-4808-9f97-ac3c16ae727a","Type":"ContainerDied","Data":"8735bc76cb7f7581185453813c78ba75fd8707fc04a1cec4e153ee10ab492db1"} Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.605148 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8735bc76cb7f7581185453813c78ba75fd8707fc04a1cec4e153ee10ab492db1" Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.605021 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550034-jl6bg" Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.688648 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550028-gfnbr"] Mar 08 20:34:05 crc kubenswrapper[4885]: I0308 20:34:05.694969 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550028-gfnbr"] Mar 08 20:34:07 crc kubenswrapper[4885]: I0308 20:34:07.384861 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67320c5c-bbf3-4828-8a46-effb28e4d9a1" path="/var/lib/kubelet/pods/67320c5c-bbf3-4828-8a46-effb28e4d9a1/volumes" Mar 08 20:35:02 crc kubenswrapper[4885]: I0308 20:35:02.033481 4885 scope.go:117] "RemoveContainer" containerID="ef07ccde27b0d39793669ff65783adc2f054456be78c379a19e61fa8a99b06b3" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.485402 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kwm5v"] Mar 08 20:35:22 crc kubenswrapper[4885]: E0308 20:35:22.487234 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02d14a7-00be-4808-9f97-ac3c16ae727a" containerName="oc" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.487255 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02d14a7-00be-4808-9f97-ac3c16ae727a" containerName="oc" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.487459 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02d14a7-00be-4808-9f97-ac3c16ae727a" containerName="oc" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.489146 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.503751 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwm5v"] Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.506408 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p62bz\" (UniqueName: \"kubernetes.io/projected/668c7890-77ad-445e-bee1-d40844c077ce-kube-api-access-p62bz\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.506454 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-catalog-content\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.506477 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-utilities\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.607972 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p62bz\" (UniqueName: \"kubernetes.io/projected/668c7890-77ad-445e-bee1-d40844c077ce-kube-api-access-p62bz\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.608029 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-catalog-content\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.608054 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-utilities\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.608623 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-utilities\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.608807 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-catalog-content\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.642576 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-p62bz\" (UniqueName: \"kubernetes.io/projected/668c7890-77ad-445e-bee1-d40844c077ce-kube-api-access-p62bz\") pod \"redhat-operators-kwm5v\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:22 crc kubenswrapper[4885]: I0308 20:35:22.856761 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:23 crc kubenswrapper[4885]: I0308 20:35:23.322430 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwm5v"] Mar 08 20:35:23 crc kubenswrapper[4885]: I0308 20:35:23.379972 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwm5v" event={"ID":"668c7890-77ad-445e-bee1-d40844c077ce","Type":"ContainerStarted","Data":"95664d8f3885947db08492e1323e820bc6d9db6195795acb46f2951a840b2a88"} Mar 08 20:35:24 crc kubenswrapper[4885]: I0308 20:35:24.391094 4885 generic.go:334] "Generic (PLEG): container finished" podID="668c7890-77ad-445e-bee1-d40844c077ce" containerID="a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3" exitCode=0 Mar 08 20:35:24 crc kubenswrapper[4885]: I0308 20:35:24.391178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwm5v" event={"ID":"668c7890-77ad-445e-bee1-d40844c077ce","Type":"ContainerDied","Data":"a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3"} Mar 08 20:35:24 crc kubenswrapper[4885]: I0308 20:35:24.859603 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-76dq7"] Mar 08 20:35:24 crc kubenswrapper[4885]: I0308 20:35:24.862519 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:24 crc kubenswrapper[4885]: I0308 20:35:24.879328 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-76dq7"] Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.044638 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-catalog-content\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.045004 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6v92\" (UniqueName: \"kubernetes.io/projected/091a1ce3-4352-409e-aa25-b111c2b266f2-kube-api-access-r6v92\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.045435 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-utilities\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.147071 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-catalog-content\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.147154 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6v92\" (UniqueName: \"kubernetes.io/projected/091a1ce3-4352-409e-aa25-b111c2b266f2-kube-api-access-r6v92\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.147201 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-utilities\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.147579 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-catalog-content\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.147850 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-utilities\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.174076 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-r6v92\" (UniqueName: \"kubernetes.io/projected/091a1ce3-4352-409e-aa25-b111c2b266f2-kube-api-access-r6v92\") pod \"certified-operators-76dq7\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.198052 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.403267 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwm5v" event={"ID":"668c7890-77ad-445e-bee1-d40844c077ce","Type":"ContainerStarted","Data":"95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741"} Mar 08 20:35:25 crc kubenswrapper[4885]: I0308 20:35:25.498812 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-76dq7"] Mar 08 20:35:26 crc kubenswrapper[4885]: I0308 20:35:26.418030 4885 generic.go:334] "Generic (PLEG): container finished" podID="668c7890-77ad-445e-bee1-d40844c077ce" containerID="95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741" exitCode=0 Mar 08 20:35:26 crc kubenswrapper[4885]: I0308 20:35:26.418163 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwm5v" event={"ID":"668c7890-77ad-445e-bee1-d40844c077ce","Type":"ContainerDied","Data":"95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741"} Mar 08 20:35:26 crc kubenswrapper[4885]: I0308 20:35:26.433227 4885 generic.go:334] "Generic (PLEG): container finished" podID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerID="27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe" exitCode=0 Mar 08 20:35:26 crc kubenswrapper[4885]: I0308 20:35:26.433378 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76dq7" event={"ID":"091a1ce3-4352-409e-aa25-b111c2b266f2","Type":"ContainerDied","Data":"27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe"} Mar 08 20:35:26 crc kubenswrapper[4885]: I0308 20:35:26.433585 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76dq7" event={"ID":"091a1ce3-4352-409e-aa25-b111c2b266f2","Type":"ContainerStarted","Data":"04c997db414c675653cb091d0ddce7408e517b59dd141334c3f4342fbea0c087"} Mar 08 20:35:27 crc kubenswrapper[4885]: I0308 20:35:27.441046 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwm5v" event={"ID":"668c7890-77ad-445e-bee1-d40844c077ce","Type":"ContainerStarted","Data":"0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188"} Mar 08 20:35:27 crc kubenswrapper[4885]: I0308 20:35:27.442605 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76dq7" event={"ID":"091a1ce3-4352-409e-aa25-b111c2b266f2","Type":"ContainerStarted","Data":"236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339"} Mar 08 20:35:27 crc kubenswrapper[4885]: I0308 20:35:27.481816 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kwm5v" podStartSLOduration=3.050130042 podStartE2EDuration="5.481801455s" podCreationTimestamp="2026-03-08 20:35:22 +0000 UTC" firstStartedPulling="2026-03-08 20:35:24.392977204 +0000 UTC m=+3825.789031267" lastFinishedPulling="2026-03-08 20:35:26.824648617 
+0000 UTC m=+3828.220702680" observedRunningTime="2026-03-08 20:35:27.463323322 +0000 UTC m=+3828.859377355" watchObservedRunningTime="2026-03-08 20:35:27.481801455 +0000 UTC m=+3828.877855478" Mar 08 20:35:28 crc kubenswrapper[4885]: I0308 20:35:28.455346 4885 generic.go:334] "Generic (PLEG): container finished" podID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerID="236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339" exitCode=0 Mar 08 20:35:28 crc kubenswrapper[4885]: I0308 20:35:28.455512 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76dq7" event={"ID":"091a1ce3-4352-409e-aa25-b111c2b266f2","Type":"ContainerDied","Data":"236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339"} Mar 08 20:35:29 crc kubenswrapper[4885]: I0308 20:35:29.468123 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76dq7" event={"ID":"091a1ce3-4352-409e-aa25-b111c2b266f2","Type":"ContainerStarted","Data":"e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113"} Mar 08 20:35:29 crc kubenswrapper[4885]: I0308 20:35:29.502467 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-76dq7" podStartSLOduration=3.012944995 podStartE2EDuration="5.502444192s" podCreationTimestamp="2026-03-08 20:35:24 +0000 UTC" firstStartedPulling="2026-03-08 20:35:26.436950735 +0000 UTC m=+3827.833004788" lastFinishedPulling="2026-03-08 20:35:28.926449932 +0000 UTC m=+3830.322503985" observedRunningTime="2026-03-08 20:35:29.495887678 +0000 UTC m=+3830.891941731" watchObservedRunningTime="2026-03-08 20:35:29.502444192 +0000 UTC m=+3830.898498245" Mar 08 20:35:32 crc kubenswrapper[4885]: I0308 20:35:32.857031 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:32 crc kubenswrapper[4885]: I0308 20:35:32.858169 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:33 crc kubenswrapper[4885]: I0308 20:35:33.916704 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kwm5v" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="registry-server" probeResult="failure" output=< Mar 08 20:35:33 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 20:35:33 crc kubenswrapper[4885]: > Mar 08 20:35:35 crc kubenswrapper[4885]: I0308 20:35:35.199391 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:35 crc kubenswrapper[4885]: I0308 20:35:35.199483 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:35 crc kubenswrapper[4885]: I0308 20:35:35.271075 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:35 crc kubenswrapper[4885]: I0308 20:35:35.593595 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:35 crc kubenswrapper[4885]: I0308 20:35:35.658542 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-76dq7"] Mar 08 20:35:37 crc kubenswrapper[4885]: I0308 20:35:37.546854 4885 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/certified-operators-76dq7" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="registry-server" containerID="cri-o://e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113" gracePeriod=2 Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.096124 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.265843 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-utilities\") pod \"091a1ce3-4352-409e-aa25-b111c2b266f2\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.267490 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-utilities" (OuterVolumeSpecName: "utilities") pod "091a1ce3-4352-409e-aa25-b111c2b266f2" (UID: "091a1ce3-4352-409e-aa25-b111c2b266f2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.267862 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-catalog-content\") pod \"091a1ce3-4352-409e-aa25-b111c2b266f2\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.268003 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6v92\" (UniqueName: \"kubernetes.io/projected/091a1ce3-4352-409e-aa25-b111c2b266f2-kube-api-access-r6v92\") pod \"091a1ce3-4352-409e-aa25-b111c2b266f2\" (UID: \"091a1ce3-4352-409e-aa25-b111c2b266f2\") " Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.268762 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.276048 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/091a1ce3-4352-409e-aa25-b111c2b266f2-kube-api-access-r6v92" (OuterVolumeSpecName: "kube-api-access-r6v92") pod "091a1ce3-4352-409e-aa25-b111c2b266f2" (UID: "091a1ce3-4352-409e-aa25-b111c2b266f2"). InnerVolumeSpecName "kube-api-access-r6v92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.370821 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6v92\" (UniqueName: \"kubernetes.io/projected/091a1ce3-4352-409e-aa25-b111c2b266f2-kube-api-access-r6v92\") on node \"crc\" DevicePath \"\"" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.510482 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "091a1ce3-4352-409e-aa25-b111c2b266f2" (UID: "091a1ce3-4352-409e-aa25-b111c2b266f2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.563504 4885 generic.go:334] "Generic (PLEG): container finished" podID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerID="e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113" exitCode=0 Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.563561 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76dq7" event={"ID":"091a1ce3-4352-409e-aa25-b111c2b266f2","Type":"ContainerDied","Data":"e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113"} Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.563618 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-76dq7" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.563644 4885 scope.go:117] "RemoveContainer" containerID="e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.563624 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-76dq7" event={"ID":"091a1ce3-4352-409e-aa25-b111c2b266f2","Type":"ContainerDied","Data":"04c997db414c675653cb091d0ddce7408e517b59dd141334c3f4342fbea0c087"} Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.575630 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/091a1ce3-4352-409e-aa25-b111c2b266f2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.597243 4885 scope.go:117] "RemoveContainer" containerID="236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.620281 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-76dq7"] Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.631584 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-76dq7"] Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.640534 4885 scope.go:117] "RemoveContainer" containerID="27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.684390 4885 scope.go:117] "RemoveContainer" containerID="e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113" Mar 08 20:35:38 crc kubenswrapper[4885]: E0308 20:35:38.684986 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113\": container with ID starting with e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113 not found: ID does not exist" containerID="e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.685039 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113"} err="failed to get container status \"e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113\": rpc error: code = NotFound desc = could not find container \"e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113\": container with ID starting with e17db25fd69b9aa046d838a62b69e094d69563e43516cd18c28655dd4a7e7113 not found: ID does not exist" Mar 08 
20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.685071 4885 scope.go:117] "RemoveContainer" containerID="236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339" Mar 08 20:35:38 crc kubenswrapper[4885]: E0308 20:35:38.685527 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339\": container with ID starting with 236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339 not found: ID does not exist" containerID="236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.685566 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339"} err="failed to get container status \"236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339\": rpc error: code = NotFound desc = could not find container \"236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339\": container with ID starting with 236bf356f8096882979743e2d957b3b6c85850cc70ccfb44752ea1864c9da339 not found: ID does not exist" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.685622 4885 scope.go:117] "RemoveContainer" containerID="27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe" Mar 08 20:35:38 crc kubenswrapper[4885]: E0308 20:35:38.686199 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe\": container with ID starting with 27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe not found: ID does not exist" containerID="27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe" Mar 08 20:35:38 crc kubenswrapper[4885]: I0308 20:35:38.686259 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe"} err="failed to get container status \"27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe\": rpc error: code = NotFound desc = could not find container \"27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe\": container with ID starting with 27dad427a57334b15cb12d174924279eb093a70b8e6a330642651629af3962fe not found: ID does not exist" Mar 08 20:35:39 crc kubenswrapper[4885]: I0308 20:35:39.391739 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" path="/var/lib/kubelet/pods/091a1ce3-4352-409e-aa25-b111c2b266f2/volumes" Mar 08 20:35:42 crc kubenswrapper[4885]: I0308 20:35:42.932599 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:43 crc kubenswrapper[4885]: I0308 20:35:43.011434 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:43 crc kubenswrapper[4885]: I0308 20:35:43.180387 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwm5v"] Mar 08 20:35:44 crc kubenswrapper[4885]: I0308 20:35:44.623366 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kwm5v" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="registry-server" 
containerID="cri-o://0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188" gracePeriod=2 Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.095185 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.186480 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-utilities\") pod \"668c7890-77ad-445e-bee1-d40844c077ce\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.186602 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-catalog-content\") pod \"668c7890-77ad-445e-bee1-d40844c077ce\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.186712 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p62bz\" (UniqueName: \"kubernetes.io/projected/668c7890-77ad-445e-bee1-d40844c077ce-kube-api-access-p62bz\") pod \"668c7890-77ad-445e-bee1-d40844c077ce\" (UID: \"668c7890-77ad-445e-bee1-d40844c077ce\") " Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.188779 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-utilities" (OuterVolumeSpecName: "utilities") pod "668c7890-77ad-445e-bee1-d40844c077ce" (UID: "668c7890-77ad-445e-bee1-d40844c077ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.192777 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/668c7890-77ad-445e-bee1-d40844c077ce-kube-api-access-p62bz" (OuterVolumeSpecName: "kube-api-access-p62bz") pod "668c7890-77ad-445e-bee1-d40844c077ce" (UID: "668c7890-77ad-445e-bee1-d40844c077ce"). InnerVolumeSpecName "kube-api-access-p62bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.288729 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.288794 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p62bz\" (UniqueName: \"kubernetes.io/projected/668c7890-77ad-445e-bee1-d40844c077ce-kube-api-access-p62bz\") on node \"crc\" DevicePath \"\"" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.416596 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "668c7890-77ad-445e-bee1-d40844c077ce" (UID: "668c7890-77ad-445e-bee1-d40844c077ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.491800 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/668c7890-77ad-445e-bee1-d40844c077ce-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.635069 4885 generic.go:334] "Generic (PLEG): container finished" podID="668c7890-77ad-445e-bee1-d40844c077ce" containerID="0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188" exitCode=0 Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.635150 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwm5v" event={"ID":"668c7890-77ad-445e-bee1-d40844c077ce","Type":"ContainerDied","Data":"0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188"} Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.635176 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwm5v" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.635182 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwm5v" event={"ID":"668c7890-77ad-445e-bee1-d40844c077ce","Type":"ContainerDied","Data":"95664d8f3885947db08492e1323e820bc6d9db6195795acb46f2951a840b2a88"} Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.635229 4885 scope.go:117] "RemoveContainer" containerID="0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.674795 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwm5v"] Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.682082 4885 scope.go:117] "RemoveContainer" containerID="95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.685534 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kwm5v"] Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.712307 4885 scope.go:117] "RemoveContainer" containerID="a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.747747 4885 scope.go:117] "RemoveContainer" containerID="0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188" Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.748396 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188\": container with ID starting with 0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188 not found: ID does not exist" containerID="0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.748467 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188"} err="failed to get container status \"0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188\": rpc error: code = NotFound desc = could not find container \"0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188\": container with ID starting with 0ac438ca491d494193032a59393a23f4b138d27549d93dbf77cfaadcec9f7188 not found: ID does not exist" Mar 08 20:35:45 crc 
kubenswrapper[4885]: I0308 20:35:45.748512 4885 scope.go:117] "RemoveContainer" containerID="95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741" Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.749077 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741\": container with ID starting with 95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741 not found: ID does not exist" containerID="95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.749162 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741"} err="failed to get container status \"95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741\": rpc error: code = NotFound desc = could not find container \"95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741\": container with ID starting with 95f5857e2654ad88a753aa272918092ff6891ac2da9f84a26575bbd41e914741 not found: ID does not exist" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.749201 4885 scope.go:117] "RemoveContainer" containerID="a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3" Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.749700 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3\": container with ID starting with a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3 not found: ID does not exist" containerID="a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.749735 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3"} err="failed to get container status \"a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3\": rpc error: code = NotFound desc = could not find container \"a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3\": container with ID starting with a2e29367b771622671d6bfd7133226a2773bd47dffad768ef326f8d7a6b06bb3 not found: ID does not exist" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.993307 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdb6"] Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.994539 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="registry-server" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.994612 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="registry-server" Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.994664 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="extract-utilities" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.994683 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="extract-utilities" Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.994699 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="extract-content" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.994714 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="extract-content" Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.994740 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="registry-server" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.994758 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="registry-server" Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.994799 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="extract-content" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.994815 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="extract-content" Mar 08 20:35:45 crc kubenswrapper[4885]: E0308 20:35:45.994854 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="extract-utilities" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.994871 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="extract-utilities" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.995396 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="668c7890-77ad-445e-bee1-d40844c077ce" containerName="registry-server" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.995442 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="091a1ce3-4352-409e-aa25-b111c2b266f2" containerName="registry-server" Mar 08 20:35:45 crc kubenswrapper[4885]: I0308 20:35:45.997410 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.008240 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdb6"] Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.102095 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-utilities\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.102157 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-catalog-content\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.102205 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p24zl\" (UniqueName: \"kubernetes.io/projected/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-kube-api-access-p24zl\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.203658 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-utilities\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.203730 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-catalog-content\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.203782 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p24zl\" (UniqueName: \"kubernetes.io/projected/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-kube-api-access-p24zl\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.204302 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-utilities\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.204534 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-catalog-content\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.230142 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-p24zl\" (UniqueName: \"kubernetes.io/projected/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-kube-api-access-p24zl\") pod \"redhat-marketplace-6pdb6\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.332706 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:46 crc kubenswrapper[4885]: I0308 20:35:46.671541 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdb6"] Mar 08 20:35:47 crc kubenswrapper[4885]: I0308 20:35:47.379816 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="668c7890-77ad-445e-bee1-d40844c077ce" path="/var/lib/kubelet/pods/668c7890-77ad-445e-bee1-d40844c077ce/volumes" Mar 08 20:35:47 crc kubenswrapper[4885]: I0308 20:35:47.656999 4885 generic.go:334] "Generic (PLEG): container finished" podID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerID="3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96" exitCode=0 Mar 08 20:35:47 crc kubenswrapper[4885]: I0308 20:35:47.657070 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdb6" event={"ID":"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb","Type":"ContainerDied","Data":"3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96"} Mar 08 20:35:47 crc kubenswrapper[4885]: I0308 20:35:47.657113 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdb6" event={"ID":"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb","Type":"ContainerStarted","Data":"c5ee7465965ec77635d8950124463e0afc190160350b13cc77ce43e82b2b54fd"} Mar 08 20:35:48 crc kubenswrapper[4885]: I0308 20:35:48.674807 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdb6" event={"ID":"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb","Type":"ContainerStarted","Data":"e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f"} Mar 08 20:35:49 crc kubenswrapper[4885]: I0308 20:35:49.686819 4885 generic.go:334] "Generic (PLEG): container finished" podID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerID="e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f" exitCode=0 Mar 08 20:35:49 crc kubenswrapper[4885]: I0308 20:35:49.686981 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdb6" event={"ID":"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb","Type":"ContainerDied","Data":"e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f"} Mar 08 20:35:50 crc kubenswrapper[4885]: I0308 20:35:50.698747 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdb6" event={"ID":"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb","Type":"ContainerStarted","Data":"ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8"} Mar 08 20:35:50 crc kubenswrapper[4885]: I0308 20:35:50.727632 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6pdb6" podStartSLOduration=3.285112199 podStartE2EDuration="5.727607851s" podCreationTimestamp="2026-03-08 20:35:45 +0000 UTC" firstStartedPulling="2026-03-08 20:35:47.660036788 +0000 UTC m=+3849.056090841" lastFinishedPulling="2026-03-08 20:35:50.10253243 +0000 UTC m=+3851.498586493" observedRunningTime="2026-03-08 20:35:50.724846178 +0000 UTC 
m=+3852.120900241" watchObservedRunningTime="2026-03-08 20:35:50.727607851 +0000 UTC m=+3852.123661904" Mar 08 20:35:56 crc kubenswrapper[4885]: I0308 20:35:56.332982 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:56 crc kubenswrapper[4885]: I0308 20:35:56.333608 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:56 crc kubenswrapper[4885]: I0308 20:35:56.410141 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:56 crc kubenswrapper[4885]: I0308 20:35:56.826423 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:56 crc kubenswrapper[4885]: I0308 20:35:56.883068 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdb6"] Mar 08 20:35:58 crc kubenswrapper[4885]: I0308 20:35:58.777391 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6pdb6" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="registry-server" containerID="cri-o://ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8" gracePeriod=2 Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.318594 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.439590 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-utilities\") pod \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.439813 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p24zl\" (UniqueName: \"kubernetes.io/projected/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-kube-api-access-p24zl\") pod \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.439870 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-catalog-content\") pod \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\" (UID: \"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb\") " Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.441311 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-utilities" (OuterVolumeSpecName: "utilities") pod "552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" (UID: "552d2b0b-b147-4d5a-92d4-0c01c9a14cfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.450176 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-kube-api-access-p24zl" (OuterVolumeSpecName: "kube-api-access-p24zl") pod "552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" (UID: "552d2b0b-b147-4d5a-92d4-0c01c9a14cfb"). InnerVolumeSpecName "kube-api-access-p24zl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.493142 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" (UID: "552d2b0b-b147-4d5a-92d4-0c01c9a14cfb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.541803 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p24zl\" (UniqueName: \"kubernetes.io/projected/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-kube-api-access-p24zl\") on node \"crc\" DevicePath \"\"" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.542023 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.542232 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.790985 4885 generic.go:334] "Generic (PLEG): container finished" podID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerID="ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8" exitCode=0 Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.791059 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdb6" event={"ID":"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb","Type":"ContainerDied","Data":"ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8"} Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.791147 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6pdb6" event={"ID":"552d2b0b-b147-4d5a-92d4-0c01c9a14cfb","Type":"ContainerDied","Data":"c5ee7465965ec77635d8950124463e0afc190160350b13cc77ce43e82b2b54fd"} Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.791185 4885 scope.go:117] "RemoveContainer" containerID="ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.791188 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6pdb6" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.828640 4885 scope.go:117] "RemoveContainer" containerID="e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.855404 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdb6"] Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.867888 4885 scope.go:117] "RemoveContainer" containerID="3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.870201 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6pdb6"] Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.905801 4885 scope.go:117] "RemoveContainer" containerID="ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8" Mar 08 20:35:59 crc kubenswrapper[4885]: E0308 20:35:59.906689 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8\": container with ID starting with ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8 not found: ID does not exist" containerID="ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.907034 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8"} err="failed to get container status \"ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8\": rpc error: code = NotFound desc = could not find container \"ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8\": container with ID starting with ce9e6565dadea382c28e349f86d92b773ca9c98e0d6471618ce86fe7590f1ba8 not found: ID does not exist" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.907910 4885 scope.go:117] "RemoveContainer" containerID="e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f" Mar 08 20:35:59 crc kubenswrapper[4885]: E0308 20:35:59.908824 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f\": container with ID starting with e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f not found: ID does not exist" containerID="e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.908890 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f"} err="failed to get container status \"e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f\": rpc error: code = NotFound desc = could not find container \"e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f\": container with ID starting with e7181fc70e1cc5ec7a05e73e986298d79beb91df7e33b7661d7abc2234cf1c1f not found: ID does not exist" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.908965 4885 scope.go:117] "RemoveContainer" containerID="3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96" Mar 08 20:35:59 crc kubenswrapper[4885]: E0308 20:35:59.909462 4885 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96\": container with ID starting with 3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96 not found: ID does not exist" containerID="3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96" Mar 08 20:35:59 crc kubenswrapper[4885]: I0308 20:35:59.909515 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96"} err="failed to get container status \"3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96\": rpc error: code = NotFound desc = could not find container \"3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96\": container with ID starting with 3807a6a63dcfcfb3a3d0528105f8c1fa5c145a4ed48a23912706fd4c616b9e96 not found: ID does not exist" Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.157750 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550036-mmj2c"] Mar 08 20:36:00 crc kubenswrapper[4885]: E0308 20:36:00.158305 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="registry-server" Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.158318 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="registry-server" Mar 08 20:36:00 crc kubenswrapper[4885]: E0308 20:36:00.158340 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="extract-content" Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.158348 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="extract-content" Mar 08 20:36:00 crc kubenswrapper[4885]: E0308 20:36:00.158361 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="extract-utilities" Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.158367 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="extract-utilities" Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.158512 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" containerName="registry-server" Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.158928 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550036-mmj2c" Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.163355 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.163576 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.164053 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.226346 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550036-mmj2c"] Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.253805 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggcgc\" (UniqueName: \"kubernetes.io/projected/4ba33559-23ce-4dec-a0fb-d479e47d6f1c-kube-api-access-ggcgc\") pod \"auto-csr-approver-29550036-mmj2c\" (UID: \"4ba33559-23ce-4dec-a0fb-d479e47d6f1c\") " pod="openshift-infra/auto-csr-approver-29550036-mmj2c" Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.355034 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggcgc\" (UniqueName: \"kubernetes.io/projected/4ba33559-23ce-4dec-a0fb-d479e47d6f1c-kube-api-access-ggcgc\") pod \"auto-csr-approver-29550036-mmj2c\" (UID: \"4ba33559-23ce-4dec-a0fb-d479e47d6f1c\") " pod="openshift-infra/auto-csr-approver-29550036-mmj2c" Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.379454 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggcgc\" (UniqueName: \"kubernetes.io/projected/4ba33559-23ce-4dec-a0fb-d479e47d6f1c-kube-api-access-ggcgc\") pod \"auto-csr-approver-29550036-mmj2c\" (UID: \"4ba33559-23ce-4dec-a0fb-d479e47d6f1c\") " pod="openshift-infra/auto-csr-approver-29550036-mmj2c" Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.532510 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550036-mmj2c" Mar 08 20:36:00 crc kubenswrapper[4885]: I0308 20:36:00.812810 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550036-mmj2c"] Mar 08 20:36:01 crc kubenswrapper[4885]: I0308 20:36:01.383476 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="552d2b0b-b147-4d5a-92d4-0c01c9a14cfb" path="/var/lib/kubelet/pods/552d2b0b-b147-4d5a-92d4-0c01c9a14cfb/volumes" Mar 08 20:36:01 crc kubenswrapper[4885]: I0308 20:36:01.821665 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550036-mmj2c" event={"ID":"4ba33559-23ce-4dec-a0fb-d479e47d6f1c","Type":"ContainerStarted","Data":"c0b3dbd75ade62cc9e4c59bf23c43bb523fd418630f7cc03d789576b7a9de4d5"} Mar 08 20:36:02 crc kubenswrapper[4885]: I0308 20:36:02.818735 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:36:02 crc kubenswrapper[4885]: I0308 20:36:02.819156 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:36:02 crc kubenswrapper[4885]: I0308 20:36:02.834761 4885 generic.go:334] "Generic (PLEG): container finished" podID="4ba33559-23ce-4dec-a0fb-d479e47d6f1c" containerID="4655c509628eeb67ead0ef189a244c640e2a8d6513d351ea58b54fb12caa9de4" exitCode=0 Mar 08 20:36:02 crc kubenswrapper[4885]: I0308 20:36:02.834845 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550036-mmj2c" event={"ID":"4ba33559-23ce-4dec-a0fb-d479e47d6f1c","Type":"ContainerDied","Data":"4655c509628eeb67ead0ef189a244c640e2a8d6513d351ea58b54fb12caa9de4"} Mar 08 20:36:04 crc kubenswrapper[4885]: I0308 20:36:04.269135 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550036-mmj2c" Mar 08 20:36:04 crc kubenswrapper[4885]: I0308 20:36:04.417730 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggcgc\" (UniqueName: \"kubernetes.io/projected/4ba33559-23ce-4dec-a0fb-d479e47d6f1c-kube-api-access-ggcgc\") pod \"4ba33559-23ce-4dec-a0fb-d479e47d6f1c\" (UID: \"4ba33559-23ce-4dec-a0fb-d479e47d6f1c\") " Mar 08 20:36:04 crc kubenswrapper[4885]: I0308 20:36:04.426267 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ba33559-23ce-4dec-a0fb-d479e47d6f1c-kube-api-access-ggcgc" (OuterVolumeSpecName: "kube-api-access-ggcgc") pod "4ba33559-23ce-4dec-a0fb-d479e47d6f1c" (UID: "4ba33559-23ce-4dec-a0fb-d479e47d6f1c"). InnerVolumeSpecName "kube-api-access-ggcgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:36:04 crc kubenswrapper[4885]: I0308 20:36:04.524387 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggcgc\" (UniqueName: \"kubernetes.io/projected/4ba33559-23ce-4dec-a0fb-d479e47d6f1c-kube-api-access-ggcgc\") on node \"crc\" DevicePath \"\"" Mar 08 20:36:04 crc kubenswrapper[4885]: I0308 20:36:04.856717 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550036-mmj2c" event={"ID":"4ba33559-23ce-4dec-a0fb-d479e47d6f1c","Type":"ContainerDied","Data":"c0b3dbd75ade62cc9e4c59bf23c43bb523fd418630f7cc03d789576b7a9de4d5"} Mar 08 20:36:04 crc kubenswrapper[4885]: I0308 20:36:04.856777 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0b3dbd75ade62cc9e4c59bf23c43bb523fd418630f7cc03d789576b7a9de4d5" Mar 08 20:36:04 crc kubenswrapper[4885]: I0308 20:36:04.856795 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550036-mmj2c" Mar 08 20:36:05 crc kubenswrapper[4885]: I0308 20:36:05.356201 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550030-trf4w"] Mar 08 20:36:05 crc kubenswrapper[4885]: I0308 20:36:05.384561 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550030-trf4w"] Mar 08 20:36:07 crc kubenswrapper[4885]: I0308 20:36:07.386196 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b137f1c0-32d5-44b8-b0e3-a7ae07052e53" path="/var/lib/kubelet/pods/b137f1c0-32d5-44b8-b0e3-a7ae07052e53/volumes" Mar 08 20:36:32 crc kubenswrapper[4885]: I0308 20:36:32.818805 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:36:32 crc kubenswrapper[4885]: I0308 20:36:32.819813 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:37:02 crc kubenswrapper[4885]: I0308 20:37:02.232593 4885 scope.go:117] "RemoveContainer" containerID="f14e28da607b9cdf53f4fec9037b95180b5cd2506e58c13ecacb85cc348f41e4" Mar 08 20:37:02 crc kubenswrapper[4885]: I0308 20:37:02.818741 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:37:02 crc kubenswrapper[4885]: I0308 20:37:02.819007 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:37:02 crc kubenswrapper[4885]: I0308 20:37:02.819208 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:37:02 crc kubenswrapper[4885]: I0308 20:37:02.821098 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:37:02 crc kubenswrapper[4885]: I0308 20:37:02.821273 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" gracePeriod=600 Mar 08 20:37:03 crc kubenswrapper[4885]: E0308 20:37:03.020609 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:37:03 crc kubenswrapper[4885]: I0308 20:37:03.442016 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" exitCode=0 Mar 08 20:37:03 crc kubenswrapper[4885]: I0308 20:37:03.442077 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4"} Mar 08 20:37:03 crc kubenswrapper[4885]: I0308 20:37:03.442126 4885 scope.go:117] "RemoveContainer" containerID="d25a5af4779cf4a9087bc38f1595551232392f2b22b234a85d5d3906024eb796" Mar 08 20:37:03 crc kubenswrapper[4885]: I0308 20:37:03.444234 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:37:03 crc kubenswrapper[4885]: E0308 20:37:03.444817 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:37:18 crc kubenswrapper[4885]: I0308 20:37:18.368603 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:37:18 crc kubenswrapper[4885]: E0308 20:37:18.369322 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:37:31 crc kubenswrapper[4885]: I0308 
20:37:31.370080 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:37:31 crc kubenswrapper[4885]: E0308 20:37:31.374974 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:37:43 crc kubenswrapper[4885]: I0308 20:37:43.369027 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:37:43 crc kubenswrapper[4885]: E0308 20:37:43.369894 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:37:55 crc kubenswrapper[4885]: I0308 20:37:55.368383 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:37:55 crc kubenswrapper[4885]: E0308 20:37:55.369446 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.154155 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550038-r82w9"] Mar 08 20:38:00 crc kubenswrapper[4885]: E0308 20:38:00.154885 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ba33559-23ce-4dec-a0fb-d479e47d6f1c" containerName="oc" Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.154904 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ba33559-23ce-4dec-a0fb-d479e47d6f1c" containerName="oc" Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.155197 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ba33559-23ce-4dec-a0fb-d479e47d6f1c" containerName="oc" Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.156137 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550038-r82w9" Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.158865 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.159289 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.159348 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.174097 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550038-r82w9"] Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.331035 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swfkf\" (UniqueName: \"kubernetes.io/projected/5d4b8072-1e19-4a26-b038-2a3c6d634760-kube-api-access-swfkf\") pod \"auto-csr-approver-29550038-r82w9\" (UID: \"5d4b8072-1e19-4a26-b038-2a3c6d634760\") " pod="openshift-infra/auto-csr-approver-29550038-r82w9" Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.432577 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swfkf\" (UniqueName: \"kubernetes.io/projected/5d4b8072-1e19-4a26-b038-2a3c6d634760-kube-api-access-swfkf\") pod \"auto-csr-approver-29550038-r82w9\" (UID: \"5d4b8072-1e19-4a26-b038-2a3c6d634760\") " pod="openshift-infra/auto-csr-approver-29550038-r82w9" Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.467118 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swfkf\" (UniqueName: \"kubernetes.io/projected/5d4b8072-1e19-4a26-b038-2a3c6d634760-kube-api-access-swfkf\") pod \"auto-csr-approver-29550038-r82w9\" (UID: \"5d4b8072-1e19-4a26-b038-2a3c6d634760\") " pod="openshift-infra/auto-csr-approver-29550038-r82w9" Mar 08 20:38:00 crc kubenswrapper[4885]: I0308 20:38:00.481069 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550038-r82w9" Mar 08 20:38:01 crc kubenswrapper[4885]: I0308 20:38:01.003386 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550038-r82w9"] Mar 08 20:38:01 crc kubenswrapper[4885]: I0308 20:38:01.945518 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550038-r82w9" event={"ID":"5d4b8072-1e19-4a26-b038-2a3c6d634760","Type":"ContainerStarted","Data":"9aec129bb802cca7c4bc295b3ce196351757ed4ddafbf8261cbec14420051973"} Mar 08 20:38:02 crc kubenswrapper[4885]: I0308 20:38:02.954201 4885 generic.go:334] "Generic (PLEG): container finished" podID="5d4b8072-1e19-4a26-b038-2a3c6d634760" containerID="dce1374ad69b07a0e467431de66db375a54313b87f63c7ea37e12c9eb571e627" exitCode=0 Mar 08 20:38:02 crc kubenswrapper[4885]: I0308 20:38:02.954263 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550038-r82w9" event={"ID":"5d4b8072-1e19-4a26-b038-2a3c6d634760","Type":"ContainerDied","Data":"dce1374ad69b07a0e467431de66db375a54313b87f63c7ea37e12c9eb571e627"} Mar 08 20:38:04 crc kubenswrapper[4885]: I0308 20:38:04.411673 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550038-r82w9" Mar 08 20:38:04 crc kubenswrapper[4885]: I0308 20:38:04.602836 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swfkf\" (UniqueName: \"kubernetes.io/projected/5d4b8072-1e19-4a26-b038-2a3c6d634760-kube-api-access-swfkf\") pod \"5d4b8072-1e19-4a26-b038-2a3c6d634760\" (UID: \"5d4b8072-1e19-4a26-b038-2a3c6d634760\") " Mar 08 20:38:04 crc kubenswrapper[4885]: I0308 20:38:04.611367 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d4b8072-1e19-4a26-b038-2a3c6d634760-kube-api-access-swfkf" (OuterVolumeSpecName: "kube-api-access-swfkf") pod "5d4b8072-1e19-4a26-b038-2a3c6d634760" (UID: "5d4b8072-1e19-4a26-b038-2a3c6d634760"). InnerVolumeSpecName "kube-api-access-swfkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:38:04 crc kubenswrapper[4885]: I0308 20:38:04.704965 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swfkf\" (UniqueName: \"kubernetes.io/projected/5d4b8072-1e19-4a26-b038-2a3c6d634760-kube-api-access-swfkf\") on node \"crc\" DevicePath \"\"" Mar 08 20:38:04 crc kubenswrapper[4885]: I0308 20:38:04.972983 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550038-r82w9" event={"ID":"5d4b8072-1e19-4a26-b038-2a3c6d634760","Type":"ContainerDied","Data":"9aec129bb802cca7c4bc295b3ce196351757ed4ddafbf8261cbec14420051973"} Mar 08 20:38:04 crc kubenswrapper[4885]: I0308 20:38:04.973038 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9aec129bb802cca7c4bc295b3ce196351757ed4ddafbf8261cbec14420051973" Mar 08 20:38:04 crc kubenswrapper[4885]: I0308 20:38:04.973119 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550038-r82w9" Mar 08 20:38:05 crc kubenswrapper[4885]: I0308 20:38:05.508734 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550032-rvpv6"] Mar 08 20:38:05 crc kubenswrapper[4885]: I0308 20:38:05.519180 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550032-rvpv6"] Mar 08 20:38:07 crc kubenswrapper[4885]: I0308 20:38:07.385206 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9ba305f-b091-419d-ba98-701437bceab1" path="/var/lib/kubelet/pods/d9ba305f-b091-419d-ba98-701437bceab1/volumes" Mar 08 20:38:09 crc kubenswrapper[4885]: I0308 20:38:09.378744 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:38:09 crc kubenswrapper[4885]: E0308 20:38:09.379627 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:38:21 crc kubenswrapper[4885]: I0308 20:38:21.368256 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:38:21 crc kubenswrapper[4885]: E0308 20:38:21.369036 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:38:36 crc kubenswrapper[4885]: I0308 20:38:36.368043 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:38:36 crc kubenswrapper[4885]: E0308 20:38:36.369832 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.468476 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hd4x5"] Mar 08 20:38:41 crc kubenswrapper[4885]: E0308 20:38:41.469356 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d4b8072-1e19-4a26-b038-2a3c6d634760" containerName="oc" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.469372 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d4b8072-1e19-4a26-b038-2a3c6d634760" containerName="oc" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.469531 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d4b8072-1e19-4a26-b038-2a3c6d634760" containerName="oc" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.470723 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.494770 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hd4x5"] Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.639304 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkr6r\" (UniqueName: \"kubernetes.io/projected/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-kube-api-access-rkr6r\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.639416 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-utilities\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.639456 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-catalog-content\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.741026 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-catalog-content\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.741124 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkr6r\" (UniqueName: \"kubernetes.io/projected/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-kube-api-access-rkr6r\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.741173 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-utilities\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.741671 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-catalog-content\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.741694 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-utilities\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.765279 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rkr6r\" (UniqueName: \"kubernetes.io/projected/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-kube-api-access-rkr6r\") pod \"community-operators-hd4x5\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:41 crc kubenswrapper[4885]: I0308 20:38:41.810062 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:42 crc kubenswrapper[4885]: I0308 20:38:42.313435 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hd4x5"] Mar 08 20:38:42 crc kubenswrapper[4885]: W0308 20:38:42.319375 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d8f1f01_8c81_4823_8ef7_85a09c6b6363.slice/crio-37dc900768c50f23591c69e25613183edcd7a4d2ecb14b71b92b4ce3c88eeb04 WatchSource:0}: Error finding container 37dc900768c50f23591c69e25613183edcd7a4d2ecb14b71b92b4ce3c88eeb04: Status 404 returned error can't find the container with id 37dc900768c50f23591c69e25613183edcd7a4d2ecb14b71b92b4ce3c88eeb04 Mar 08 20:38:42 crc kubenswrapper[4885]: I0308 20:38:42.534297 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd4x5" event={"ID":"8d8f1f01-8c81-4823-8ef7-85a09c6b6363","Type":"ContainerStarted","Data":"37dc900768c50f23591c69e25613183edcd7a4d2ecb14b71b92b4ce3c88eeb04"} Mar 08 20:38:43 crc kubenswrapper[4885]: I0308 20:38:43.551496 4885 generic.go:334] "Generic (PLEG): container finished" podID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerID="e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff" exitCode=0 Mar 08 20:38:43 crc kubenswrapper[4885]: I0308 20:38:43.551575 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd4x5" event={"ID":"8d8f1f01-8c81-4823-8ef7-85a09c6b6363","Type":"ContainerDied","Data":"e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff"} Mar 08 20:38:44 crc kubenswrapper[4885]: I0308 20:38:44.560057 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd4x5" event={"ID":"8d8f1f01-8c81-4823-8ef7-85a09c6b6363","Type":"ContainerStarted","Data":"2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd"} Mar 08 20:38:45 crc kubenswrapper[4885]: I0308 20:38:45.571110 4885 generic.go:334] "Generic (PLEG): container finished" podID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerID="2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd" exitCode=0 Mar 08 20:38:45 crc kubenswrapper[4885]: I0308 20:38:45.571163 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd4x5" event={"ID":"8d8f1f01-8c81-4823-8ef7-85a09c6b6363","Type":"ContainerDied","Data":"2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd"} Mar 08 20:38:46 crc kubenswrapper[4885]: I0308 20:38:46.582486 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd4x5" event={"ID":"8d8f1f01-8c81-4823-8ef7-85a09c6b6363","Type":"ContainerStarted","Data":"16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c"} Mar 08 20:38:46 crc kubenswrapper[4885]: I0308 20:38:46.605580 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hd4x5" 
podStartSLOduration=2.979132731 podStartE2EDuration="5.605557525s" podCreationTimestamp="2026-03-08 20:38:41 +0000 UTC" firstStartedPulling="2026-03-08 20:38:43.554839521 +0000 UTC m=+4024.950893554" lastFinishedPulling="2026-03-08 20:38:46.181264315 +0000 UTC m=+4027.577318348" observedRunningTime="2026-03-08 20:38:46.603370796 +0000 UTC m=+4027.999424819" watchObservedRunningTime="2026-03-08 20:38:46.605557525 +0000 UTC m=+4028.001611588" Mar 08 20:38:50 crc kubenswrapper[4885]: I0308 20:38:50.368991 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:38:50 crc kubenswrapper[4885]: E0308 20:38:50.369532 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:38:51 crc kubenswrapper[4885]: I0308 20:38:51.810406 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:51 crc kubenswrapper[4885]: I0308 20:38:51.810504 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:51 crc kubenswrapper[4885]: I0308 20:38:51.888044 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:52 crc kubenswrapper[4885]: I0308 20:38:52.706771 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:52 crc kubenswrapper[4885]: I0308 20:38:52.786550 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hd4x5"] Mar 08 20:38:54 crc kubenswrapper[4885]: I0308 20:38:54.649671 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hd4x5" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="registry-server" containerID="cri-o://16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c" gracePeriod=2 Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.650759 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.657998 4885 generic.go:334] "Generic (PLEG): container finished" podID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerID="16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c" exitCode=0 Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.658044 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd4x5" event={"ID":"8d8f1f01-8c81-4823-8ef7-85a09c6b6363","Type":"ContainerDied","Data":"16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c"} Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.658086 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hd4x5" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.658106 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hd4x5" event={"ID":"8d8f1f01-8c81-4823-8ef7-85a09c6b6363","Type":"ContainerDied","Data":"37dc900768c50f23591c69e25613183edcd7a4d2ecb14b71b92b4ce3c88eeb04"} Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.658129 4885 scope.go:117] "RemoveContainer" containerID="16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.664679 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkr6r\" (UniqueName: \"kubernetes.io/projected/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-kube-api-access-rkr6r\") pod \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.664750 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-utilities\") pod \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.664801 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-catalog-content\") pod \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\" (UID: \"8d8f1f01-8c81-4823-8ef7-85a09c6b6363\") " Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.668103 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-utilities" (OuterVolumeSpecName: "utilities") pod "8d8f1f01-8c81-4823-8ef7-85a09c6b6363" (UID: "8d8f1f01-8c81-4823-8ef7-85a09c6b6363"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.672009 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-kube-api-access-rkr6r" (OuterVolumeSpecName: "kube-api-access-rkr6r") pod "8d8f1f01-8c81-4823-8ef7-85a09c6b6363" (UID: "8d8f1f01-8c81-4823-8ef7-85a09c6b6363"). InnerVolumeSpecName "kube-api-access-rkr6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.702178 4885 scope.go:117] "RemoveContainer" containerID="2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.718758 4885 scope.go:117] "RemoveContainer" containerID="e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.742866 4885 scope.go:117] "RemoveContainer" containerID="16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c" Mar 08 20:38:55 crc kubenswrapper[4885]: E0308 20:38:55.743298 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c\": container with ID starting with 16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c not found: ID does not exist" containerID="16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.743327 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c"} err="failed to get container status \"16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c\": rpc error: code = NotFound desc = could not find container \"16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c\": container with ID starting with 16884aca492ae5023f2ab7da3f3f73da80caf3754408621e1812f95b79480e1c not found: ID does not exist" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.743346 4885 scope.go:117] "RemoveContainer" containerID="2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd" Mar 08 20:38:55 crc kubenswrapper[4885]: E0308 20:38:55.743657 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd\": container with ID starting with 2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd not found: ID does not exist" containerID="2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.743683 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd"} err="failed to get container status \"2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd\": rpc error: code = NotFound desc = could not find container \"2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd\": container with ID starting with 2f8aa6897e9ca9a3ce2c73d781241212d505375b5f209a4739ecd920d040c1bd not found: ID does not exist" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.743696 4885 scope.go:117] "RemoveContainer" containerID="e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.743718 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d8f1f01-8c81-4823-8ef7-85a09c6b6363" (UID: "8d8f1f01-8c81-4823-8ef7-85a09c6b6363"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:38:55 crc kubenswrapper[4885]: E0308 20:38:55.744230 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff\": container with ID starting with e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff not found: ID does not exist" containerID="e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.744276 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff"} err="failed to get container status \"e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff\": rpc error: code = NotFound desc = could not find container \"e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff\": container with ID starting with e8977b792e7ed092ac37fcbe33a79b49c9261ccc4543ebceab1f356288970bff not found: ID does not exist" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.765890 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.765943 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkr6r\" (UniqueName: \"kubernetes.io/projected/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-kube-api-access-rkr6r\") on node \"crc\" DevicePath \"\"" Mar 08 20:38:55 crc kubenswrapper[4885]: I0308 20:38:55.765958 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d8f1f01-8c81-4823-8ef7-85a09c6b6363-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:38:56 crc kubenswrapper[4885]: I0308 20:38:56.010051 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hd4x5"] Mar 08 20:38:56 crc kubenswrapper[4885]: I0308 20:38:56.016792 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hd4x5"] Mar 08 20:38:57 crc kubenswrapper[4885]: I0308 20:38:57.381053 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" path="/var/lib/kubelet/pods/8d8f1f01-8c81-4823-8ef7-85a09c6b6363/volumes" Mar 08 20:39:02 crc kubenswrapper[4885]: I0308 20:39:02.356611 4885 scope.go:117] "RemoveContainer" containerID="bdf74962d126ba2f43f277a948cdef5d47d9a79c0f03133bc1218e4128ca8e51" Mar 08 20:39:02 crc kubenswrapper[4885]: I0308 20:39:02.368832 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:39:02 crc kubenswrapper[4885]: E0308 20:39:02.369459 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:39:16 crc kubenswrapper[4885]: I0308 20:39:16.369244 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 
08 20:39:16 crc kubenswrapper[4885]: E0308 20:39:16.370291 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:39:29 crc kubenswrapper[4885]: I0308 20:39:29.374868 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:39:29 crc kubenswrapper[4885]: E0308 20:39:29.375834 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:39:44 crc kubenswrapper[4885]: I0308 20:39:44.368776 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:39:44 crc kubenswrapper[4885]: E0308 20:39:44.370746 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:39:58 crc kubenswrapper[4885]: I0308 20:39:58.369214 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:39:58 crc kubenswrapper[4885]: E0308 20:39:58.372330 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.147919 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550040-n7d55"] Mar 08 20:40:00 crc kubenswrapper[4885]: E0308 20:40:00.148190 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="extract-content" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.148202 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="extract-content" Mar 08 20:40:00 crc kubenswrapper[4885]: E0308 20:40:00.148213 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="extract-utilities" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.148219 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="extract-utilities" Mar 08 20:40:00 crc kubenswrapper[4885]: E0308 20:40:00.148244 4885 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="registry-server" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.148251 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="registry-server" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.148367 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d8f1f01-8c81-4823-8ef7-85a09c6b6363" containerName="registry-server" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.148768 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550040-n7d55" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.153453 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.163538 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.163786 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.167672 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550040-n7d55"] Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.308551 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vttcs\" (UniqueName: \"kubernetes.io/projected/f06a86c2-3f71-42f0-8f33-558ffba8e527-kube-api-access-vttcs\") pod \"auto-csr-approver-29550040-n7d55\" (UID: \"f06a86c2-3f71-42f0-8f33-558ffba8e527\") " pod="openshift-infra/auto-csr-approver-29550040-n7d55" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.410065 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vttcs\" (UniqueName: \"kubernetes.io/projected/f06a86c2-3f71-42f0-8f33-558ffba8e527-kube-api-access-vttcs\") pod \"auto-csr-approver-29550040-n7d55\" (UID: \"f06a86c2-3f71-42f0-8f33-558ffba8e527\") " pod="openshift-infra/auto-csr-approver-29550040-n7d55" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.430228 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vttcs\" (UniqueName: \"kubernetes.io/projected/f06a86c2-3f71-42f0-8f33-558ffba8e527-kube-api-access-vttcs\") pod \"auto-csr-approver-29550040-n7d55\" (UID: \"f06a86c2-3f71-42f0-8f33-558ffba8e527\") " pod="openshift-infra/auto-csr-approver-29550040-n7d55" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.481767 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550040-n7d55" Mar 08 20:40:00 crc kubenswrapper[4885]: I0308 20:40:00.816705 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550040-n7d55"] Mar 08 20:40:01 crc kubenswrapper[4885]: I0308 20:40:01.173125 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:40:01 crc kubenswrapper[4885]: I0308 20:40:01.294004 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550040-n7d55" event={"ID":"f06a86c2-3f71-42f0-8f33-558ffba8e527","Type":"ContainerStarted","Data":"6a5ae7dc153802abf9af020cc9f0aebaff2d895024e66d13bba8161078742691"} Mar 08 20:40:04 crc kubenswrapper[4885]: I0308 20:40:04.324597 4885 generic.go:334] "Generic (PLEG): container finished" podID="f06a86c2-3f71-42f0-8f33-558ffba8e527" containerID="cd2399263f696f150807aac189b8392d7375833b5910349cae536de7fbfd333a" exitCode=0 Mar 08 20:40:04 crc kubenswrapper[4885]: I0308 20:40:04.324669 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550040-n7d55" event={"ID":"f06a86c2-3f71-42f0-8f33-558ffba8e527","Type":"ContainerDied","Data":"cd2399263f696f150807aac189b8392d7375833b5910349cae536de7fbfd333a"} Mar 08 20:40:05 crc kubenswrapper[4885]: I0308 20:40:05.744009 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550040-n7d55" Mar 08 20:40:05 crc kubenswrapper[4885]: I0308 20:40:05.795724 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vttcs\" (UniqueName: \"kubernetes.io/projected/f06a86c2-3f71-42f0-8f33-558ffba8e527-kube-api-access-vttcs\") pod \"f06a86c2-3f71-42f0-8f33-558ffba8e527\" (UID: \"f06a86c2-3f71-42f0-8f33-558ffba8e527\") " Mar 08 20:40:05 crc kubenswrapper[4885]: I0308 20:40:05.803870 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f06a86c2-3f71-42f0-8f33-558ffba8e527-kube-api-access-vttcs" (OuterVolumeSpecName: "kube-api-access-vttcs") pod "f06a86c2-3f71-42f0-8f33-558ffba8e527" (UID: "f06a86c2-3f71-42f0-8f33-558ffba8e527"). InnerVolumeSpecName "kube-api-access-vttcs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:40:05 crc kubenswrapper[4885]: I0308 20:40:05.897400 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vttcs\" (UniqueName: \"kubernetes.io/projected/f06a86c2-3f71-42f0-8f33-558ffba8e527-kube-api-access-vttcs\") on node \"crc\" DevicePath \"\"" Mar 08 20:40:06 crc kubenswrapper[4885]: I0308 20:40:06.345443 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550040-n7d55" event={"ID":"f06a86c2-3f71-42f0-8f33-558ffba8e527","Type":"ContainerDied","Data":"6a5ae7dc153802abf9af020cc9f0aebaff2d895024e66d13bba8161078742691"} Mar 08 20:40:06 crc kubenswrapper[4885]: I0308 20:40:06.345833 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a5ae7dc153802abf9af020cc9f0aebaff2d895024e66d13bba8161078742691" Mar 08 20:40:06 crc kubenswrapper[4885]: I0308 20:40:06.346066 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550040-n7d55" Mar 08 20:40:06 crc kubenswrapper[4885]: I0308 20:40:06.837829 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550034-jl6bg"] Mar 08 20:40:06 crc kubenswrapper[4885]: I0308 20:40:06.845248 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550034-jl6bg"] Mar 08 20:40:07 crc kubenswrapper[4885]: I0308 20:40:07.385692 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d02d14a7-00be-4808-9f97-ac3c16ae727a" path="/var/lib/kubelet/pods/d02d14a7-00be-4808-9f97-ac3c16ae727a/volumes" Mar 08 20:40:13 crc kubenswrapper[4885]: I0308 20:40:13.368204 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:40:13 crc kubenswrapper[4885]: E0308 20:40:13.368810 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:40:24 crc kubenswrapper[4885]: I0308 20:40:24.368728 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:40:24 crc kubenswrapper[4885]: E0308 20:40:24.369822 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:40:36 crc kubenswrapper[4885]: I0308 20:40:36.369074 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:40:36 crc kubenswrapper[4885]: E0308 20:40:36.369981 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:40:51 crc kubenswrapper[4885]: I0308 20:40:51.369452 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:40:51 crc kubenswrapper[4885]: E0308 20:40:51.370694 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:41:02 crc kubenswrapper[4885]: I0308 20:41:02.477756 4885 scope.go:117] "RemoveContainer" containerID="4ed950f44e01488b5e84b2e6cb1b702242d1797af5d3e1be27eec2846e142c48" Mar 08 
20:41:03 crc kubenswrapper[4885]: I0308 20:41:03.369358 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:41:03 crc kubenswrapper[4885]: E0308 20:41:03.369811 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:41:15 crc kubenswrapper[4885]: I0308 20:41:15.377753 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:41:15 crc kubenswrapper[4885]: E0308 20:41:15.378687 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:41:29 crc kubenswrapper[4885]: I0308 20:41:29.371671 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:41:29 crc kubenswrapper[4885]: E0308 20:41:29.372402 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:41:40 crc kubenswrapper[4885]: I0308 20:41:40.368689 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:41:40 crc kubenswrapper[4885]: E0308 20:41:40.369915 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:41:52 crc kubenswrapper[4885]: I0308 20:41:52.368419 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:41:52 crc kubenswrapper[4885]: E0308 20:41:52.369640 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.162662 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550042-d4xpj"] Mar 08 20:42:00 crc 
kubenswrapper[4885]: E0308 20:42:00.163789 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f06a86c2-3f71-42f0-8f33-558ffba8e527" containerName="oc" Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.163815 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f06a86c2-3f71-42f0-8f33-558ffba8e527" containerName="oc" Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.164143 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f06a86c2-3f71-42f0-8f33-558ffba8e527" containerName="oc" Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.164843 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550042-d4xpj" Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.168241 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.168876 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.168881 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.181896 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jqgt\" (UniqueName: \"kubernetes.io/projected/34acfccb-db62-40e1-b46c-3227ce6e32ab-kube-api-access-7jqgt\") pod \"auto-csr-approver-29550042-d4xpj\" (UID: \"34acfccb-db62-40e1-b46c-3227ce6e32ab\") " pod="openshift-infra/auto-csr-approver-29550042-d4xpj" Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.186579 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550042-d4xpj"] Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.284214 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jqgt\" (UniqueName: \"kubernetes.io/projected/34acfccb-db62-40e1-b46c-3227ce6e32ab-kube-api-access-7jqgt\") pod \"auto-csr-approver-29550042-d4xpj\" (UID: \"34acfccb-db62-40e1-b46c-3227ce6e32ab\") " pod="openshift-infra/auto-csr-approver-29550042-d4xpj" Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.312108 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jqgt\" (UniqueName: \"kubernetes.io/projected/34acfccb-db62-40e1-b46c-3227ce6e32ab-kube-api-access-7jqgt\") pod \"auto-csr-approver-29550042-d4xpj\" (UID: \"34acfccb-db62-40e1-b46c-3227ce6e32ab\") " pod="openshift-infra/auto-csr-approver-29550042-d4xpj" Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.503569 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550042-d4xpj" Mar 08 20:42:00 crc kubenswrapper[4885]: I0308 20:42:00.769230 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550042-d4xpj"] Mar 08 20:42:01 crc kubenswrapper[4885]: I0308 20:42:01.417091 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550042-d4xpj" event={"ID":"34acfccb-db62-40e1-b46c-3227ce6e32ab","Type":"ContainerStarted","Data":"d01447b3c7e3221c3dee3e2c801f3a57b4a606171c98022c9edfe738108226cb"} Mar 08 20:42:02 crc kubenswrapper[4885]: I0308 20:42:02.426473 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550042-d4xpj" event={"ID":"34acfccb-db62-40e1-b46c-3227ce6e32ab","Type":"ContainerStarted","Data":"f37032ea52adfa6ddae3677872b806d5bfa71e165ea8806d1e82b81025d0feb8"} Mar 08 20:42:02 crc kubenswrapper[4885]: I0308 20:42:02.447804 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550042-d4xpj" podStartSLOduration=1.418156971 podStartE2EDuration="2.447786514s" podCreationTimestamp="2026-03-08 20:42:00 +0000 UTC" firstStartedPulling="2026-03-08 20:42:00.881442401 +0000 UTC m=+4222.277496484" lastFinishedPulling="2026-03-08 20:42:01.911071994 +0000 UTC m=+4223.307126027" observedRunningTime="2026-03-08 20:42:02.44170563 +0000 UTC m=+4223.837759683" watchObservedRunningTime="2026-03-08 20:42:02.447786514 +0000 UTC m=+4223.843840547" Mar 08 20:42:03 crc kubenswrapper[4885]: I0308 20:42:03.437402 4885 generic.go:334] "Generic (PLEG): container finished" podID="34acfccb-db62-40e1-b46c-3227ce6e32ab" containerID="f37032ea52adfa6ddae3677872b806d5bfa71e165ea8806d1e82b81025d0feb8" exitCode=0 Mar 08 20:42:03 crc kubenswrapper[4885]: I0308 20:42:03.437542 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550042-d4xpj" event={"ID":"34acfccb-db62-40e1-b46c-3227ce6e32ab","Type":"ContainerDied","Data":"f37032ea52adfa6ddae3677872b806d5bfa71e165ea8806d1e82b81025d0feb8"} Mar 08 20:42:04 crc kubenswrapper[4885]: I0308 20:42:04.797237 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550042-d4xpj" Mar 08 20:42:04 crc kubenswrapper[4885]: I0308 20:42:04.857242 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jqgt\" (UniqueName: \"kubernetes.io/projected/34acfccb-db62-40e1-b46c-3227ce6e32ab-kube-api-access-7jqgt\") pod \"34acfccb-db62-40e1-b46c-3227ce6e32ab\" (UID: \"34acfccb-db62-40e1-b46c-3227ce6e32ab\") " Mar 08 20:42:04 crc kubenswrapper[4885]: I0308 20:42:04.864146 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34acfccb-db62-40e1-b46c-3227ce6e32ab-kube-api-access-7jqgt" (OuterVolumeSpecName: "kube-api-access-7jqgt") pod "34acfccb-db62-40e1-b46c-3227ce6e32ab" (UID: "34acfccb-db62-40e1-b46c-3227ce6e32ab"). InnerVolumeSpecName "kube-api-access-7jqgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:42:04 crc kubenswrapper[4885]: I0308 20:42:04.960038 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jqgt\" (UniqueName: \"kubernetes.io/projected/34acfccb-db62-40e1-b46c-3227ce6e32ab-kube-api-access-7jqgt\") on node \"crc\" DevicePath \"\"" Mar 08 20:42:05 crc kubenswrapper[4885]: I0308 20:42:05.456736 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550042-d4xpj" event={"ID":"34acfccb-db62-40e1-b46c-3227ce6e32ab","Type":"ContainerDied","Data":"d01447b3c7e3221c3dee3e2c801f3a57b4a606171c98022c9edfe738108226cb"} Mar 08 20:42:05 crc kubenswrapper[4885]: I0308 20:42:05.456790 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d01447b3c7e3221c3dee3e2c801f3a57b4a606171c98022c9edfe738108226cb" Mar 08 20:42:05 crc kubenswrapper[4885]: I0308 20:42:05.456808 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550042-d4xpj" Mar 08 20:42:05 crc kubenswrapper[4885]: I0308 20:42:05.533954 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550036-mmj2c"] Mar 08 20:42:05 crc kubenswrapper[4885]: I0308 20:42:05.546071 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550036-mmj2c"] Mar 08 20:42:06 crc kubenswrapper[4885]: I0308 20:42:06.368326 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:42:07 crc kubenswrapper[4885]: I0308 20:42:07.381177 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ba33559-23ce-4dec-a0fb-d479e47d6f1c" path="/var/lib/kubelet/pods/4ba33559-23ce-4dec-a0fb-d479e47d6f1c/volumes" Mar 08 20:42:07 crc kubenswrapper[4885]: I0308 20:42:07.499702 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"e901ac3813cd54c99c26fb9995667f5096a1aafbc4c007ea9eda1c542b1947a8"} Mar 08 20:43:02 crc kubenswrapper[4885]: I0308 20:43:02.869645 4885 scope.go:117] "RemoveContainer" containerID="4655c509628eeb67ead0ef189a244c640e2a8d6513d351ea58b54fb12caa9de4" Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.174403 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550044-682qm"] Mar 08 20:44:00 crc kubenswrapper[4885]: E0308 20:44:00.175838 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34acfccb-db62-40e1-b46c-3227ce6e32ab" containerName="oc" Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.175872 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="34acfccb-db62-40e1-b46c-3227ce6e32ab" containerName="oc" Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.176242 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="34acfccb-db62-40e1-b46c-3227ce6e32ab" containerName="oc" Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.177264 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550044-682qm" Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.181670 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550044-682qm"] Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.182667 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.182966 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.183327 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.332871 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t98fw\" (UniqueName: \"kubernetes.io/projected/8cbaad9b-e652-438c-9b41-f414447382c5-kube-api-access-t98fw\") pod \"auto-csr-approver-29550044-682qm\" (UID: \"8cbaad9b-e652-438c-9b41-f414447382c5\") " pod="openshift-infra/auto-csr-approver-29550044-682qm" Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.434406 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t98fw\" (UniqueName: \"kubernetes.io/projected/8cbaad9b-e652-438c-9b41-f414447382c5-kube-api-access-t98fw\") pod \"auto-csr-approver-29550044-682qm\" (UID: \"8cbaad9b-e652-438c-9b41-f414447382c5\") " pod="openshift-infra/auto-csr-approver-29550044-682qm" Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.455945 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t98fw\" (UniqueName: \"kubernetes.io/projected/8cbaad9b-e652-438c-9b41-f414447382c5-kube-api-access-t98fw\") pod \"auto-csr-approver-29550044-682qm\" (UID: \"8cbaad9b-e652-438c-9b41-f414447382c5\") " pod="openshift-infra/auto-csr-approver-29550044-682qm" Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.506955 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550044-682qm" Mar 08 20:44:00 crc kubenswrapper[4885]: I0308 20:44:00.965842 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550044-682qm"] Mar 08 20:44:01 crc kubenswrapper[4885]: I0308 20:44:01.582800 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550044-682qm" event={"ID":"8cbaad9b-e652-438c-9b41-f414447382c5","Type":"ContainerStarted","Data":"2f4b9b024ce81457b0f9f5b00402fd0eeb7b7ae7e5bc5372f49638c5ccad9a3a"} Mar 08 20:44:02 crc kubenswrapper[4885]: I0308 20:44:02.594123 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550044-682qm" event={"ID":"8cbaad9b-e652-438c-9b41-f414447382c5","Type":"ContainerStarted","Data":"fc9b989616ec5b500765230953c308a829faaac795ceed2b8cacd49b9b6ec121"} Mar 08 20:44:02 crc kubenswrapper[4885]: I0308 20:44:02.617655 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550044-682qm" podStartSLOduration=1.451696281 podStartE2EDuration="2.617628468s" podCreationTimestamp="2026-03-08 20:44:00 +0000 UTC" firstStartedPulling="2026-03-08 20:44:00.974289652 +0000 UTC m=+4342.370343725" lastFinishedPulling="2026-03-08 20:44:02.140221849 +0000 UTC m=+4343.536275912" observedRunningTime="2026-03-08 20:44:02.616835447 +0000 UTC m=+4344.012889510" watchObservedRunningTime="2026-03-08 20:44:02.617628468 +0000 UTC m=+4344.013682531" Mar 08 20:44:03 crc kubenswrapper[4885]: I0308 20:44:03.604443 4885 generic.go:334] "Generic (PLEG): container finished" podID="8cbaad9b-e652-438c-9b41-f414447382c5" containerID="fc9b989616ec5b500765230953c308a829faaac795ceed2b8cacd49b9b6ec121" exitCode=0 Mar 08 20:44:03 crc kubenswrapper[4885]: I0308 20:44:03.604520 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550044-682qm" event={"ID":"8cbaad9b-e652-438c-9b41-f414447382c5","Type":"ContainerDied","Data":"fc9b989616ec5b500765230953c308a829faaac795ceed2b8cacd49b9b6ec121"} Mar 08 20:44:04 crc kubenswrapper[4885]: I0308 20:44:04.960828 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550044-682qm" Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.134656 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t98fw\" (UniqueName: \"kubernetes.io/projected/8cbaad9b-e652-438c-9b41-f414447382c5-kube-api-access-t98fw\") pod \"8cbaad9b-e652-438c-9b41-f414447382c5\" (UID: \"8cbaad9b-e652-438c-9b41-f414447382c5\") " Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.142438 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cbaad9b-e652-438c-9b41-f414447382c5-kube-api-access-t98fw" (OuterVolumeSpecName: "kube-api-access-t98fw") pod "8cbaad9b-e652-438c-9b41-f414447382c5" (UID: "8cbaad9b-e652-438c-9b41-f414447382c5"). InnerVolumeSpecName "kube-api-access-t98fw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.236671 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t98fw\" (UniqueName: \"kubernetes.io/projected/8cbaad9b-e652-438c-9b41-f414447382c5-kube-api-access-t98fw\") on node \"crc\" DevicePath \"\"" Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.626677 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550044-682qm" event={"ID":"8cbaad9b-e652-438c-9b41-f414447382c5","Type":"ContainerDied","Data":"2f4b9b024ce81457b0f9f5b00402fd0eeb7b7ae7e5bc5372f49638c5ccad9a3a"} Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.626714 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f4b9b024ce81457b0f9f5b00402fd0eeb7b7ae7e5bc5372f49638c5ccad9a3a" Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.626782 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550044-682qm" Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.720435 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550038-r82w9"] Mar 08 20:44:05 crc kubenswrapper[4885]: I0308 20:44:05.728652 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550038-r82w9"] Mar 08 20:44:07 crc kubenswrapper[4885]: I0308 20:44:07.386183 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d4b8072-1e19-4a26-b038-2a3c6d634760" path="/var/lib/kubelet/pods/5d4b8072-1e19-4a26-b038-2a3c6d634760/volumes" Mar 08 20:44:32 crc kubenswrapper[4885]: I0308 20:44:32.818459 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:44:32 crc kubenswrapper[4885]: I0308 20:44:32.821319 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.180896 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"] Mar 08 20:45:00 crc kubenswrapper[4885]: E0308 20:45:00.182833 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cbaad9b-e652-438c-9b41-f414447382c5" containerName="oc" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.182873 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cbaad9b-e652-438c-9b41-f414447382c5" containerName="oc" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.183214 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cbaad9b-e652-438c-9b41-f414447382c5" containerName="oc" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.184139 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.190970 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.191253 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.242656 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"] Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.263748 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/414af8b3-3809-477a-a110-9acaf82a7a3b-secret-volume\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.264031 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8m8q\" (UniqueName: \"kubernetes.io/projected/414af8b3-3809-477a-a110-9acaf82a7a3b-kube-api-access-j8m8q\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.264055 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/414af8b3-3809-477a-a110-9acaf82a7a3b-config-volume\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.365458 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8m8q\" (UniqueName: \"kubernetes.io/projected/414af8b3-3809-477a-a110-9acaf82a7a3b-kube-api-access-j8m8q\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.365517 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/414af8b3-3809-477a-a110-9acaf82a7a3b-config-volume\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.365583 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/414af8b3-3809-477a-a110-9acaf82a7a3b-secret-volume\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.367174 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/414af8b3-3809-477a-a110-9acaf82a7a3b-config-volume\") pod 
\"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.372342 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/414af8b3-3809-477a-a110-9acaf82a7a3b-secret-volume\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.393027 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8m8q\" (UniqueName: \"kubernetes.io/projected/414af8b3-3809-477a-a110-9acaf82a7a3b-kube-api-access-j8m8q\") pod \"collect-profiles-29550045-jnm6r\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" Mar 08 20:45:00 crc kubenswrapper[4885]: I0308 20:45:00.542774 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" Mar 08 20:45:01 crc kubenswrapper[4885]: I0308 20:45:01.058525 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"] Mar 08 20:45:01 crc kubenswrapper[4885]: I0308 20:45:01.178017 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" event={"ID":"414af8b3-3809-477a-a110-9acaf82a7a3b","Type":"ContainerStarted","Data":"8832d09ad306d4bc97a3537f40f92acf02565c06ec3f87254517f9c5fcc5af75"} Mar 08 20:45:02 crc kubenswrapper[4885]: I0308 20:45:02.198295 4885 generic.go:334] "Generic (PLEG): container finished" podID="414af8b3-3809-477a-a110-9acaf82a7a3b" containerID="62ad3a335e07200b3e1dfc3daa3934ac465add0682b6dca882716bb449686e0a" exitCode=0 Mar 08 20:45:02 crc kubenswrapper[4885]: I0308 20:45:02.198536 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" event={"ID":"414af8b3-3809-477a-a110-9acaf82a7a3b","Type":"ContainerDied","Data":"62ad3a335e07200b3e1dfc3daa3934ac465add0682b6dca882716bb449686e0a"} Mar 08 20:45:02 crc kubenswrapper[4885]: I0308 20:45:02.843993 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:45:02 crc kubenswrapper[4885]: I0308 20:45:02.844076 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.001597 4885 scope.go:117] "RemoveContainer" containerID="dce1374ad69b07a0e467431de66db375a54313b87f63c7ea37e12c9eb571e627" Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.549306 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.623708 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/414af8b3-3809-477a-a110-9acaf82a7a3b-config-volume\") pod \"414af8b3-3809-477a-a110-9acaf82a7a3b\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.623761 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8m8q\" (UniqueName: \"kubernetes.io/projected/414af8b3-3809-477a-a110-9acaf82a7a3b-kube-api-access-j8m8q\") pod \"414af8b3-3809-477a-a110-9acaf82a7a3b\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.624026 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/414af8b3-3809-477a-a110-9acaf82a7a3b-secret-volume\") pod \"414af8b3-3809-477a-a110-9acaf82a7a3b\" (UID: \"414af8b3-3809-477a-a110-9acaf82a7a3b\") " Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.624466 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/414af8b3-3809-477a-a110-9acaf82a7a3b-config-volume" (OuterVolumeSpecName: "config-volume") pod "414af8b3-3809-477a-a110-9acaf82a7a3b" (UID: "414af8b3-3809-477a-a110-9acaf82a7a3b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.629476 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/414af8b3-3809-477a-a110-9acaf82a7a3b-kube-api-access-j8m8q" (OuterVolumeSpecName: "kube-api-access-j8m8q") pod "414af8b3-3809-477a-a110-9acaf82a7a3b" (UID: "414af8b3-3809-477a-a110-9acaf82a7a3b"). InnerVolumeSpecName "kube-api-access-j8m8q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.630362 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/414af8b3-3809-477a-a110-9acaf82a7a3b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "414af8b3-3809-477a-a110-9acaf82a7a3b" (UID: "414af8b3-3809-477a-a110-9acaf82a7a3b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.724833 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/414af8b3-3809-477a-a110-9acaf82a7a3b-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.724869 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/414af8b3-3809-477a-a110-9acaf82a7a3b-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 20:45:03 crc kubenswrapper[4885]: I0308 20:45:03.724883 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8m8q\" (UniqueName: \"kubernetes.io/projected/414af8b3-3809-477a-a110-9acaf82a7a3b-kube-api-access-j8m8q\") on node \"crc\" DevicePath \"\"" Mar 08 20:45:04 crc kubenswrapper[4885]: I0308 20:45:04.218497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" event={"ID":"414af8b3-3809-477a-a110-9acaf82a7a3b","Type":"ContainerDied","Data":"8832d09ad306d4bc97a3537f40f92acf02565c06ec3f87254517f9c5fcc5af75"} Mar 08 20:45:04 crc kubenswrapper[4885]: I0308 20:45:04.218557 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8832d09ad306d4bc97a3537f40f92acf02565c06ec3f87254517f9c5fcc5af75" Mar 08 20:45:04 crc kubenswrapper[4885]: I0308 20:45:04.218613 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r" Mar 08 20:45:04 crc kubenswrapper[4885]: I0308 20:45:04.651447 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll"] Mar 08 20:45:04 crc kubenswrapper[4885]: I0308 20:45:04.663282 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550000-m9vll"] Mar 08 20:45:05 crc kubenswrapper[4885]: I0308 20:45:05.382328 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0336d864-07a3-41ed-9327-8a39d16d667f" path="/var/lib/kubelet/pods/0336d864-07a3-41ed-9327-8a39d16d667f/volumes" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.000381 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-knn77"] Mar 08 20:45:32 crc kubenswrapper[4885]: E0308 20:45:32.001529 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="414af8b3-3809-477a-a110-9acaf82a7a3b" containerName="collect-profiles" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.001551 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="414af8b3-3809-477a-a110-9acaf82a7a3b" containerName="collect-profiles" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.001819 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="414af8b3-3809-477a-a110-9acaf82a7a3b" containerName="collect-profiles" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.003672 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.009715 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-knn77"] Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.115908 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xsxg\" (UniqueName: \"kubernetes.io/projected/8135c179-1825-4687-93d5-8498573a991c-kube-api-access-6xsxg\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.116071 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-catalog-content\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.116277 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-utilities\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.218382 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-utilities\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.218689 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xsxg\" (UniqueName: \"kubernetes.io/projected/8135c179-1825-4687-93d5-8498573a991c-kube-api-access-6xsxg\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.218768 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-catalog-content\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.218987 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-utilities\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.219387 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-catalog-content\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.251573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6xsxg\" (UniqueName: \"kubernetes.io/projected/8135c179-1825-4687-93d5-8498573a991c-kube-api-access-6xsxg\") pod \"redhat-operators-knn77\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.338878 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.795064 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-knn77"] Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.818872 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.818950 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.819011 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.819616 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e901ac3813cd54c99c26fb9995667f5096a1aafbc4c007ea9eda1c542b1947a8"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:45:32 crc kubenswrapper[4885]: I0308 20:45:32.819690 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://e901ac3813cd54c99c26fb9995667f5096a1aafbc4c007ea9eda1c542b1947a8" gracePeriod=600 Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 20:45:33.483322 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="e901ac3813cd54c99c26fb9995667f5096a1aafbc4c007ea9eda1c542b1947a8" exitCode=0 Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 20:45:33.483418 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"e901ac3813cd54c99c26fb9995667f5096a1aafbc4c007ea9eda1c542b1947a8"} Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 20:45:33.483708 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6"} Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 20:45:33.483729 4885 scope.go:117] "RemoveContainer" containerID="4334c93d948de31823fbe66392660f04fe08e163869595a5cb6103ea88216cb4" Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 
20:45:33.487438 4885 generic.go:334] "Generic (PLEG): container finished" podID="8135c179-1825-4687-93d5-8498573a991c" containerID="34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778" exitCode=0 Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 20:45:33.487574 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knn77" event={"ID":"8135c179-1825-4687-93d5-8498573a991c","Type":"ContainerDied","Data":"34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778"} Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 20:45:33.487697 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knn77" event={"ID":"8135c179-1825-4687-93d5-8498573a991c","Type":"ContainerStarted","Data":"59098b090e1126b0e7d7b18062b58caa8e180d7c1dec6b100ccbb38d0edcbeef"} Mar 08 20:45:33 crc kubenswrapper[4885]: I0308 20:45:33.489711 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:45:35 crc kubenswrapper[4885]: I0308 20:45:35.512277 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knn77" event={"ID":"8135c179-1825-4687-93d5-8498573a991c","Type":"ContainerStarted","Data":"146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729"} Mar 08 20:45:36 crc kubenswrapper[4885]: I0308 20:45:36.523277 4885 generic.go:334] "Generic (PLEG): container finished" podID="8135c179-1825-4687-93d5-8498573a991c" containerID="146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729" exitCode=0 Mar 08 20:45:36 crc kubenswrapper[4885]: I0308 20:45:36.523341 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knn77" event={"ID":"8135c179-1825-4687-93d5-8498573a991c","Type":"ContainerDied","Data":"146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729"} Mar 08 20:45:37 crc kubenswrapper[4885]: I0308 20:45:37.539135 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knn77" event={"ID":"8135c179-1825-4687-93d5-8498573a991c","Type":"ContainerStarted","Data":"d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966"} Mar 08 20:45:37 crc kubenswrapper[4885]: I0308 20:45:37.577747 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-knn77" podStartSLOduration=3.172405744 podStartE2EDuration="6.577720829s" podCreationTimestamp="2026-03-08 20:45:31 +0000 UTC" firstStartedPulling="2026-03-08 20:45:33.489443474 +0000 UTC m=+4434.885497507" lastFinishedPulling="2026-03-08 20:45:36.894758529 +0000 UTC m=+4438.290812592" observedRunningTime="2026-03-08 20:45:37.566399226 +0000 UTC m=+4438.962453319" watchObservedRunningTime="2026-03-08 20:45:37.577720829 +0000 UTC m=+4438.973774862" Mar 08 20:45:42 crc kubenswrapper[4885]: I0308 20:45:42.339651 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:42 crc kubenswrapper[4885]: I0308 20:45:42.340261 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:43 crc kubenswrapper[4885]: I0308 20:45:43.417180 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-knn77" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="registry-server" probeResult="failure" output=< Mar 08 20:45:43 crc kubenswrapper[4885]: 
timeout: failed to connect service ":50051" within 1s Mar 08 20:45:43 crc kubenswrapper[4885]: > Mar 08 20:45:52 crc kubenswrapper[4885]: I0308 20:45:52.428593 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:52 crc kubenswrapper[4885]: I0308 20:45:52.509968 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:52 crc kubenswrapper[4885]: I0308 20:45:52.685978 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-knn77"] Mar 08 20:45:53 crc kubenswrapper[4885]: I0308 20:45:53.701710 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-knn77" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="registry-server" containerID="cri-o://d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966" gracePeriod=2 Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.243450 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.433666 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-catalog-content\") pod \"8135c179-1825-4687-93d5-8498573a991c\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.433886 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-utilities\") pod \"8135c179-1825-4687-93d5-8498573a991c\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.434335 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xsxg\" (UniqueName: \"kubernetes.io/projected/8135c179-1825-4687-93d5-8498573a991c-kube-api-access-6xsxg\") pod \"8135c179-1825-4687-93d5-8498573a991c\" (UID: \"8135c179-1825-4687-93d5-8498573a991c\") " Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.435852 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-utilities" (OuterVolumeSpecName: "utilities") pod "8135c179-1825-4687-93d5-8498573a991c" (UID: "8135c179-1825-4687-93d5-8498573a991c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.447161 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8135c179-1825-4687-93d5-8498573a991c-kube-api-access-6xsxg" (OuterVolumeSpecName: "kube-api-access-6xsxg") pod "8135c179-1825-4687-93d5-8498573a991c" (UID: "8135c179-1825-4687-93d5-8498573a991c"). InnerVolumeSpecName "kube-api-access-6xsxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.537305 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xsxg\" (UniqueName: \"kubernetes.io/projected/8135c179-1825-4687-93d5-8498573a991c-kube-api-access-6xsxg\") on node \"crc\" DevicePath \"\"" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.537353 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.611524 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8135c179-1825-4687-93d5-8498573a991c" (UID: "8135c179-1825-4687-93d5-8498573a991c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.639120 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8135c179-1825-4687-93d5-8498573a991c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.716706 4885 generic.go:334] "Generic (PLEG): container finished" podID="8135c179-1825-4687-93d5-8498573a991c" containerID="d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966" exitCode=0 Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.716735 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-knn77" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.716773 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knn77" event={"ID":"8135c179-1825-4687-93d5-8498573a991c","Type":"ContainerDied","Data":"d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966"} Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.716857 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-knn77" event={"ID":"8135c179-1825-4687-93d5-8498573a991c","Type":"ContainerDied","Data":"59098b090e1126b0e7d7b18062b58caa8e180d7c1dec6b100ccbb38d0edcbeef"} Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.716892 4885 scope.go:117] "RemoveContainer" containerID="d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.748812 4885 scope.go:117] "RemoveContainer" containerID="146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.781200 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-knn77"] Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.793953 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-knn77"] Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.798852 4885 scope.go:117] "RemoveContainer" containerID="34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.832221 4885 scope.go:117] "RemoveContainer" containerID="d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966" Mar 08 20:45:54 crc kubenswrapper[4885]: E0308 20:45:54.832986 4885 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966\": container with ID starting with d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966 not found: ID does not exist" containerID="d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.833065 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966"} err="failed to get container status \"d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966\": rpc error: code = NotFound desc = could not find container \"d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966\": container with ID starting with d843dbc83341d0bfb51e479341133853a468e7b36b523f3e1429cdb4a3a52966 not found: ID does not exist" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.833114 4885 scope.go:117] "RemoveContainer" containerID="146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729" Mar 08 20:45:54 crc kubenswrapper[4885]: E0308 20:45:54.833701 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729\": container with ID starting with 146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729 not found: ID does not exist" containerID="146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.833776 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729"} err="failed to get container status \"146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729\": rpc error: code = NotFound desc = could not find container \"146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729\": container with ID starting with 146aeba5ac06e3cd5880aa3a0c53da2d04b17efce1aaafa6dddcf3fb0c672729 not found: ID does not exist" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.833838 4885 scope.go:117] "RemoveContainer" containerID="34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778" Mar 08 20:45:54 crc kubenswrapper[4885]: E0308 20:45:54.834456 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778\": container with ID starting with 34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778 not found: ID does not exist" containerID="34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.834510 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778"} err="failed to get container status \"34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778\": rpc error: code = NotFound desc = could not find container \"34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778\": container with ID starting with 34729dbb9626809f31faa3d65455df74773e3f67e8791fcdde1b008231988778 not found: ID does not exist" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.898211 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-t78sc"] Mar 08 20:45:54 crc kubenswrapper[4885]: E0308 20:45:54.899060 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="extract-content" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.899097 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="extract-content" Mar 08 20:45:54 crc kubenswrapper[4885]: E0308 20:45:54.899186 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="registry-server" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.899202 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="registry-server" Mar 08 20:45:54 crc kubenswrapper[4885]: E0308 20:45:54.899257 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="extract-utilities" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.899273 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="extract-utilities" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.899851 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8135c179-1825-4687-93d5-8498573a991c" containerName="registry-server" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.903601 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.922833 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t78sc"] Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.946430 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jtkr\" (UniqueName: \"kubernetes.io/projected/40daff17-4ce3-4cda-844e-8c2690d94d31-kube-api-access-5jtkr\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.946603 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-catalog-content\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:54 crc kubenswrapper[4885]: I0308 20:45:54.946701 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-utilities\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.047987 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-utilities\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.048131 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5jtkr\" (UniqueName: \"kubernetes.io/projected/40daff17-4ce3-4cda-844e-8c2690d94d31-kube-api-access-5jtkr\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.048288 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-catalog-content\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.049160 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-catalog-content\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.049357 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-utilities\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.077037 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jtkr\" (UniqueName: \"kubernetes.io/projected/40daff17-4ce3-4cda-844e-8c2690d94d31-kube-api-access-5jtkr\") pod \"certified-operators-t78sc\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.223138 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.387889 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8135c179-1825-4687-93d5-8498573a991c" path="/var/lib/kubelet/pods/8135c179-1825-4687-93d5-8498573a991c/volumes" Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.603350 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t78sc"] Mar 08 20:45:55 crc kubenswrapper[4885]: W0308 20:45:55.616246 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40daff17_4ce3_4cda_844e_8c2690d94d31.slice/crio-07923089e1a75f94847a0d361ce5c3079d7eb16c5943f0f5cb6462a2153ce03d WatchSource:0}: Error finding container 07923089e1a75f94847a0d361ce5c3079d7eb16c5943f0f5cb6462a2153ce03d: Status 404 returned error can't find the container with id 07923089e1a75f94847a0d361ce5c3079d7eb16c5943f0f5cb6462a2153ce03d Mar 08 20:45:55 crc kubenswrapper[4885]: I0308 20:45:55.724054 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t78sc" event={"ID":"40daff17-4ce3-4cda-844e-8c2690d94d31","Type":"ContainerStarted","Data":"07923089e1a75f94847a0d361ce5c3079d7eb16c5943f0f5cb6462a2153ce03d"} Mar 08 20:45:56 crc kubenswrapper[4885]: I0308 20:45:56.738841 4885 generic.go:334] "Generic (PLEG): container finished" podID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerID="2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e" exitCode=0 Mar 08 20:45:56 crc kubenswrapper[4885]: I0308 20:45:56.738915 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t78sc" event={"ID":"40daff17-4ce3-4cda-844e-8c2690d94d31","Type":"ContainerDied","Data":"2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e"} Mar 08 20:45:57 crc kubenswrapper[4885]: I0308 20:45:57.750999 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t78sc" event={"ID":"40daff17-4ce3-4cda-844e-8c2690d94d31","Type":"ContainerStarted","Data":"bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813"} Mar 08 20:45:58 crc kubenswrapper[4885]: I0308 20:45:58.762887 4885 generic.go:334] "Generic (PLEG): container finished" podID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerID="bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813" exitCode=0 Mar 08 20:45:58 crc kubenswrapper[4885]: I0308 20:45:58.762966 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t78sc" event={"ID":"40daff17-4ce3-4cda-844e-8c2690d94d31","Type":"ContainerDied","Data":"bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813"} Mar 08 20:45:59 crc kubenswrapper[4885]: I0308 20:45:59.773706 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t78sc" event={"ID":"40daff17-4ce3-4cda-844e-8c2690d94d31","Type":"ContainerStarted","Data":"923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd"} Mar 08 20:45:59 crc kubenswrapper[4885]: I0308 20:45:59.806682 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t78sc" podStartSLOduration=3.403241506 podStartE2EDuration="5.80665173s" podCreationTimestamp="2026-03-08 20:45:54 +0000 UTC" firstStartedPulling="2026-03-08 20:45:56.741476089 +0000 UTC 
m=+4458.137530142" lastFinishedPulling="2026-03-08 20:45:59.144886303 +0000 UTC m=+4460.540940366" observedRunningTime="2026-03-08 20:45:59.802472437 +0000 UTC m=+4461.198526530" watchObservedRunningTime="2026-03-08 20:45:59.80665173 +0000 UTC m=+4461.202705793" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.155386 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550046-f4q77"] Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.156834 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550046-f4q77" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.159900 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.160014 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.161035 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.167448 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550046-f4q77"] Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.231717 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nh96\" (UniqueName: \"kubernetes.io/projected/b82d1463-69f5-455a-b2bf-493366c067f7-kube-api-access-6nh96\") pod \"auto-csr-approver-29550046-f4q77\" (UID: \"b82d1463-69f5-455a-b2bf-493366c067f7\") " pod="openshift-infra/auto-csr-approver-29550046-f4q77" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.333577 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nh96\" (UniqueName: \"kubernetes.io/projected/b82d1463-69f5-455a-b2bf-493366c067f7-kube-api-access-6nh96\") pod \"auto-csr-approver-29550046-f4q77\" (UID: \"b82d1463-69f5-455a-b2bf-493366c067f7\") " pod="openshift-infra/auto-csr-approver-29550046-f4q77" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.366323 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nh96\" (UniqueName: \"kubernetes.io/projected/b82d1463-69f5-455a-b2bf-493366c067f7-kube-api-access-6nh96\") pod \"auto-csr-approver-29550046-f4q77\" (UID: \"b82d1463-69f5-455a-b2bf-493366c067f7\") " pod="openshift-infra/auto-csr-approver-29550046-f4q77" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.486327 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550046-f4q77" Mar 08 20:46:00 crc kubenswrapper[4885]: I0308 20:46:00.994416 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550046-f4q77"] Mar 08 20:46:01 crc kubenswrapper[4885]: W0308 20:46:01.067277 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb82d1463_69f5_455a_b2bf_493366c067f7.slice/crio-33cf436d7906db02d52aa806f4d4df05c0de922a193579b3d72b4138774e5ebb WatchSource:0}: Error finding container 33cf436d7906db02d52aa806f4d4df05c0de922a193579b3d72b4138774e5ebb: Status 404 returned error can't find the container with id 33cf436d7906db02d52aa806f4d4df05c0de922a193579b3d72b4138774e5ebb Mar 08 20:46:01 crc kubenswrapper[4885]: I0308 20:46:01.793857 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550046-f4q77" event={"ID":"b82d1463-69f5-455a-b2bf-493366c067f7","Type":"ContainerStarted","Data":"33cf436d7906db02d52aa806f4d4df05c0de922a193579b3d72b4138774e5ebb"} Mar 08 20:46:02 crc kubenswrapper[4885]: I0308 20:46:02.806333 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550046-f4q77" event={"ID":"b82d1463-69f5-455a-b2bf-493366c067f7","Type":"ContainerStarted","Data":"28fd3b4e0daaabbfc53b90770763cfd958469e45abae3ae9fd53f9c9e8ab327b"} Mar 08 20:46:03 crc kubenswrapper[4885]: I0308 20:46:03.101693 4885 scope.go:117] "RemoveContainer" containerID="04ce20a75f575125843cdf885d5d1cfa9b696f27d4253665b2071884d88ab3e4" Mar 08 20:46:03 crc kubenswrapper[4885]: I0308 20:46:03.818712 4885 generic.go:334] "Generic (PLEG): container finished" podID="b82d1463-69f5-455a-b2bf-493366c067f7" containerID="28fd3b4e0daaabbfc53b90770763cfd958469e45abae3ae9fd53f9c9e8ab327b" exitCode=0 Mar 08 20:46:03 crc kubenswrapper[4885]: I0308 20:46:03.818779 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550046-f4q77" event={"ID":"b82d1463-69f5-455a-b2bf-493366c067f7","Type":"ContainerDied","Data":"28fd3b4e0daaabbfc53b90770763cfd958469e45abae3ae9fd53f9c9e8ab327b"} Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.210896 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550046-f4q77" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.225467 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.225533 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.309348 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nh96\" (UniqueName: \"kubernetes.io/projected/b82d1463-69f5-455a-b2bf-493366c067f7-kube-api-access-6nh96\") pod \"b82d1463-69f5-455a-b2bf-493366c067f7\" (UID: \"b82d1463-69f5-455a-b2bf-493366c067f7\") " Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.318217 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b82d1463-69f5-455a-b2bf-493366c067f7-kube-api-access-6nh96" (OuterVolumeSpecName: "kube-api-access-6nh96") pod "b82d1463-69f5-455a-b2bf-493366c067f7" (UID: "b82d1463-69f5-455a-b2bf-493366c067f7"). 
InnerVolumeSpecName "kube-api-access-6nh96". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.327738 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.410782 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nh96\" (UniqueName: \"kubernetes.io/projected/b82d1463-69f5-455a-b2bf-493366c067f7-kube-api-access-6nh96\") on node \"crc\" DevicePath \"\"" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.837697 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550046-f4q77" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.838134 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550046-f4q77" event={"ID":"b82d1463-69f5-455a-b2bf-493366c067f7","Type":"ContainerDied","Data":"33cf436d7906db02d52aa806f4d4df05c0de922a193579b3d72b4138774e5ebb"} Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.838204 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33cf436d7906db02d52aa806f4d4df05c0de922a193579b3d72b4138774e5ebb" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.887372 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.909482 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550040-n7d55"] Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.921057 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550040-n7d55"] Mar 08 20:46:05 crc kubenswrapper[4885]: I0308 20:46:05.952469 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t78sc"] Mar 08 20:46:07 crc kubenswrapper[4885]: I0308 20:46:07.381530 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f06a86c2-3f71-42f0-8f33-558ffba8e527" path="/var/lib/kubelet/pods/f06a86c2-3f71-42f0-8f33-558ffba8e527/volumes" Mar 08 20:46:07 crc kubenswrapper[4885]: I0308 20:46:07.856253 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t78sc" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerName="registry-server" containerID="cri-o://923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd" gracePeriod=2 Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.311062 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.452002 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-catalog-content\") pod \"40daff17-4ce3-4cda-844e-8c2690d94d31\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.452155 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-utilities\") pod \"40daff17-4ce3-4cda-844e-8c2690d94d31\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.452556 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jtkr\" (UniqueName: \"kubernetes.io/projected/40daff17-4ce3-4cda-844e-8c2690d94d31-kube-api-access-5jtkr\") pod \"40daff17-4ce3-4cda-844e-8c2690d94d31\" (UID: \"40daff17-4ce3-4cda-844e-8c2690d94d31\") " Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.453900 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-utilities" (OuterVolumeSpecName: "utilities") pod "40daff17-4ce3-4cda-844e-8c2690d94d31" (UID: "40daff17-4ce3-4cda-844e-8c2690d94d31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.459464 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40daff17-4ce3-4cda-844e-8c2690d94d31-kube-api-access-5jtkr" (OuterVolumeSpecName: "kube-api-access-5jtkr") pod "40daff17-4ce3-4cda-844e-8c2690d94d31" (UID: "40daff17-4ce3-4cda-844e-8c2690d94d31"). InnerVolumeSpecName "kube-api-access-5jtkr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.555418 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.555483 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jtkr\" (UniqueName: \"kubernetes.io/projected/40daff17-4ce3-4cda-844e-8c2690d94d31-kube-api-access-5jtkr\") on node \"crc\" DevicePath \"\"" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.803693 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40daff17-4ce3-4cda-844e-8c2690d94d31" (UID: "40daff17-4ce3-4cda-844e-8c2690d94d31"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.859595 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40daff17-4ce3-4cda-844e-8c2690d94d31-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.868598 4885 generic.go:334] "Generic (PLEG): container finished" podID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerID="923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd" exitCode=0 Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.868665 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t78sc" event={"ID":"40daff17-4ce3-4cda-844e-8c2690d94d31","Type":"ContainerDied","Data":"923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd"} Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.868706 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t78sc" event={"ID":"40daff17-4ce3-4cda-844e-8c2690d94d31","Type":"ContainerDied","Data":"07923089e1a75f94847a0d361ce5c3079d7eb16c5943f0f5cb6462a2153ce03d"} Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.868735 4885 scope.go:117] "RemoveContainer" containerID="923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.868960 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t78sc" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.899904 4885 scope.go:117] "RemoveContainer" containerID="bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.920279 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t78sc"] Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.927238 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t78sc"] Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.935254 4885 scope.go:117] "RemoveContainer" containerID="2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.960522 4885 scope.go:117] "RemoveContainer" containerID="923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd" Mar 08 20:46:08 crc kubenswrapper[4885]: E0308 20:46:08.961186 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd\": container with ID starting with 923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd not found: ID does not exist" containerID="923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.961228 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd"} err="failed to get container status \"923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd\": rpc error: code = NotFound desc = could not find container \"923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd\": container with ID starting with 923912f42448f18e16b799c02303c73a5d97f64988b04121ff1f2714792945fd not found: ID does not exist" Mar 08 
20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.961285 4885 scope.go:117] "RemoveContainer" containerID="bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813" Mar 08 20:46:08 crc kubenswrapper[4885]: E0308 20:46:08.961727 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813\": container with ID starting with bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813 not found: ID does not exist" containerID="bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.961766 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813"} err="failed to get container status \"bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813\": rpc error: code = NotFound desc = could not find container \"bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813\": container with ID starting with bb65dbc4ee2a30613e05d72cae96ccb264b4442bfabaab19c2de1b0ed2347813 not found: ID does not exist" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.961793 4885 scope.go:117] "RemoveContainer" containerID="2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e" Mar 08 20:46:08 crc kubenswrapper[4885]: E0308 20:46:08.962168 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e\": container with ID starting with 2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e not found: ID does not exist" containerID="2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e" Mar 08 20:46:08 crc kubenswrapper[4885]: I0308 20:46:08.962193 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e"} err="failed to get container status \"2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e\": rpc error: code = NotFound desc = could not find container \"2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e\": container with ID starting with 2bd5034f7ddafe2c91905bfc36f2be9fec75839c699efbbf556c484bd1726b3e not found: ID does not exist" Mar 08 20:46:09 crc kubenswrapper[4885]: I0308 20:46:09.383499 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" path="/var/lib/kubelet/pods/40daff17-4ce3-4cda-844e-8c2690d94d31/volumes" Mar 08 20:47:03 crc kubenswrapper[4885]: I0308 20:47:03.238834 4885 scope.go:117] "RemoveContainer" containerID="cd2399263f696f150807aac189b8392d7375833b5910349cae536de7fbfd333a" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.448528 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-psfrk"] Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.456980 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-psfrk"] Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.543634 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-lnm22"] Mar 08 20:47:27 crc kubenswrapper[4885]: E0308 20:47:27.543948 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" 
containerName="extract-utilities" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.543962 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerName="extract-utilities" Mar 08 20:47:27 crc kubenswrapper[4885]: E0308 20:47:27.543999 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerName="registry-server" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.544009 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerName="registry-server" Mar 08 20:47:27 crc kubenswrapper[4885]: E0308 20:47:27.544022 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerName="extract-content" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.544030 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerName="extract-content" Mar 08 20:47:27 crc kubenswrapper[4885]: E0308 20:47:27.544050 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b82d1463-69f5-455a-b2bf-493366c067f7" containerName="oc" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.544058 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82d1463-69f5-455a-b2bf-493366c067f7" containerName="oc" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.544226 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b82d1463-69f5-455a-b2bf-493366c067f7" containerName="oc" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.544257 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="40daff17-4ce3-4cda-844e-8c2690d94d31" containerName="registry-server" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.544743 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.547838 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.548342 4885 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-lm6tv" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.548416 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.548475 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.558652 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-lnm22"] Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.643261 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/92e8ccce-5f4b-4070-a497-9967a01c5897-crc-storage\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.643314 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5l82\" (UniqueName: \"kubernetes.io/projected/92e8ccce-5f4b-4070-a497-9967a01c5897-kube-api-access-b5l82\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.643466 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/92e8ccce-5f4b-4070-a497-9967a01c5897-node-mnt\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.745378 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/92e8ccce-5f4b-4070-a497-9967a01c5897-crc-storage\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.745428 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5l82\" (UniqueName: \"kubernetes.io/projected/92e8ccce-5f4b-4070-a497-9967a01c5897-kube-api-access-b5l82\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.745482 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/92e8ccce-5f4b-4070-a497-9967a01c5897-node-mnt\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.745785 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/92e8ccce-5f4b-4070-a497-9967a01c5897-node-mnt\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " 
pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.746189 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/92e8ccce-5f4b-4070-a497-9967a01c5897-crc-storage\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.764374 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5l82\" (UniqueName: \"kubernetes.io/projected/92e8ccce-5f4b-4070-a497-9967a01c5897-kube-api-access-b5l82\") pod \"crc-storage-crc-lnm22\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:27 crc kubenswrapper[4885]: I0308 20:47:27.865612 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:28 crc kubenswrapper[4885]: I0308 20:47:28.435811 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-lnm22"] Mar 08 20:47:28 crc kubenswrapper[4885]: I0308 20:47:28.713356 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lnm22" event={"ID":"92e8ccce-5f4b-4070-a497-9967a01c5897","Type":"ContainerStarted","Data":"017bda305aadc3ff871118200374074c94a2db8f972d0461cf0b20699579b547"} Mar 08 20:47:29 crc kubenswrapper[4885]: I0308 20:47:29.377542 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a3c0554-f8ec-4a68-a332-1eba738b28c6" path="/var/lib/kubelet/pods/1a3c0554-f8ec-4a68-a332-1eba738b28c6/volumes" Mar 08 20:47:29 crc kubenswrapper[4885]: I0308 20:47:29.724061 4885 generic.go:334] "Generic (PLEG): container finished" podID="92e8ccce-5f4b-4070-a497-9967a01c5897" containerID="71660241cb857dd4a39450381f9b1b87218e1ce546c14f61658581a4f1a6ae9d" exitCode=0 Mar 08 20:47:29 crc kubenswrapper[4885]: I0308 20:47:29.724098 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lnm22" event={"ID":"92e8ccce-5f4b-4070-a497-9967a01c5897","Type":"ContainerDied","Data":"71660241cb857dd4a39450381f9b1b87218e1ce546c14f61658581a4f1a6ae9d"} Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.063354 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.101688 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5l82\" (UniqueName: \"kubernetes.io/projected/92e8ccce-5f4b-4070-a497-9967a01c5897-kube-api-access-b5l82\") pod \"92e8ccce-5f4b-4070-a497-9967a01c5897\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.101769 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/92e8ccce-5f4b-4070-a497-9967a01c5897-crc-storage\") pod \"92e8ccce-5f4b-4070-a497-9967a01c5897\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.101881 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/92e8ccce-5f4b-4070-a497-9967a01c5897-node-mnt\") pod \"92e8ccce-5f4b-4070-a497-9967a01c5897\" (UID: \"92e8ccce-5f4b-4070-a497-9967a01c5897\") " Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.102251 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/92e8ccce-5f4b-4070-a497-9967a01c5897-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "92e8ccce-5f4b-4070-a497-9967a01c5897" (UID: "92e8ccce-5f4b-4070-a497-9967a01c5897"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.107310 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92e8ccce-5f4b-4070-a497-9967a01c5897-kube-api-access-b5l82" (OuterVolumeSpecName: "kube-api-access-b5l82") pod "92e8ccce-5f4b-4070-a497-9967a01c5897" (UID: "92e8ccce-5f4b-4070-a497-9967a01c5897"). InnerVolumeSpecName "kube-api-access-b5l82". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.126681 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92e8ccce-5f4b-4070-a497-9967a01c5897-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "92e8ccce-5f4b-4070-a497-9967a01c5897" (UID: "92e8ccce-5f4b-4070-a497-9967a01c5897"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.203384 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5l82\" (UniqueName: \"kubernetes.io/projected/92e8ccce-5f4b-4070-a497-9967a01c5897-kube-api-access-b5l82\") on node \"crc\" DevicePath \"\"" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.203439 4885 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/92e8ccce-5f4b-4070-a497-9967a01c5897-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.203461 4885 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/92e8ccce-5f4b-4070-a497-9967a01c5897-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.744383 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-lnm22" event={"ID":"92e8ccce-5f4b-4070-a497-9967a01c5897","Type":"ContainerDied","Data":"017bda305aadc3ff871118200374074c94a2db8f972d0461cf0b20699579b547"} Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.744443 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="017bda305aadc3ff871118200374074c94a2db8f972d0461cf0b20699579b547" Mar 08 20:47:31 crc kubenswrapper[4885]: I0308 20:47:31.744511 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-lnm22" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.634811 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-lnm22"] Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.646012 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-lnm22"] Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.742657 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-9ps2g"] Mar 08 20:47:33 crc kubenswrapper[4885]: E0308 20:47:33.743487 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92e8ccce-5f4b-4070-a497-9967a01c5897" containerName="storage" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.743675 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="92e8ccce-5f4b-4070-a497-9967a01c5897" containerName="storage" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.744243 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="92e8ccce-5f4b-4070-a497-9967a01c5897" containerName="storage" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.747318 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9ps2g" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.754557 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.754756 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.755301 4885 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-lm6tv" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.755436 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.759890 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9ps2g"] Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.843038 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f8dcf555-f050-40ed-a174-5a5160c3124d-crc-storage\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.843469 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2kmh\" (UniqueName: \"kubernetes.io/projected/f8dcf555-f050-40ed-a174-5a5160c3124d-kube-api-access-l2kmh\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.843663 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f8dcf555-f050-40ed-a174-5a5160c3124d-node-mnt\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.946028 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f8dcf555-f050-40ed-a174-5a5160c3124d-crc-storage\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.946398 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2kmh\" (UniqueName: \"kubernetes.io/projected/f8dcf555-f050-40ed-a174-5a5160c3124d-kube-api-access-l2kmh\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.946851 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f8dcf555-f050-40ed-a174-5a5160c3124d-crc-storage\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.947176 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f8dcf555-f050-40ed-a174-5a5160c3124d-node-mnt\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " 
pod="crc-storage/crc-storage-crc-9ps2g" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.947460 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f8dcf555-f050-40ed-a174-5a5160c3124d-node-mnt\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g" Mar 08 20:47:33 crc kubenswrapper[4885]: I0308 20:47:33.981333 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2kmh\" (UniqueName: \"kubernetes.io/projected/f8dcf555-f050-40ed-a174-5a5160c3124d-kube-api-access-l2kmh\") pod \"crc-storage-crc-9ps2g\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " pod="crc-storage/crc-storage-crc-9ps2g" Mar 08 20:47:34 crc kubenswrapper[4885]: I0308 20:47:34.081684 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9ps2g" Mar 08 20:47:34 crc kubenswrapper[4885]: I0308 20:47:34.575435 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-9ps2g"] Mar 08 20:47:34 crc kubenswrapper[4885]: I0308 20:47:34.769740 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9ps2g" event={"ID":"f8dcf555-f050-40ed-a174-5a5160c3124d","Type":"ContainerStarted","Data":"2b195445549dba750c914ded4bfa8f8063806ddc02e4016755dc26a49f8b949a"} Mar 08 20:47:35 crc kubenswrapper[4885]: I0308 20:47:35.379955 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92e8ccce-5f4b-4070-a497-9967a01c5897" path="/var/lib/kubelet/pods/92e8ccce-5f4b-4070-a497-9967a01c5897/volumes" Mar 08 20:47:35 crc kubenswrapper[4885]: I0308 20:47:35.778119 4885 generic.go:334] "Generic (PLEG): container finished" podID="f8dcf555-f050-40ed-a174-5a5160c3124d" containerID="c4be26f875d4819cf8da13b6c1d96b277d86ed1ad18bb162bc1f8d6584a5fe61" exitCode=0 Mar 08 20:47:35 crc kubenswrapper[4885]: I0308 20:47:35.778161 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9ps2g" event={"ID":"f8dcf555-f050-40ed-a174-5a5160c3124d","Type":"ContainerDied","Data":"c4be26f875d4819cf8da13b6c1d96b277d86ed1ad18bb162bc1f8d6584a5fe61"} Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.096900 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-9ps2g" Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.191634 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f8dcf555-f050-40ed-a174-5a5160c3124d-node-mnt\") pod \"f8dcf555-f050-40ed-a174-5a5160c3124d\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.191739 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2kmh\" (UniqueName: \"kubernetes.io/projected/f8dcf555-f050-40ed-a174-5a5160c3124d-kube-api-access-l2kmh\") pod \"f8dcf555-f050-40ed-a174-5a5160c3124d\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.191766 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f8dcf555-f050-40ed-a174-5a5160c3124d-crc-storage\") pod \"f8dcf555-f050-40ed-a174-5a5160c3124d\" (UID: \"f8dcf555-f050-40ed-a174-5a5160c3124d\") " Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.191900 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8dcf555-f050-40ed-a174-5a5160c3124d-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "f8dcf555-f050-40ed-a174-5a5160c3124d" (UID: "f8dcf555-f050-40ed-a174-5a5160c3124d"). InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.192179 4885 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/f8dcf555-f050-40ed-a174-5a5160c3124d-node-mnt\") on node \"crc\" DevicePath \"\"" Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.197529 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8dcf555-f050-40ed-a174-5a5160c3124d-kube-api-access-l2kmh" (OuterVolumeSpecName: "kube-api-access-l2kmh") pod "f8dcf555-f050-40ed-a174-5a5160c3124d" (UID: "f8dcf555-f050-40ed-a174-5a5160c3124d"). InnerVolumeSpecName "kube-api-access-l2kmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.219846 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8dcf555-f050-40ed-a174-5a5160c3124d-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "f8dcf555-f050-40ed-a174-5a5160c3124d" (UID: "f8dcf555-f050-40ed-a174-5a5160c3124d"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.293194 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2kmh\" (UniqueName: \"kubernetes.io/projected/f8dcf555-f050-40ed-a174-5a5160c3124d-kube-api-access-l2kmh\") on node \"crc\" DevicePath \"\"" Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.293244 4885 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/f8dcf555-f050-40ed-a174-5a5160c3124d-crc-storage\") on node \"crc\" DevicePath \"\"" Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.807732 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-9ps2g" event={"ID":"f8dcf555-f050-40ed-a174-5a5160c3124d","Type":"ContainerDied","Data":"2b195445549dba750c914ded4bfa8f8063806ddc02e4016755dc26a49f8b949a"} Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.808078 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b195445549dba750c914ded4bfa8f8063806ddc02e4016755dc26a49f8b949a" Mar 08 20:47:37 crc kubenswrapper[4885]: I0308 20:47:37.808148 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-9ps2g" Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.175084 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550048-bs8p7"] Mar 08 20:48:00 crc kubenswrapper[4885]: E0308 20:48:00.175899 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8dcf555-f050-40ed-a174-5a5160c3124d" containerName="storage" Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.175917 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8dcf555-f050-40ed-a174-5a5160c3124d" containerName="storage" Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.176142 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8dcf555-f050-40ed-a174-5a5160c3124d" containerName="storage" Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.176631 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550048-bs8p7" Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.179431 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.179604 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.184097 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.202202 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550048-bs8p7"] Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.256151 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmt97\" (UniqueName: \"kubernetes.io/projected/8cc070c5-7e69-4caa-82a5-b21b8fa66256-kube-api-access-mmt97\") pod \"auto-csr-approver-29550048-bs8p7\" (UID: \"8cc070c5-7e69-4caa-82a5-b21b8fa66256\") " pod="openshift-infra/auto-csr-approver-29550048-bs8p7" Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.356900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmt97\" (UniqueName: \"kubernetes.io/projected/8cc070c5-7e69-4caa-82a5-b21b8fa66256-kube-api-access-mmt97\") pod \"auto-csr-approver-29550048-bs8p7\" (UID: \"8cc070c5-7e69-4caa-82a5-b21b8fa66256\") " pod="openshift-infra/auto-csr-approver-29550048-bs8p7" Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.379422 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmt97\" (UniqueName: \"kubernetes.io/projected/8cc070c5-7e69-4caa-82a5-b21b8fa66256-kube-api-access-mmt97\") pod \"auto-csr-approver-29550048-bs8p7\" (UID: \"8cc070c5-7e69-4caa-82a5-b21b8fa66256\") " pod="openshift-infra/auto-csr-approver-29550048-bs8p7" Mar 08 20:48:00 crc kubenswrapper[4885]: I0308 20:48:00.509775 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550048-bs8p7" Mar 08 20:48:01 crc kubenswrapper[4885]: I0308 20:48:01.041150 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550048-bs8p7"] Mar 08 20:48:02 crc kubenswrapper[4885]: I0308 20:48:02.030187 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550048-bs8p7" event={"ID":"8cc070c5-7e69-4caa-82a5-b21b8fa66256","Type":"ContainerStarted","Data":"d559168086f0f7fae799a2dd202de00b0967f6d50607ac476a3dd2bdaabb72ee"} Mar 08 20:48:02 crc kubenswrapper[4885]: I0308 20:48:02.818562 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:48:02 crc kubenswrapper[4885]: I0308 20:48:02.818891 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:48:03 crc kubenswrapper[4885]: I0308 20:48:03.041592 4885 generic.go:334] "Generic (PLEG): container finished" podID="8cc070c5-7e69-4caa-82a5-b21b8fa66256" containerID="4768620fbc443f87f40deff915eceaf069ee28b8e25c0efabb4228990e81cee6" exitCode=0 Mar 08 20:48:03 crc kubenswrapper[4885]: I0308 20:48:03.041680 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550048-bs8p7" event={"ID":"8cc070c5-7e69-4caa-82a5-b21b8fa66256","Type":"ContainerDied","Data":"4768620fbc443f87f40deff915eceaf069ee28b8e25c0efabb4228990e81cee6"} Mar 08 20:48:03 crc kubenswrapper[4885]: I0308 20:48:03.369058 4885 scope.go:117] "RemoveContainer" containerID="b57ab7fa02bc0b3cc8cdcea97c2fd6abc762d9cf4cd62e7caa0369ec5c53eef8" Mar 08 20:48:04 crc kubenswrapper[4885]: I0308 20:48:04.444745 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550048-bs8p7" Mar 08 20:48:04 crc kubenswrapper[4885]: I0308 20:48:04.529471 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmt97\" (UniqueName: \"kubernetes.io/projected/8cc070c5-7e69-4caa-82a5-b21b8fa66256-kube-api-access-mmt97\") pod \"8cc070c5-7e69-4caa-82a5-b21b8fa66256\" (UID: \"8cc070c5-7e69-4caa-82a5-b21b8fa66256\") " Mar 08 20:48:04 crc kubenswrapper[4885]: I0308 20:48:04.549255 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cc070c5-7e69-4caa-82a5-b21b8fa66256-kube-api-access-mmt97" (OuterVolumeSpecName: "kube-api-access-mmt97") pod "8cc070c5-7e69-4caa-82a5-b21b8fa66256" (UID: "8cc070c5-7e69-4caa-82a5-b21b8fa66256"). InnerVolumeSpecName "kube-api-access-mmt97". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:48:04 crc kubenswrapper[4885]: I0308 20:48:04.631465 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmt97\" (UniqueName: \"kubernetes.io/projected/8cc070c5-7e69-4caa-82a5-b21b8fa66256-kube-api-access-mmt97\") on node \"crc\" DevicePath \"\"" Mar 08 20:48:05 crc kubenswrapper[4885]: I0308 20:48:05.067085 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550048-bs8p7" event={"ID":"8cc070c5-7e69-4caa-82a5-b21b8fa66256","Type":"ContainerDied","Data":"d559168086f0f7fae799a2dd202de00b0967f6d50607ac476a3dd2bdaabb72ee"} Mar 08 20:48:05 crc kubenswrapper[4885]: I0308 20:48:05.067149 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d559168086f0f7fae799a2dd202de00b0967f6d50607ac476a3dd2bdaabb72ee" Mar 08 20:48:05 crc kubenswrapper[4885]: I0308 20:48:05.067426 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550048-bs8p7" Mar 08 20:48:05 crc kubenswrapper[4885]: I0308 20:48:05.560682 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550042-d4xpj"] Mar 08 20:48:05 crc kubenswrapper[4885]: I0308 20:48:05.570496 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550042-d4xpj"] Mar 08 20:48:07 crc kubenswrapper[4885]: I0308 20:48:07.385009 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34acfccb-db62-40e1-b46c-3227ce6e32ab" path="/var/lib/kubelet/pods/34acfccb-db62-40e1-b46c-3227ce6e32ab/volumes" Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.570189 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pqdpm"] Mar 08 20:48:09 crc kubenswrapper[4885]: E0308 20:48:09.571714 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cc070c5-7e69-4caa-82a5-b21b8fa66256" containerName="oc" Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.571856 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cc070c5-7e69-4caa-82a5-b21b8fa66256" containerName="oc" Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.572263 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cc070c5-7e69-4caa-82a5-b21b8fa66256" containerName="oc" Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.574129 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.592449 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqdpm"] Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.608131 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-catalog-content\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.608453 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-utilities\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.608599 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtnnj\" (UniqueName: \"kubernetes.io/projected/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-kube-api-access-wtnnj\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.710378 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-catalog-content\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.710640 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-utilities\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.711057 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-catalog-content\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.711438 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-utilities\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.711624 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtnnj\" (UniqueName: \"kubernetes.io/projected/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-kube-api-access-wtnnj\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.751208 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wtnnj\" (UniqueName: \"kubernetes.io/projected/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-kube-api-access-wtnnj\") pod \"redhat-marketplace-pqdpm\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:09 crc kubenswrapper[4885]: I0308 20:48:09.918529 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:10 crc kubenswrapper[4885]: I0308 20:48:10.422157 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqdpm"] Mar 08 20:48:11 crc kubenswrapper[4885]: I0308 20:48:11.117737 4885 generic.go:334] "Generic (PLEG): container finished" podID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerID="d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf" exitCode=0 Mar 08 20:48:11 crc kubenswrapper[4885]: I0308 20:48:11.117807 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqdpm" event={"ID":"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2","Type":"ContainerDied","Data":"d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf"} Mar 08 20:48:11 crc kubenswrapper[4885]: I0308 20:48:11.117839 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqdpm" event={"ID":"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2","Type":"ContainerStarted","Data":"4a4290465a756fe6805abd596855031f80b77d73d18b519f278c9aa3f2cd0de5"} Mar 08 20:48:12 crc kubenswrapper[4885]: I0308 20:48:12.127483 4885 generic.go:334] "Generic (PLEG): container finished" podID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerID="812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda" exitCode=0 Mar 08 20:48:12 crc kubenswrapper[4885]: I0308 20:48:12.127589 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqdpm" event={"ID":"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2","Type":"ContainerDied","Data":"812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda"} Mar 08 20:48:13 crc kubenswrapper[4885]: I0308 20:48:13.138178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqdpm" event={"ID":"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2","Type":"ContainerStarted","Data":"baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a"} Mar 08 20:48:13 crc kubenswrapper[4885]: I0308 20:48:13.177485 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pqdpm" podStartSLOduration=2.74004976 podStartE2EDuration="4.177465642s" podCreationTimestamp="2026-03-08 20:48:09 +0000 UTC" firstStartedPulling="2026-03-08 20:48:11.120070724 +0000 UTC m=+4592.516124787" lastFinishedPulling="2026-03-08 20:48:12.557486606 +0000 UTC m=+4593.953540669" observedRunningTime="2026-03-08 20:48:13.172649452 +0000 UTC m=+4594.568703505" watchObservedRunningTime="2026-03-08 20:48:13.177465642 +0000 UTC m=+4594.573519675" Mar 08 20:48:19 crc kubenswrapper[4885]: I0308 20:48:19.919539 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:19 crc kubenswrapper[4885]: I0308 20:48:19.920295 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:20 crc kubenswrapper[4885]: I0308 20:48:20.006393 4885 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:20 crc kubenswrapper[4885]: I0308 20:48:20.274085 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:20 crc kubenswrapper[4885]: I0308 20:48:20.340475 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqdpm"] Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.218962 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pqdpm" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="registry-server" containerID="cri-o://baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a" gracePeriod=2 Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.646423 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.817427 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-utilities\") pod \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.817664 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-catalog-content\") pod \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.817696 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtnnj\" (UniqueName: \"kubernetes.io/projected/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-kube-api-access-wtnnj\") pod \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\" (UID: \"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2\") " Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.818422 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-utilities" (OuterVolumeSpecName: "utilities") pod "31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" (UID: "31054b94-2c60-4e3d-a5e7-3a7b9789a2c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.822533 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-kube-api-access-wtnnj" (OuterVolumeSpecName: "kube-api-access-wtnnj") pod "31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" (UID: "31054b94-2c60-4e3d-a5e7-3a7b9789a2c2"). InnerVolumeSpecName "kube-api-access-wtnnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.852053 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" (UID: "31054b94-2c60-4e3d-a5e7-3a7b9789a2c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.919762 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.919815 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:48:22 crc kubenswrapper[4885]: I0308 20:48:22.919841 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtnnj\" (UniqueName: \"kubernetes.io/projected/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2-kube-api-access-wtnnj\") on node \"crc\" DevicePath \"\"" Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.231429 4885 generic.go:334] "Generic (PLEG): container finished" podID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerID="baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a" exitCode=0 Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.231520 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pqdpm" Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.231503 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqdpm" event={"ID":"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2","Type":"ContainerDied","Data":"baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a"} Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.231758 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pqdpm" event={"ID":"31054b94-2c60-4e3d-a5e7-3a7b9789a2c2","Type":"ContainerDied","Data":"4a4290465a756fe6805abd596855031f80b77d73d18b519f278c9aa3f2cd0de5"} Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.231805 4885 scope.go:117] "RemoveContainer" containerID="baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a" Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.259912 4885 scope.go:117] "RemoveContainer" containerID="812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda" Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.291333 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqdpm"] Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.297233 4885 scope.go:117] "RemoveContainer" containerID="d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf" Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.301494 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pqdpm"] Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.328537 4885 scope.go:117] "RemoveContainer" containerID="baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a" Mar 08 20:48:23 crc kubenswrapper[4885]: E0308 20:48:23.329124 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a\": container with ID starting with baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a not found: ID does not exist" containerID="baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a" Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.329178 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a"} err="failed to get container status \"baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a\": rpc error: code = NotFound desc = could not find container \"baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a\": container with ID starting with baf2b980182bc00f0a46753a50abb886b9fba4c3ac7abf8af1b8f6c3267c155a not found: ID does not exist" Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.329214 4885 scope.go:117] "RemoveContainer" containerID="812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda" Mar 08 20:48:23 crc kubenswrapper[4885]: E0308 20:48:23.329558 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda\": container with ID starting with 812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda not found: ID does not exist" containerID="812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda" Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.329595 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda"} err="failed to get container status \"812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda\": rpc error: code = NotFound desc = could not find container \"812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda\": container with ID starting with 812f5ba5b8cf1c97d2f5fd57e4fb8cb3de2611852af9d4984cfdb75574a90fda not found: ID does not exist" Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.329619 4885 scope.go:117] "RemoveContainer" containerID="d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf" Mar 08 20:48:23 crc kubenswrapper[4885]: E0308 20:48:23.330033 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf\": container with ID starting with d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf not found: ID does not exist" containerID="d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf" Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.330097 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf"} err="failed to get container status \"d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf\": rpc error: code = NotFound desc = could not find container \"d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf\": container with ID starting with d6aba3f0a46495e36587d1330758d3cba6e78fb68ae1012969e26470f34f2dbf not found: ID does not exist" Mar 08 20:48:23 crc kubenswrapper[4885]: I0308 20:48:23.387048 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" path="/var/lib/kubelet/pods/31054b94-2c60-4e3d-a5e7-3a7b9789a2c2/volumes" Mar 08 20:48:32 crc kubenswrapper[4885]: I0308 20:48:32.819111 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:48:32 crc kubenswrapper[4885]: I0308 20:48:32.819816 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:49:02 crc kubenswrapper[4885]: I0308 20:49:02.818690 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:49:02 crc kubenswrapper[4885]: I0308 20:49:02.819587 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:49:02 crc kubenswrapper[4885]: I0308 20:49:02.819656 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:49:02 crc kubenswrapper[4885]: I0308 20:49:02.820600 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:49:02 crc kubenswrapper[4885]: I0308 20:49:02.820705 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" gracePeriod=600 Mar 08 20:49:02 crc kubenswrapper[4885]: E0308 20:49:02.955956 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:49:03 crc kubenswrapper[4885]: I0308 20:49:03.440785 4885 scope.go:117] "RemoveContainer" containerID="f37032ea52adfa6ddae3677872b806d5bfa71e165ea8806d1e82b81025d0feb8" Mar 08 20:49:03 crc kubenswrapper[4885]: I0308 20:49:03.638461 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" exitCode=0 Mar 08 20:49:03 crc kubenswrapper[4885]: I0308 20:49:03.638584 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6"} Mar 08 20:49:03 crc kubenswrapper[4885]: I0308 
20:49:03.638643 4885 scope.go:117] "RemoveContainer" containerID="e901ac3813cd54c99c26fb9995667f5096a1aafbc4c007ea9eda1c542b1947a8" Mar 08 20:49:03 crc kubenswrapper[4885]: I0308 20:49:03.639799 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:49:03 crc kubenswrapper[4885]: E0308 20:49:03.641862 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:49:17 crc kubenswrapper[4885]: I0308 20:49:17.368914 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:49:17 crc kubenswrapper[4885]: E0308 20:49:17.370084 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:49:29 crc kubenswrapper[4885]: I0308 20:49:29.375306 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:49:29 crc kubenswrapper[4885]: E0308 20:49:29.376426 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:49:44 crc kubenswrapper[4885]: I0308 20:49:44.367708 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:49:44 crc kubenswrapper[4885]: E0308 20:49:44.368402 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:49:59 crc kubenswrapper[4885]: I0308 20:49:59.376649 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:49:59 crc kubenswrapper[4885]: E0308 20:49:59.377634 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.156121 
4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550050-xrlpj"] Mar 08 20:50:00 crc kubenswrapper[4885]: E0308 20:50:00.157038 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="extract-utilities" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.157074 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="extract-utilities" Mar 08 20:50:00 crc kubenswrapper[4885]: E0308 20:50:00.157110 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="registry-server" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.157126 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="registry-server" Mar 08 20:50:00 crc kubenswrapper[4885]: E0308 20:50:00.157175 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="extract-content" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.157193 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="extract-content" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.157525 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="31054b94-2c60-4e3d-a5e7-3a7b9789a2c2" containerName="registry-server" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.158301 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550050-xrlpj" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.160722 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.163608 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwmsz\" (UniqueName: \"kubernetes.io/projected/c364e354-f542-45ec-9322-125db18eb928-kube-api-access-wwmsz\") pod \"auto-csr-approver-29550050-xrlpj\" (UID: \"c364e354-f542-45ec-9322-125db18eb928\") " pod="openshift-infra/auto-csr-approver-29550050-xrlpj" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.172654 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550050-xrlpj"] Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.191282 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.191651 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.265108 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwmsz\" (UniqueName: \"kubernetes.io/projected/c364e354-f542-45ec-9322-125db18eb928-kube-api-access-wwmsz\") pod \"auto-csr-approver-29550050-xrlpj\" (UID: \"c364e354-f542-45ec-9322-125db18eb928\") " pod="openshift-infra/auto-csr-approver-29550050-xrlpj" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.297360 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwmsz\" (UniqueName: \"kubernetes.io/projected/c364e354-f542-45ec-9322-125db18eb928-kube-api-access-wwmsz\") pod 
\"auto-csr-approver-29550050-xrlpj\" (UID: \"c364e354-f542-45ec-9322-125db18eb928\") " pod="openshift-infra/auto-csr-approver-29550050-xrlpj" Mar 08 20:50:00 crc kubenswrapper[4885]: I0308 20:50:00.520151 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550050-xrlpj" Mar 08 20:50:01 crc kubenswrapper[4885]: I0308 20:50:01.025783 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550050-xrlpj"] Mar 08 20:50:01 crc kubenswrapper[4885]: I0308 20:50:01.191136 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550050-xrlpj" event={"ID":"c364e354-f542-45ec-9322-125db18eb928","Type":"ContainerStarted","Data":"bb773abed9327d88ac1406fbd8d50a08b4dad4d5fac82785dc3a6eb9776f0a9f"} Mar 08 20:50:03 crc kubenswrapper[4885]: I0308 20:50:03.210750 4885 generic.go:334] "Generic (PLEG): container finished" podID="c364e354-f542-45ec-9322-125db18eb928" containerID="00728729b1a54767f7bf5ead12746c7b79c9d4ef7c28991788ed27e082f5ef87" exitCode=0 Mar 08 20:50:03 crc kubenswrapper[4885]: I0308 20:50:03.210869 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550050-xrlpj" event={"ID":"c364e354-f542-45ec-9322-125db18eb928","Type":"ContainerDied","Data":"00728729b1a54767f7bf5ead12746c7b79c9d4ef7c28991788ed27e082f5ef87"} Mar 08 20:50:04 crc kubenswrapper[4885]: I0308 20:50:04.626643 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550050-xrlpj" Mar 08 20:50:04 crc kubenswrapper[4885]: I0308 20:50:04.742216 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwmsz\" (UniqueName: \"kubernetes.io/projected/c364e354-f542-45ec-9322-125db18eb928-kube-api-access-wwmsz\") pod \"c364e354-f542-45ec-9322-125db18eb928\" (UID: \"c364e354-f542-45ec-9322-125db18eb928\") " Mar 08 20:50:04 crc kubenswrapper[4885]: I0308 20:50:04.750350 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c364e354-f542-45ec-9322-125db18eb928-kube-api-access-wwmsz" (OuterVolumeSpecName: "kube-api-access-wwmsz") pod "c364e354-f542-45ec-9322-125db18eb928" (UID: "c364e354-f542-45ec-9322-125db18eb928"). InnerVolumeSpecName "kube-api-access-wwmsz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:50:04 crc kubenswrapper[4885]: I0308 20:50:04.844163 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwmsz\" (UniqueName: \"kubernetes.io/projected/c364e354-f542-45ec-9322-125db18eb928-kube-api-access-wwmsz\") on node \"crc\" DevicePath \"\"" Mar 08 20:50:05 crc kubenswrapper[4885]: I0308 20:50:05.239526 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550050-xrlpj" event={"ID":"c364e354-f542-45ec-9322-125db18eb928","Type":"ContainerDied","Data":"bb773abed9327d88ac1406fbd8d50a08b4dad4d5fac82785dc3a6eb9776f0a9f"} Mar 08 20:50:05 crc kubenswrapper[4885]: I0308 20:50:05.239660 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb773abed9327d88ac1406fbd8d50a08b4dad4d5fac82785dc3a6eb9776f0a9f" Mar 08 20:50:05 crc kubenswrapper[4885]: I0308 20:50:05.239652 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550050-xrlpj" Mar 08 20:50:05 crc kubenswrapper[4885]: I0308 20:50:05.717777 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550044-682qm"] Mar 08 20:50:05 crc kubenswrapper[4885]: I0308 20:50:05.727441 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550044-682qm"] Mar 08 20:50:07 crc kubenswrapper[4885]: I0308 20:50:07.382464 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cbaad9b-e652-438c-9b41-f414447382c5" path="/var/lib/kubelet/pods/8cbaad9b-e652-438c-9b41-f414447382c5/volumes" Mar 08 20:50:12 crc kubenswrapper[4885]: I0308 20:50:12.367728 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:50:12 crc kubenswrapper[4885]: E0308 20:50:12.368506 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:50:27 crc kubenswrapper[4885]: I0308 20:50:27.369696 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:50:27 crc kubenswrapper[4885]: E0308 20:50:27.379067 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:50:42 crc kubenswrapper[4885]: I0308 20:50:42.368293 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:50:42 crc kubenswrapper[4885]: E0308 20:50:42.369432 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.357298 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c44667757-cn45b"] Mar 08 20:50:54 crc kubenswrapper[4885]: E0308 20:50:54.358204 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c364e354-f542-45ec-9322-125db18eb928" containerName="oc" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.358222 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c364e354-f542-45ec-9322-125db18eb928" containerName="oc" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.358368 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c364e354-f542-45ec-9322-125db18eb928" containerName="oc" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.359147 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.363642 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.363897 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-jtnpr" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.364168 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.367164 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.367985 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:50:54 crc kubenswrapper[4885]: E0308 20:50:54.368234 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.384703 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c44667757-cn45b"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.410294 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-d6s92"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.411388 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.414086 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.431100 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-d6s92"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.470632 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-d6s92"] Mar 08 20:50:54 crc kubenswrapper[4885]: E0308 20:50:54.471066 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-qk4ff], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" podUID="d7da12c3-52d7-40e9-b7ce-8140c98dc55d" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.492465 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-bphnt"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.493522 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.517354 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-bphnt"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.551872 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-config\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.562701 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc958bb-3dbf-45b1-be56-8fc362a957a5-config\") pod \"dnsmasq-dns-c44667757-cn45b\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.562747 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-config\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.562764 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vv7gt\" (UniqueName: \"kubernetes.io/projected/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-kube-api-access-vv7gt\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.562817 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.562847 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.562890 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk4ff\" (UniqueName: \"kubernetes.io/projected/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-kube-api-access-qk4ff\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.562990 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnx2p\" (UniqueName: \"kubernetes.io/projected/2cc958bb-3dbf-45b1-be56-8fc362a957a5-kube-api-access-qnx2p\") pod \"dnsmasq-dns-c44667757-cn45b\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664390 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-config\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664446 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc958bb-3dbf-45b1-be56-8fc362a957a5-config\") pod \"dnsmasq-dns-c44667757-cn45b\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664466 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-config\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664484 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vv7gt\" (UniqueName: \"kubernetes.io/projected/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-kube-api-access-vv7gt\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664520 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664540 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664569 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk4ff\" (UniqueName: \"kubernetes.io/projected/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-kube-api-access-qk4ff\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.664587 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnx2p\" (UniqueName: \"kubernetes.io/projected/2cc958bb-3dbf-45b1-be56-8fc362a957a5-kube-api-access-qnx2p\") pod \"dnsmasq-dns-c44667757-cn45b\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.665357 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc958bb-3dbf-45b1-be56-8fc362a957a5-config\") pod \"dnsmasq-dns-c44667757-cn45b\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.665381 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-config\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.665571 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-dns-svc\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.665578 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-dns-svc\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.665970 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-config\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.683685 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnx2p\" (UniqueName: \"kubernetes.io/projected/2cc958bb-3dbf-45b1-be56-8fc362a957a5-kube-api-access-qnx2p\") pod \"dnsmasq-dns-c44667757-cn45b\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.703717 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vv7gt\" (UniqueName: \"kubernetes.io/projected/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-kube-api-access-vv7gt\") pod \"dnsmasq-dns-5fb77f9685-bphnt\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.706690 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk4ff\" (UniqueName: \"kubernetes.io/projected/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-kube-api-access-qk4ff\") pod \"dnsmasq-dns-55c76fd6b7-d6s92\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.738412 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-bphnt"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.738890 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.758559 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-6lbrp"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.759718 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.768529 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-config\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.768603 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7z56t\" (UniqueName: \"kubernetes.io/projected/7084d0a7-6526-4b90-93a3-e16b9d374be2-kube-api-access-7z56t\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.768630 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-dns-svc\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.772254 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-6lbrp"] Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.869781 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7z56t\" (UniqueName: \"kubernetes.io/projected/7084d0a7-6526-4b90-93a3-e16b9d374be2-kube-api-access-7z56t\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.869834 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-dns-svc\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.869884 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-config\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.870645 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-config\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.870934 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-dns-svc\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.893511 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7z56t\" (UniqueName: 
\"kubernetes.io/projected/7084d0a7-6526-4b90-93a3-e16b9d374be2-kube-api-access-7z56t\") pod \"dnsmasq-dns-ff89b6977-6lbrp\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:54 crc kubenswrapper[4885]: I0308 20:50:54.974942 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.054749 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.073621 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.131227 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.178208 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk4ff\" (UniqueName: \"kubernetes.io/projected/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-kube-api-access-qk4ff\") pod \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.178261 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-dns-svc\") pod \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.178349 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-config\") pod \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\" (UID: \"d7da12c3-52d7-40e9-b7ce-8140c98dc55d\") " Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.179039 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-config" (OuterVolumeSpecName: "config") pod "d7da12c3-52d7-40e9-b7ce-8140c98dc55d" (UID: "d7da12c3-52d7-40e9-b7ce-8140c98dc55d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.179339 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7da12c3-52d7-40e9-b7ce-8140c98dc55d" (UID: "d7da12c3-52d7-40e9-b7ce-8140c98dc55d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.183096 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-kube-api-access-qk4ff" (OuterVolumeSpecName: "kube-api-access-qk4ff") pod "d7da12c3-52d7-40e9-b7ce-8140c98dc55d" (UID: "d7da12c3-52d7-40e9-b7ce-8140c98dc55d"). InnerVolumeSpecName "kube-api-access-qk4ff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.280694 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-config\") on node \"crc\" DevicePath \"\"" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.280725 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk4ff\" (UniqueName: \"kubernetes.io/projected/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-kube-api-access-qk4ff\") on node \"crc\" DevicePath \"\"" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.280736 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7da12c3-52d7-40e9-b7ce-8140c98dc55d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.301508 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-bphnt"] Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.457101 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c44667757-cn45b"] Mar 08 20:50:55 crc kubenswrapper[4885]: W0308 20:50:55.459131 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cc958bb_3dbf_45b1_be56_8fc362a957a5.slice/crio-b821eaa20bfc8d91b91d7373dbd2e6c29ae06a68f990eb0c0e8cab6e3c9eeacf WatchSource:0}: Error finding container b821eaa20bfc8d91b91d7373dbd2e6c29ae06a68f990eb0c0e8cab6e3c9eeacf: Status 404 returned error can't find the container with id b821eaa20bfc8d91b91d7373dbd2e6c29ae06a68f990eb0c0e8cab6e3c9eeacf Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.605627 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.607224 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.609857 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.610637 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.611014 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.612299 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.612755 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lp8r9" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.618076 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.661976 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-6lbrp"] Mar 08 20:50:55 crc kubenswrapper[4885]: W0308 20:50:55.672364 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7084d0a7_6526_4b90_93a3_e16b9d374be2.slice/crio-cefbe5124552682fd6a4537cd48b7c8e69da3ba23b252f870be87ed5d1dd02f5 WatchSource:0}: Error finding container cefbe5124552682fd6a4537cd48b7c8e69da3ba23b252f870be87ed5d1dd02f5: Status 404 returned error can't find the container with id cefbe5124552682fd6a4537cd48b7c8e69da3ba23b252f870be87ed5d1dd02f5 Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785052 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785100 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785129 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785159 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785367 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785440 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785554 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785591 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.785611 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2lfc\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-kube-api-access-l2lfc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.858838 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.859940 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.862907 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.862944 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.863045 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.863134 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.863986 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qk9sg" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.881875 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886557 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886648 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886687 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886741 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886768 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886794 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2lfc\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-kube-api-access-l2lfc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886837 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886881 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.886945 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.887744 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.887992 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.888183 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.888500 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.905297 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.905510 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.905973 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.910325 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.910363 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5c3a41ce06382fbd79ab5375af77b75afb5583c9d1133ff7b974e34c8338b5b2/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.913473 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2lfc\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-kube-api-access-l2lfc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.965420 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987755 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-server-conf\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987806 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987834 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987849 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17485fbe-1c6a-4d1b-91b3-c465215cb4be-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987892 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: 
\"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987912 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987957 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17485fbe-1c6a-4d1b-91b3-c465215cb4be-pod-info\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987979 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:55 crc kubenswrapper[4885]: I0308 20:50:55.987997 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4w7l\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-kube-api-access-f4w7l\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.061009 4885 generic.go:334] "Generic (PLEG): container finished" podID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerID="4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd" exitCode=0 Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.061081 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" event={"ID":"7084d0a7-6526-4b90-93a3-e16b9d374be2","Type":"ContainerDied","Data":"4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd"} Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.061108 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" event={"ID":"7084d0a7-6526-4b90-93a3-e16b9d374be2","Type":"ContainerStarted","Data":"cefbe5124552682fd6a4537cd48b7c8e69da3ba23b252f870be87ed5d1dd02f5"} Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.062505 4885 generic.go:334] "Generic (PLEG): container finished" podID="e057e697-8dde-4c0b-80f0-7c3a81c8bca0" containerID="3bd36cc2a7b6689ef1af665b0a95fa5d004c8cabcb49857d2fb7d819c7770e99" exitCode=0 Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.062577 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" event={"ID":"e057e697-8dde-4c0b-80f0-7c3a81c8bca0","Type":"ContainerDied","Data":"3bd36cc2a7b6689ef1af665b0a95fa5d004c8cabcb49857d2fb7d819c7770e99"} Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.062602 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" event={"ID":"e057e697-8dde-4c0b-80f0-7c3a81c8bca0","Type":"ContainerStarted","Data":"a27c7d52fd6267f66531fa416c4e6953885eaf21e7fea63ba5bad10bb9c6a4e5"} Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.067045 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerID="03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7" exitCode=0 Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.067129 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55c76fd6b7-d6s92" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.068189 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-cn45b" event={"ID":"2cc958bb-3dbf-45b1-be56-8fc362a957a5","Type":"ContainerDied","Data":"03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7"} Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.068220 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-cn45b" event={"ID":"2cc958bb-3dbf-45b1-be56-8fc362a957a5","Type":"ContainerStarted","Data":"b821eaa20bfc8d91b91d7373dbd2e6c29ae06a68f990eb0c0e8cab6e3c9eeacf"} Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089394 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089434 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089451 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17485fbe-1c6a-4d1b-91b3-c465215cb4be-pod-info\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089478 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089497 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4w7l\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-kube-api-access-f4w7l\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089551 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-server-conf\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089575 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " 
pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089596 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089612 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17485fbe-1c6a-4d1b-91b3-c465215cb4be-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.089869 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.090322 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.090709 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.091040 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-server-conf\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.093528 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.097781 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17485fbe-1c6a-4d1b-91b3-c465215cb4be-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.097874 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.097940 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4c9d1ef4ae2d748771a4a21e0f3a78df8503e1290141efbfe3cef4df6cc2ca2a/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.103525 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17485fbe-1c6a-4d1b-91b3-c465215cb4be-pod-info\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.109557 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4w7l\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-kube-api-access-f4w7l\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.129234 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-d6s92"] Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.174601 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55c76fd6b7-d6s92"] Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.179800 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.227290 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.373445 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.473860 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.507113 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vv7gt\" (UniqueName: \"kubernetes.io/projected/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-kube-api-access-vv7gt\") pod \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.507982 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-config\") pod \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.508079 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-dns-svc\") pod \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\" (UID: \"e057e697-8dde-4c0b-80f0-7c3a81c8bca0\") " Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.512595 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-kube-api-access-vv7gt" (OuterVolumeSpecName: "kube-api-access-vv7gt") pod "e057e697-8dde-4c0b-80f0-7c3a81c8bca0" (UID: "e057e697-8dde-4c0b-80f0-7c3a81c8bca0"). InnerVolumeSpecName "kube-api-access-vv7gt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.524261 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e057e697-8dde-4c0b-80f0-7c3a81c8bca0" (UID: "e057e697-8dde-4c0b-80f0-7c3a81c8bca0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.529586 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-config" (OuterVolumeSpecName: "config") pod "e057e697-8dde-4c0b-80f0-7c3a81c8bca0" (UID: "e057e697-8dde-4c0b-80f0-7c3a81c8bca0"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.610574 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-config\") on node \"crc\" DevicePath \"\"" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.610899 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.610962 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vv7gt\" (UniqueName: \"kubernetes.io/projected/e057e697-8dde-4c0b-80f0-7c3a81c8bca0-kube-api-access-vv7gt\") on node \"crc\" DevicePath \"\"" Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.637177 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:50:56 crc kubenswrapper[4885]: W0308 20:50:56.642651 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a3c4c5d_5d8f_4011_9ab6_94c44dfc2872.slice/crio-e0698a4264545dc8b5ff57f24a36650153833a6d5b158c71c19b171366b0d5c7 WatchSource:0}: Error finding container e0698a4264545dc8b5ff57f24a36650153833a6d5b158c71c19b171366b0d5c7: Status 404 returned error can't find the container with id e0698a4264545dc8b5ff57f24a36650153833a6d5b158c71c19b171366b0d5c7 Mar 08 20:50:56 crc kubenswrapper[4885]: I0308 20:50:56.932132 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.016394 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 08 20:50:57 crc kubenswrapper[4885]: E0308 20:50:57.017275 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e057e697-8dde-4c0b-80f0-7c3a81c8bca0" containerName="init" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.017306 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e057e697-8dde-4c0b-80f0-7c3a81c8bca0" containerName="init" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.017692 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e057e697-8dde-4c0b-80f0-7c3a81c8bca0" containerName="init" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.019169 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.023173 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.023721 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-4vqmw" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.024455 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.024192 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.031787 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.031841 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.075184 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-cn45b" event={"ID":"2cc958bb-3dbf-45b1-be56-8fc362a957a5","Type":"ContainerStarted","Data":"36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1"} Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.076220 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.077570 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872","Type":"ContainerStarted","Data":"e0698a4264545dc8b5ff57f24a36650153833a6d5b158c71c19b171366b0d5c7"} Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.080063 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" event={"ID":"7084d0a7-6526-4b90-93a3-e16b9d374be2","Type":"ContainerStarted","Data":"7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899"} Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.080223 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.081417 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" event={"ID":"e057e697-8dde-4c0b-80f0-7c3a81c8bca0","Type":"ContainerDied","Data":"a27c7d52fd6267f66531fa416c4e6953885eaf21e7fea63ba5bad10bb9c6a4e5"} Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.081464 4885 scope.go:117] "RemoveContainer" containerID="3bd36cc2a7b6689ef1af665b0a95fa5d004c8cabcb49857d2fb7d819c7770e99" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.082392 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fb77f9685-bphnt" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.082838 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17485fbe-1c6a-4d1b-91b3-c465215cb4be","Type":"ContainerStarted","Data":"348bdcbe9fe6c9c9545000fb7ce8f20f391b324558bd8ce452114903f6c00551"} Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.095261 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c44667757-cn45b" podStartSLOduration=3.095245305 podStartE2EDuration="3.095245305s" podCreationTimestamp="2026-03-08 20:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:50:57.087771426 +0000 UTC m=+4758.483825459" watchObservedRunningTime="2026-03-08 20:50:57.095245305 +0000 UTC m=+4758.491299338" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.112628 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" podStartSLOduration=3.112610759 podStartE2EDuration="3.112610759s" podCreationTimestamp="2026-03-08 20:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:50:57.105933231 +0000 UTC m=+4758.501987254" watchObservedRunningTime="2026-03-08 20:50:57.112610759 +0000 UTC m=+4758.508664792" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.119609 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-config-data-default\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.119644 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcbde6c-f104-4c3b-9937-24728ac572a8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.119685 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcbde6c-f104-4c3b-9937-24728ac572a8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.119707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-kolla-config\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.119727 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 
20:50:57.119750 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.119782 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkvvj\" (UniqueName: \"kubernetes.io/projected/1fcbde6c-f104-4c3b-9937-24728ac572a8-kube-api-access-jkvvj\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.119828 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1fcbde6c-f104-4c3b-9937-24728ac572a8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.150673 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-bphnt"] Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.158567 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fb77f9685-bphnt"] Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221186 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkvvj\" (UniqueName: \"kubernetes.io/projected/1fcbde6c-f104-4c3b-9937-24728ac572a8-kube-api-access-jkvvj\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221284 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1fcbde6c-f104-4c3b-9937-24728ac572a8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221329 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-config-data-default\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221345 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcbde6c-f104-4c3b-9937-24728ac572a8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221393 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcbde6c-f104-4c3b-9937-24728ac572a8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221424 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-kolla-config\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221445 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.221464 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.222815 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.222991 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-config-data-default\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.223232 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1fcbde6c-f104-4c3b-9937-24728ac572a8-kolla-config\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.223443 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1fcbde6c-f104-4c3b-9937-24728ac572a8-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.226485 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fcbde6c-f104-4c3b-9937-24728ac572a8-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.228903 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fcbde6c-f104-4c3b-9937-24728ac572a8-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.234427 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.234474 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/91cdede2446772955a07df46284552052b9252ce73facbf1f7a38d7c6d0d6763/globalmount\"" pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.239532 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkvvj\" (UniqueName: \"kubernetes.io/projected/1fcbde6c-f104-4c3b-9937-24728ac572a8-kube-api-access-jkvvj\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.264824 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6e5c5431-97bc-4d75-8e43-f4d9bb29d52f\") pod \"openstack-galera-0\" (UID: \"1fcbde6c-f104-4c3b-9937-24728ac572a8\") " pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.359014 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.379286 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7da12c3-52d7-40e9-b7ce-8140c98dc55d" path="/var/lib/kubelet/pods/d7da12c3-52d7-40e9-b7ce-8140c98dc55d/volumes" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.379743 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e057e697-8dde-4c0b-80f0-7c3a81c8bca0" path="/var/lib/kubelet/pods/e057e697-8dde-4c0b-80f0-7c3a81c8bca0/volumes" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.381831 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.382875 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.385489 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.385751 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-qv44v" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.391653 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.526103 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-config-data\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.526551 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-kolla-config\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.526635 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84zqh\" (UniqueName: \"kubernetes.io/projected/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-kube-api-access-84zqh\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.628104 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-config-data\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.628183 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-kolla-config\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.628227 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84zqh\" (UniqueName: \"kubernetes.io/projected/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-kube-api-access-84zqh\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.629102 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-config-data\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.632496 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-kolla-config\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.660099 4885 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-84zqh\" (UniqueName: \"kubernetes.io/projected/c0cb204d-b6bd-417e-9b6f-6a0c7faf4820-kube-api-access-84zqh\") pod \"memcached-0\" (UID: \"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820\") " pod="openstack/memcached-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.748951 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 08 20:50:57 crc kubenswrapper[4885]: I0308 20:50:57.870365 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.059158 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.119138 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820","Type":"ContainerStarted","Data":"771327ad1f03a668ab2f0f78172055e3b00ae9001ebc750840fa50e53e070890"} Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.142562 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17485fbe-1c6a-4d1b-91b3-c465215cb4be","Type":"ContainerStarted","Data":"665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea"} Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.144189 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872","Type":"ContainerStarted","Data":"7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873"} Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.146143 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1fcbde6c-f104-4c3b-9937-24728ac572a8","Type":"ContainerStarted","Data":"6f7078c8a1a13669613cf0c66f3e7c89c5596c34cb5f4ac3c549d6634c97d9ef"} Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.544402 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.546440 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.549006 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.549614 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.550543 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.551975 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-mrvsj" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.569962 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.650528 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6764t\" (UniqueName: \"kubernetes.io/projected/31392f16-4aaa-4512-982e-0c56d9af8200-kube-api-access-6764t\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.651102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/31392f16-4aaa-4512-982e-0c56d9af8200-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.651190 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31392f16-4aaa-4512-982e-0c56d9af8200-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.651306 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.651402 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.651490 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/31392f16-4aaa-4512-982e-0c56d9af8200-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.651598 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.651806 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753233 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/31392f16-4aaa-4512-982e-0c56d9af8200-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753353 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753421 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753480 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6764t\" (UniqueName: \"kubernetes.io/projected/31392f16-4aaa-4512-982e-0c56d9af8200-kube-api-access-6764t\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753534 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/31392f16-4aaa-4512-982e-0c56d9af8200-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753570 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31392f16-4aaa-4512-982e-0c56d9af8200-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753625 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.753680 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.755899 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/31392f16-4aaa-4512-982e-0c56d9af8200-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.757072 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.757195 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.758695 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/31392f16-4aaa-4512-982e-0c56d9af8200-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.761033 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/31392f16-4aaa-4512-982e-0c56d9af8200-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.761154 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.761201 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c4b9c1ff5d2434b7037bc43fb21fb2c8a0d4afe3004f6980f97a489a205f8d37/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.763329 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31392f16-4aaa-4512-982e-0c56d9af8200-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.779993 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6764t\" (UniqueName: \"kubernetes.io/projected/31392f16-4aaa-4512-982e-0c56d9af8200-kube-api-access-6764t\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.808093 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-06038e74-c0b1-42b8-b6d7-f4cea12298af\") pod \"openstack-cell1-galera-0\" (UID: \"31392f16-4aaa-4512-982e-0c56d9af8200\") " pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:58 crc kubenswrapper[4885]: I0308 20:50:58.910338 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 08 20:50:59 crc kubenswrapper[4885]: I0308 20:50:59.158761 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c0cb204d-b6bd-417e-9b6f-6a0c7faf4820","Type":"ContainerStarted","Data":"1093ea04c47ee91a81dbc795ff87234498e2aa142aba1fb2808cf80f06a787cc"} Mar 08 20:50:59 crc kubenswrapper[4885]: I0308 20:50:59.158881 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 08 20:50:59 crc kubenswrapper[4885]: I0308 20:50:59.162517 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1fcbde6c-f104-4c3b-9937-24728ac572a8","Type":"ContainerStarted","Data":"1699306d6aff7e9d600087067f63f83aca2a81cb9ff13ee55984e4de5f1d006d"} Mar 08 20:50:59 crc kubenswrapper[4885]: I0308 20:50:59.201617 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.201593352 podStartE2EDuration="2.201593352s" podCreationTimestamp="2026-03-08 20:50:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:50:59.195421946 +0000 UTC m=+4760.591476039" watchObservedRunningTime="2026-03-08 20:50:59.201593352 +0000 UTC m=+4760.597647375" Mar 08 20:50:59 crc kubenswrapper[4885]: I0308 20:50:59.209749 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 08 20:50:59 crc kubenswrapper[4885]: W0308 20:50:59.211433 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31392f16_4aaa_4512_982e_0c56d9af8200.slice/crio-1b6a951b8ae66a7c98c9902d88b4d2af73a7740f745e62a65b5926f3df719ba1 WatchSource:0}: Error finding container 1b6a951b8ae66a7c98c9902d88b4d2af73a7740f745e62a65b5926f3df719ba1: Status 404 returned error can't find the container with id 1b6a951b8ae66a7c98c9902d88b4d2af73a7740f745e62a65b5926f3df719ba1 Mar 08 20:51:00 crc kubenswrapper[4885]: I0308 20:51:00.170023 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31392f16-4aaa-4512-982e-0c56d9af8200","Type":"ContainerStarted","Data":"3c59b827ce865260b9caea2ee52a7477867f098fcd784611a33e76a585e139ae"} Mar 08 20:51:00 crc kubenswrapper[4885]: I0308 20:51:00.171498 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31392f16-4aaa-4512-982e-0c56d9af8200","Type":"ContainerStarted","Data":"1b6a951b8ae66a7c98c9902d88b4d2af73a7740f745e62a65b5926f3df719ba1"} Mar 08 20:51:02 crc kubenswrapper[4885]: I0308 20:51:02.195547 4885 generic.go:334] "Generic (PLEG): container finished" podID="1fcbde6c-f104-4c3b-9937-24728ac572a8" containerID="1699306d6aff7e9d600087067f63f83aca2a81cb9ff13ee55984e4de5f1d006d" exitCode=0 Mar 08 20:51:02 crc kubenswrapper[4885]: I0308 20:51:02.195676 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1fcbde6c-f104-4c3b-9937-24728ac572a8","Type":"ContainerDied","Data":"1699306d6aff7e9d600087067f63f83aca2a81cb9ff13ee55984e4de5f1d006d"} Mar 08 20:51:03 crc kubenswrapper[4885]: I0308 20:51:03.205318 4885 generic.go:334] "Generic (PLEG): container finished" podID="31392f16-4aaa-4512-982e-0c56d9af8200" containerID="3c59b827ce865260b9caea2ee52a7477867f098fcd784611a33e76a585e139ae" exitCode=0 Mar 08 20:51:03 crc 
kubenswrapper[4885]: I0308 20:51:03.205427 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31392f16-4aaa-4512-982e-0c56d9af8200","Type":"ContainerDied","Data":"3c59b827ce865260b9caea2ee52a7477867f098fcd784611a33e76a585e139ae"} Mar 08 20:51:03 crc kubenswrapper[4885]: I0308 20:51:03.209244 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1fcbde6c-f104-4c3b-9937-24728ac572a8","Type":"ContainerStarted","Data":"a63c0dad7bb6aa0ed2276ddf85ccb822aed5a56c6e68d2083912752d804a9ed3"} Mar 08 20:51:03 crc kubenswrapper[4885]: I0308 20:51:03.261464 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.261445067 podStartE2EDuration="8.261445067s" podCreationTimestamp="2026-03-08 20:50:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:51:03.257193543 +0000 UTC m=+4764.653247596" watchObservedRunningTime="2026-03-08 20:51:03.261445067 +0000 UTC m=+4764.657499100" Mar 08 20:51:03 crc kubenswrapper[4885]: I0308 20:51:03.597067 4885 scope.go:117] "RemoveContainer" containerID="fc9b989616ec5b500765230953c308a829faaac795ceed2b8cacd49b9b6ec121" Mar 08 20:51:04 crc kubenswrapper[4885]: I0308 20:51:04.222259 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"31392f16-4aaa-4512-982e-0c56d9af8200","Type":"ContainerStarted","Data":"e9c3869c867e654ad6db645ee2d2b003762a2f82a7f211fa0201f8897d528392"} Mar 08 20:51:04 crc kubenswrapper[4885]: I0308 20:51:04.256287 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.256269864 podStartE2EDuration="7.256269864s" podCreationTimestamp="2026-03-08 20:50:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:51:04.249077843 +0000 UTC m=+4765.645131866" watchObservedRunningTime="2026-03-08 20:51:04.256269864 +0000 UTC m=+4765.652323887" Mar 08 20:51:04 crc kubenswrapper[4885]: I0308 20:51:04.977405 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.134394 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.204278 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-cn45b"] Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.235068 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c44667757-cn45b" podUID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerName="dnsmasq-dns" containerID="cri-o://36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1" gracePeriod=10 Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.644373 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.770966 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnx2p\" (UniqueName: \"kubernetes.io/projected/2cc958bb-3dbf-45b1-be56-8fc362a957a5-kube-api-access-qnx2p\") pod \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.771161 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc958bb-3dbf-45b1-be56-8fc362a957a5-config\") pod \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\" (UID: \"2cc958bb-3dbf-45b1-be56-8fc362a957a5\") " Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.778131 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cc958bb-3dbf-45b1-be56-8fc362a957a5-kube-api-access-qnx2p" (OuterVolumeSpecName: "kube-api-access-qnx2p") pod "2cc958bb-3dbf-45b1-be56-8fc362a957a5" (UID: "2cc958bb-3dbf-45b1-be56-8fc362a957a5"). InnerVolumeSpecName "kube-api-access-qnx2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.812270 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2cc958bb-3dbf-45b1-be56-8fc362a957a5-config" (OuterVolumeSpecName: "config") pod "2cc958bb-3dbf-45b1-be56-8fc362a957a5" (UID: "2cc958bb-3dbf-45b1-be56-8fc362a957a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.873112 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnx2p\" (UniqueName: \"kubernetes.io/projected/2cc958bb-3dbf-45b1-be56-8fc362a957a5-kube-api-access-qnx2p\") on node \"crc\" DevicePath \"\"" Mar 08 20:51:05 crc kubenswrapper[4885]: I0308 20:51:05.873151 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2cc958bb-3dbf-45b1-be56-8fc362a957a5-config\") on node \"crc\" DevicePath \"\"" Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.248711 4885 generic.go:334] "Generic (PLEG): container finished" podID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerID="36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1" exitCode=0 Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.248823 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c44667757-cn45b" Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.248815 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-cn45b" event={"ID":"2cc958bb-3dbf-45b1-be56-8fc362a957a5","Type":"ContainerDied","Data":"36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1"} Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.250539 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c44667757-cn45b" event={"ID":"2cc958bb-3dbf-45b1-be56-8fc362a957a5","Type":"ContainerDied","Data":"b821eaa20bfc8d91b91d7373dbd2e6c29ae06a68f990eb0c0e8cab6e3c9eeacf"} Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.250581 4885 scope.go:117] "RemoveContainer" containerID="36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1" Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.284187 4885 scope.go:117] "RemoveContainer" containerID="03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7" Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.321413 4885 scope.go:117] "RemoveContainer" containerID="36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1" Mar 08 20:51:06 crc kubenswrapper[4885]: E0308 20:51:06.322084 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1\": container with ID starting with 36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1 not found: ID does not exist" containerID="36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1" Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.322139 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1"} err="failed to get container status \"36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1\": rpc error: code = NotFound desc = could not find container \"36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1\": container with ID starting with 36a0e03dca8fad88e20a70e88bbe233db9cc466fd79ff15c983216a41e9b79d1 not found: ID does not exist" Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.322176 4885 scope.go:117] "RemoveContainer" containerID="03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7" Mar 08 20:51:06 crc kubenswrapper[4885]: E0308 20:51:06.322662 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7\": container with ID starting with 03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7 not found: ID does not exist" containerID="03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7" Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.322727 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7"} err="failed to get container status \"03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7\": rpc error: code = NotFound desc = could not find container \"03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7\": container with ID starting with 03a55d9c81ca0e72e850952e2ad86c0b2dcdfa31fd9f27e05210539413a518f7 not found: ID does not exist" Mar 08 20:51:06 crc 
kubenswrapper[4885]: I0308 20:51:06.325662 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c44667757-cn45b"] Mar 08 20:51:06 crc kubenswrapper[4885]: I0308 20:51:06.334201 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c44667757-cn45b"] Mar 08 20:51:07 crc kubenswrapper[4885]: I0308 20:51:07.360049 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 08 20:51:07 crc kubenswrapper[4885]: I0308 20:51:07.360336 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 08 20:51:07 crc kubenswrapper[4885]: I0308 20:51:07.368731 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:51:07 crc kubenswrapper[4885]: E0308 20:51:07.369019 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:51:07 crc kubenswrapper[4885]: I0308 20:51:07.380842 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" path="/var/lib/kubelet/pods/2cc958bb-3dbf-45b1-be56-8fc362a957a5/volumes" Mar 08 20:51:07 crc kubenswrapper[4885]: I0308 20:51:07.452209 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 08 20:51:07 crc kubenswrapper[4885]: I0308 20:51:07.749750 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 08 20:51:08 crc kubenswrapper[4885]: I0308 20:51:08.492636 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 08 20:51:08 crc kubenswrapper[4885]: I0308 20:51:08.912391 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 08 20:51:08 crc kubenswrapper[4885]: I0308 20:51:08.912442 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 08 20:51:11 crc kubenswrapper[4885]: I0308 20:51:11.393695 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 08 20:51:11 crc kubenswrapper[4885]: I0308 20:51:11.521184 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.039026 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-p2l7p"] Mar 08 20:51:16 crc kubenswrapper[4885]: E0308 20:51:16.040613 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerName="dnsmasq-dns" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.040644 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerName="dnsmasq-dns" Mar 08 20:51:16 crc kubenswrapper[4885]: E0308 20:51:16.040804 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerName="init" Mar 08 20:51:16 crc 
kubenswrapper[4885]: I0308 20:51:16.040817 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerName="init" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.041378 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cc958bb-3dbf-45b1-be56-8fc362a957a5" containerName="dnsmasq-dns" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.042319 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.051165 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.057360 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-p2l7p"] Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.157962 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bf35f28-81c5-4f7c-9048-212d8899b7e7-operator-scripts\") pod \"root-account-create-update-p2l7p\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") " pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.158099 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gkv6\" (UniqueName: \"kubernetes.io/projected/7bf35f28-81c5-4f7c-9048-212d8899b7e7-kube-api-access-2gkv6\") pod \"root-account-create-update-p2l7p\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") " pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.259698 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gkv6\" (UniqueName: \"kubernetes.io/projected/7bf35f28-81c5-4f7c-9048-212d8899b7e7-kube-api-access-2gkv6\") pod \"root-account-create-update-p2l7p\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") " pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.259895 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bf35f28-81c5-4f7c-9048-212d8899b7e7-operator-scripts\") pod \"root-account-create-update-p2l7p\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") " pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.261304 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bf35f28-81c5-4f7c-9048-212d8899b7e7-operator-scripts\") pod \"root-account-create-update-p2l7p\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") " pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.300858 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gkv6\" (UniqueName: \"kubernetes.io/projected/7bf35f28-81c5-4f7c-9048-212d8899b7e7-kube-api-access-2gkv6\") pod \"root-account-create-update-p2l7p\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") " pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.390739 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:16 crc kubenswrapper[4885]: I0308 20:51:16.805966 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-p2l7p"] Mar 08 20:51:17 crc kubenswrapper[4885]: I0308 20:51:17.345533 4885 generic.go:334] "Generic (PLEG): container finished" podID="7bf35f28-81c5-4f7c-9048-212d8899b7e7" containerID="320469e7a132de2b538abe239f05e1a393daef2ccce9780de0687281e071e2ef" exitCode=0 Mar 08 20:51:17 crc kubenswrapper[4885]: I0308 20:51:17.345583 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p2l7p" event={"ID":"7bf35f28-81c5-4f7c-9048-212d8899b7e7","Type":"ContainerDied","Data":"320469e7a132de2b538abe239f05e1a393daef2ccce9780de0687281e071e2ef"} Mar 08 20:51:17 crc kubenswrapper[4885]: I0308 20:51:17.345613 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p2l7p" event={"ID":"7bf35f28-81c5-4f7c-9048-212d8899b7e7","Type":"ContainerStarted","Data":"2d52f4284b74e3b521ad5c49794d9a3b1363e683e4f40a3c0854ee8735cca9ee"} Mar 08 20:51:18 crc kubenswrapper[4885]: I0308 20:51:18.750915 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:18 crc kubenswrapper[4885]: I0308 20:51:18.802268 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gkv6\" (UniqueName: \"kubernetes.io/projected/7bf35f28-81c5-4f7c-9048-212d8899b7e7-kube-api-access-2gkv6\") pod \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") " Mar 08 20:51:18 crc kubenswrapper[4885]: I0308 20:51:18.802388 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bf35f28-81c5-4f7c-9048-212d8899b7e7-operator-scripts\") pod \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\" (UID: \"7bf35f28-81c5-4f7c-9048-212d8899b7e7\") " Mar 08 20:51:18 crc kubenswrapper[4885]: I0308 20:51:18.803376 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bf35f28-81c5-4f7c-9048-212d8899b7e7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bf35f28-81c5-4f7c-9048-212d8899b7e7" (UID: "7bf35f28-81c5-4f7c-9048-212d8899b7e7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:51:18 crc kubenswrapper[4885]: I0308 20:51:18.812890 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bf35f28-81c5-4f7c-9048-212d8899b7e7-kube-api-access-2gkv6" (OuterVolumeSpecName: "kube-api-access-2gkv6") pod "7bf35f28-81c5-4f7c-9048-212d8899b7e7" (UID: "7bf35f28-81c5-4f7c-9048-212d8899b7e7"). InnerVolumeSpecName "kube-api-access-2gkv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:51:18 crc kubenswrapper[4885]: I0308 20:51:18.904067 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gkv6\" (UniqueName: \"kubernetes.io/projected/7bf35f28-81c5-4f7c-9048-212d8899b7e7-kube-api-access-2gkv6\") on node \"crc\" DevicePath \"\"" Mar 08 20:51:18 crc kubenswrapper[4885]: I0308 20:51:18.904119 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bf35f28-81c5-4f7c-9048-212d8899b7e7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 20:51:19 crc kubenswrapper[4885]: I0308 20:51:19.367008 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-p2l7p" Mar 08 20:51:19 crc kubenswrapper[4885]: I0308 20:51:19.375321 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:51:19 crc kubenswrapper[4885]: E0308 20:51:19.375742 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:51:19 crc kubenswrapper[4885]: I0308 20:51:19.384204 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-p2l7p" event={"ID":"7bf35f28-81c5-4f7c-9048-212d8899b7e7","Type":"ContainerDied","Data":"2d52f4284b74e3b521ad5c49794d9a3b1363e683e4f40a3c0854ee8735cca9ee"} Mar 08 20:51:19 crc kubenswrapper[4885]: I0308 20:51:19.384254 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d52f4284b74e3b521ad5c49794d9a3b1363e683e4f40a3c0854ee8735cca9ee" Mar 08 20:51:22 crc kubenswrapper[4885]: I0308 20:51:22.559390 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-p2l7p"] Mar 08 20:51:22 crc kubenswrapper[4885]: I0308 20:51:22.570025 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-p2l7p"] Mar 08 20:51:23 crc kubenswrapper[4885]: I0308 20:51:23.383226 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bf35f28-81c5-4f7c-9048-212d8899b7e7" path="/var/lib/kubelet/pods/7bf35f28-81c5-4f7c-9048-212d8899b7e7/volumes" Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.564434 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-k2nvt"] Mar 08 20:51:27 crc kubenswrapper[4885]: E0308 20:51:27.565152 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bf35f28-81c5-4f7c-9048-212d8899b7e7" containerName="mariadb-account-create-update" Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.565169 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf35f28-81c5-4f7c-9048-212d8899b7e7" containerName="mariadb-account-create-update" Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.565350 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bf35f28-81c5-4f7c-9048-212d8899b7e7" containerName="mariadb-account-create-update" Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.565984 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-k2nvt" Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.568320 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.590646 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k2nvt"] Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.740486 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-operator-scripts\") pod \"root-account-create-update-k2nvt\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") " pod="openstack/root-account-create-update-k2nvt" Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.740727 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-542x2\" (UniqueName: \"kubernetes.io/projected/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-kube-api-access-542x2\") pod \"root-account-create-update-k2nvt\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") " pod="openstack/root-account-create-update-k2nvt" Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.842555 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-operator-scripts\") pod \"root-account-create-update-k2nvt\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") " pod="openstack/root-account-create-update-k2nvt" Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.842708 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-542x2\" (UniqueName: \"kubernetes.io/projected/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-kube-api-access-542x2\") pod \"root-account-create-update-k2nvt\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") " pod="openstack/root-account-create-update-k2nvt" Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.844113 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-operator-scripts\") pod \"root-account-create-update-k2nvt\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") " pod="openstack/root-account-create-update-k2nvt" Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.878629 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-542x2\" (UniqueName: \"kubernetes.io/projected/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-kube-api-access-542x2\") pod \"root-account-create-update-k2nvt\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") " pod="openstack/root-account-create-update-k2nvt" Mar 08 20:51:27 crc kubenswrapper[4885]: I0308 20:51:27.897499 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-k2nvt" Mar 08 20:51:28 crc kubenswrapper[4885]: I0308 20:51:28.451366 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-k2nvt"] Mar 08 20:51:29 crc kubenswrapper[4885]: I0308 20:51:29.474416 4885 generic.go:334] "Generic (PLEG): container finished" podID="1dd375ae-21ae-4fb9-87dc-f6a1a205736f" containerID="d3ae625b11e0cf7052090345483fbacc9c9a2ab6adc1e7e832c166efaabc3867" exitCode=0 Mar 08 20:51:29 crc kubenswrapper[4885]: I0308 20:51:29.474501 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k2nvt" event={"ID":"1dd375ae-21ae-4fb9-87dc-f6a1a205736f","Type":"ContainerDied","Data":"d3ae625b11e0cf7052090345483fbacc9c9a2ab6adc1e7e832c166efaabc3867"} Mar 08 20:51:29 crc kubenswrapper[4885]: I0308 20:51:29.474829 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k2nvt" event={"ID":"1dd375ae-21ae-4fb9-87dc-f6a1a205736f","Type":"ContainerStarted","Data":"fb3e59b65f875de48f7b6500d6e768415adf73a1239015878abe1e2c78a32210"} Mar 08 20:51:30 crc kubenswrapper[4885]: E0308 20:51:30.022471 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17485fbe_1c6a_4d1b_91b3_c465215cb4be.slice/crio-conmon-665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17485fbe_1c6a_4d1b_91b3_c465215cb4be.slice/crio-665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea.scope\": RecentStats: unable to find data in memory cache]" Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.368485 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:51:30 crc kubenswrapper[4885]: E0308 20:51:30.368852 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.489707 4885 generic.go:334] "Generic (PLEG): container finished" podID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerID="665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea" exitCode=0 Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.489798 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17485fbe-1c6a-4d1b-91b3-c465215cb4be","Type":"ContainerDied","Data":"665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea"} Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.492504 4885 generic.go:334] "Generic (PLEG): container finished" podID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerID="7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873" exitCode=0 Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.492619 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872","Type":"ContainerDied","Data":"7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873"} Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.857708 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-k2nvt" Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.998546 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-operator-scripts\") pod \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") " Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.998613 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-542x2\" (UniqueName: \"kubernetes.io/projected/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-kube-api-access-542x2\") pod \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\" (UID: \"1dd375ae-21ae-4fb9-87dc-f6a1a205736f\") " Mar 08 20:51:30 crc kubenswrapper[4885]: I0308 20:51:30.999144 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1dd375ae-21ae-4fb9-87dc-f6a1a205736f" (UID: "1dd375ae-21ae-4fb9-87dc-f6a1a205736f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.002722 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-kube-api-access-542x2" (OuterVolumeSpecName: "kube-api-access-542x2") pod "1dd375ae-21ae-4fb9-87dc-f6a1a205736f" (UID: "1dd375ae-21ae-4fb9-87dc-f6a1a205736f"). InnerVolumeSpecName "kube-api-access-542x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.099835 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.099865 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-542x2\" (UniqueName: \"kubernetes.io/projected/1dd375ae-21ae-4fb9-87dc-f6a1a205736f-kube-api-access-542x2\") on node \"crc\" DevicePath \"\"" Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.504125 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-k2nvt" event={"ID":"1dd375ae-21ae-4fb9-87dc-f6a1a205736f","Type":"ContainerDied","Data":"fb3e59b65f875de48f7b6500d6e768415adf73a1239015878abe1e2c78a32210"} Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.504169 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-k2nvt" Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.504187 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb3e59b65f875de48f7b6500d6e768415adf73a1239015878abe1e2c78a32210" Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.507576 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17485fbe-1c6a-4d1b-91b3-c465215cb4be","Type":"ContainerStarted","Data":"075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c"} Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.508722 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.513395 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872","Type":"ContainerStarted","Data":"70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5"} Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.513985 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.538566 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.538545444 podStartE2EDuration="37.538545444s" podCreationTimestamp="2026-03-08 20:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:51:31.53350196 +0000 UTC m=+4792.929555993" watchObservedRunningTime="2026-03-08 20:51:31.538545444 +0000 UTC m=+4792.934599467" Mar 08 20:51:31 crc kubenswrapper[4885]: I0308 20:51:31.560768 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.560750867 podStartE2EDuration="37.560750867s" podCreationTimestamp="2026-03-08 20:50:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:51:31.554413397 +0000 UTC m=+4792.950467460" watchObservedRunningTime="2026-03-08 20:51:31.560750867 +0000 UTC m=+4792.956804890" Mar 08 20:51:42 crc kubenswrapper[4885]: I0308 20:51:42.368581 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:51:42 crc kubenswrapper[4885]: E0308 20:51:42.369498 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:51:46 crc kubenswrapper[4885]: I0308 20:51:46.232205 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:51:46 crc kubenswrapper[4885]: I0308 20:51:46.478269 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.358993 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-z8kjz"] 
Mar 08 20:51:51 crc kubenswrapper[4885]: E0308 20:51:51.359735 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd375ae-21ae-4fb9-87dc-f6a1a205736f" containerName="mariadb-account-create-update" Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.359752 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd375ae-21ae-4fb9-87dc-f6a1a205736f" containerName="mariadb-account-create-update" Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.359902 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd375ae-21ae-4fb9-87dc-f6a1a205736f" containerName="mariadb-account-create-update" Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.360600 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.382252 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-z8kjz"] Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.488121 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vx9b\" (UniqueName: \"kubernetes.io/projected/8e1559f2-4966-4752-8c07-aea40781bbd3-kube-api-access-9vx9b\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.488210 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.488716 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-config\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.590187 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.591135 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-dns-svc\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.591393 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-config\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.592125 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-config\") pod 
\"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.592198 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vx9b\" (UniqueName: \"kubernetes.io/projected/8e1559f2-4966-4752-8c07-aea40781bbd3-kube-api-access-9vx9b\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.631824 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vx9b\" (UniqueName: \"kubernetes.io/projected/8e1559f2-4966-4752-8c07-aea40781bbd3-kube-api-access-9vx9b\") pod \"dnsmasq-dns-66d5bf7c87-z8kjz\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:51:51 crc kubenswrapper[4885]: I0308 20:51:51.680740 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:51:52 crc kubenswrapper[4885]: I0308 20:51:52.144984 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:51:52 crc kubenswrapper[4885]: I0308 20:51:52.175443 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-z8kjz"] Mar 08 20:51:52 crc kubenswrapper[4885]: I0308 20:51:52.705345 4885 generic.go:334] "Generic (PLEG): container finished" podID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerID="42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43" exitCode=0 Mar 08 20:51:52 crc kubenswrapper[4885]: I0308 20:51:52.705433 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" event={"ID":"8e1559f2-4966-4752-8c07-aea40781bbd3","Type":"ContainerDied","Data":"42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43"} Mar 08 20:51:52 crc kubenswrapper[4885]: I0308 20:51:52.705625 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" event={"ID":"8e1559f2-4966-4752-8c07-aea40781bbd3","Type":"ContainerStarted","Data":"e38cb66edab29ad6cc751ddfec546724fea6786800c820b585d732bbfd60f672"} Mar 08 20:51:52 crc kubenswrapper[4885]: I0308 20:51:52.917692 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:51:53 crc kubenswrapper[4885]: I0308 20:51:53.717729 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" event={"ID":"8e1559f2-4966-4752-8c07-aea40781bbd3","Type":"ContainerStarted","Data":"04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5"} Mar 08 20:51:53 crc kubenswrapper[4885]: I0308 20:51:53.718858 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:51:53 crc kubenswrapper[4885]: I0308 20:51:53.747785 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" podStartSLOduration=2.747769773 podStartE2EDuration="2.747769773s" podCreationTimestamp="2026-03-08 20:51:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:51:53.743671483 +0000 UTC m=+4815.139725506" watchObservedRunningTime="2026-03-08 20:51:53.747769773 +0000 UTC m=+4815.143823796" Mar 08 20:51:53 crc 
kubenswrapper[4885]: I0308 20:51:53.864361 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerName="rabbitmq" containerID="cri-o://075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c" gracePeriod=604799 Mar 08 20:51:54 crc kubenswrapper[4885]: I0308 20:51:54.755427 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerName="rabbitmq" containerID="cri-o://70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5" gracePeriod=604799 Mar 08 20:51:56 crc kubenswrapper[4885]: I0308 20:51:56.228581 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.22:5672: connect: connection refused" Mar 08 20:51:56 crc kubenswrapper[4885]: I0308 20:51:56.475213 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.23:5672: connect: connection refused" Mar 08 20:51:57 crc kubenswrapper[4885]: I0308 20:51:57.370532 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:51:57 crc kubenswrapper[4885]: E0308 20:51:57.371513 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.143618 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550052-jkg7s"] Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.146551 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550052-jkg7s" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.156798 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550052-jkg7s"] Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.158472 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.158849 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.159157 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.252732 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ljxm\" (UniqueName: \"kubernetes.io/projected/2414c1e7-ce59-4c76-865d-1a5ffa71578f-kube-api-access-2ljxm\") pod \"auto-csr-approver-29550052-jkg7s\" (UID: \"2414c1e7-ce59-4c76-865d-1a5ffa71578f\") " pod="openshift-infra/auto-csr-approver-29550052-jkg7s" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.354192 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ljxm\" (UniqueName: \"kubernetes.io/projected/2414c1e7-ce59-4c76-865d-1a5ffa71578f-kube-api-access-2ljxm\") pod \"auto-csr-approver-29550052-jkg7s\" (UID: \"2414c1e7-ce59-4c76-865d-1a5ffa71578f\") " pod="openshift-infra/auto-csr-approver-29550052-jkg7s" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.387149 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ljxm\" (UniqueName: \"kubernetes.io/projected/2414c1e7-ce59-4c76-865d-1a5ffa71578f-kube-api-access-2ljxm\") pod \"auto-csr-approver-29550052-jkg7s\" (UID: \"2414c1e7-ce59-4c76-865d-1a5ffa71578f\") " pod="openshift-infra/auto-csr-approver-29550052-jkg7s" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.474434 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.485632 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550052-jkg7s" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.557798 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17485fbe-1c6a-4d1b-91b3-c465215cb4be-pod-info\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.557850 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17485fbe-1c6a-4d1b-91b3-c465215cb4be-erlang-cookie-secret\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.557901 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-server-conf\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.557970 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-plugins\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.557995 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-confd\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.558027 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-erlang-cookie\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.558057 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-plugins-conf\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.558171 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.558197 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4w7l\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-kube-api-access-f4w7l\") pod \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\" (UID: \"17485fbe-1c6a-4d1b-91b3-c465215cb4be\") " Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.563397 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-plugins" (OuterVolumeSpecName: 
"rabbitmq-plugins") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.573683 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/17485fbe-1c6a-4d1b-91b3-c465215cb4be-pod-info" (OuterVolumeSpecName: "pod-info") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.573784 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.574057 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.574232 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17485fbe-1c6a-4d1b-91b3-c465215cb4be-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.584172 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-kube-api-access-f4w7l" (OuterVolumeSpecName: "kube-api-access-f4w7l") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "kube-api-access-f4w7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.602513 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-server-conf" (OuterVolumeSpecName: "server-conf") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.610352 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa" (OuterVolumeSpecName: "persistence") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.659948 4885 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17485fbe-1c6a-4d1b-91b3-c465215cb4be-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.659986 4885 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-server-conf\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.659998 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.660010 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.660023 4885 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17485fbe-1c6a-4d1b-91b3-c465215cb4be-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.660058 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") on node \"crc\" " Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.660074 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4w7l\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-kube-api-access-f4w7l\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.660086 4885 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17485fbe-1c6a-4d1b-91b3-c465215cb4be-pod-info\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.675461 4885 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.675691 4885 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa") on node "crc" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.691185 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "17485fbe-1c6a-4d1b-91b3-c465215cb4be" (UID: "17485fbe-1c6a-4d1b-91b3-c465215cb4be"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.760851 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17485fbe-1c6a-4d1b-91b3-c465215cb4be-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.761309 4885 reconciler_common.go:293] "Volume detached for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.779581 4885 generic.go:334] "Generic (PLEG): container finished" podID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerID="075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c" exitCode=0 Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.779626 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17485fbe-1c6a-4d1b-91b3-c465215cb4be","Type":"ContainerDied","Data":"075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c"} Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.779659 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"17485fbe-1c6a-4d1b-91b3-c465215cb4be","Type":"ContainerDied","Data":"348bdcbe9fe6c9c9545000fb7ce8f20f391b324558bd8ce452114903f6c00551"} Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.779678 4885 scope.go:117] "RemoveContainer" containerID="075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.779817 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.822065 4885 scope.go:117] "RemoveContainer" containerID="665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.823053 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.828350 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.850179 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:52:00 crc kubenswrapper[4885]: E0308 20:52:00.850543 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerName="rabbitmq" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.850562 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerName="rabbitmq" Mar 08 20:52:00 crc kubenswrapper[4885]: E0308 20:52:00.850587 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerName="setup-container" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.850594 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerName="setup-container" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.850756 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" containerName="rabbitmq" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.851546 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.859695 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550052-jkg7s"] Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.859882 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.859889 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.860165 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.860309 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.862275 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-qk9sg" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.867784 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.869617 4885 scope.go:117] "RemoveContainer" containerID="075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c" Mar 08 20:52:00 crc kubenswrapper[4885]: E0308 20:52:00.870247 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c\": container with ID starting with 075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c not found: ID does not exist" containerID="075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.870273 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c"} err="failed to get container status \"075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c\": rpc error: code = NotFound desc = could not find container \"075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c\": container with ID starting with 075af44ec31464846679900a9dc7905384c2f16cd39f1547bc93d23f08cce95c not found: ID does not exist" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.870292 4885 scope.go:117] "RemoveContainer" containerID="665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea" Mar 08 20:52:00 crc kubenswrapper[4885]: E0308 20:52:00.870559 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea\": container with ID starting with 665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea not found: ID does not exist" containerID="665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.870580 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea"} err="failed to get container status \"665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea\": rpc error: code = NotFound desc = could not find container 
\"665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea\": container with ID starting with 665d476c74eb2e979156cbb206a4067b26e3353cd2e248f19ba1c0523c09b5ea not found: ID does not exist" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.879219 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970177 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00704ff6-696f-4687-99e0-23bf055d1bef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970322 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970364 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970396 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00704ff6-696f-4687-99e0-23bf055d1bef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970424 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nl8j\" (UniqueName: \"kubernetes.io/projected/00704ff6-696f-4687-99e0-23bf055d1bef-kube-api-access-7nl8j\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970469 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970493 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00704ff6-696f-4687-99e0-23bf055d1bef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc kubenswrapper[4885]: I0308 20:52:00.970535 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:00 crc 
kubenswrapper[4885]: I0308 20:52:00.970556 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00704ff6-696f-4687-99e0-23bf055d1bef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072090 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072146 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00704ff6-696f-4687-99e0-23bf055d1bef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072171 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nl8j\" (UniqueName: \"kubernetes.io/projected/00704ff6-696f-4687-99e0-23bf055d1bef-kube-api-access-7nl8j\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072205 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072223 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00704ff6-696f-4687-99e0-23bf055d1bef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072253 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072269 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00704ff6-696f-4687-99e0-23bf055d1bef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072299 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00704ff6-696f-4687-99e0-23bf055d1bef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072368 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.072890 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.073147 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.078493 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/00704ff6-696f-4687-99e0-23bf055d1bef-pod-info\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.079087 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/00704ff6-696f-4687-99e0-23bf055d1bef-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.079630 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/00704ff6-696f-4687-99e0-23bf055d1bef-server-conf\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.081537 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/00704ff6-696f-4687-99e0-23bf055d1bef-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.082300 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.082350 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4c9d1ef4ae2d748771a4a21e0f3a78df8503e1290141efbfe3cef4df6cc2ca2a/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.091903 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/00704ff6-696f-4687-99e0-23bf055d1bef-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.106346 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nl8j\" (UniqueName: \"kubernetes.io/projected/00704ff6-696f-4687-99e0-23bf055d1bef-kube-api-access-7nl8j\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.126232 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66a4a9d-3e38-4c85-ba16-6c00917d90fa\") pod \"rabbitmq-server-0\" (UID: \"00704ff6-696f-4687-99e0-23bf055d1bef\") " pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.180199 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.317111 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.380107 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17485fbe-1c6a-4d1b-91b3-c465215cb4be" path="/var/lib/kubelet/pods/17485fbe-1c6a-4d1b-91b3-c465215cb4be/volumes" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.479437 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-plugins\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.479860 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-erlang-cookie-secret\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.479882 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-confd\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.479938 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-pod-info\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.479956 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-server-conf\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.479990 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-plugins-conf\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.480010 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2lfc\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-kube-api-access-l2lfc\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.479978 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.480051 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-erlang-cookie\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.480181 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\" (UID: \"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872\") " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.480485 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.480791 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.481464 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.486374 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.486584 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-pod-info" (OuterVolumeSpecName: "pod-info") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.489192 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-kube-api-access-l2lfc" (OuterVolumeSpecName: "kube-api-access-l2lfc") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "kube-api-access-l2lfc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.493572 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c" (OuterVolumeSpecName: "persistence") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.507506 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-server-conf" (OuterVolumeSpecName: "server-conf") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.563643 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" (UID: "0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.581742 4885 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.582015 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.582109 4885 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-pod-info\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.582185 4885 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-server-conf\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.582486 4885 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.582578 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2lfc\" (UniqueName: \"kubernetes.io/projected/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-kube-api-access-l2lfc\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.582669 4885 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.582832 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") on node \"crc\" " Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.599540 4885 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.599739 4885 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c") on node "crc" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.667599 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 08 20:52:01 crc kubenswrapper[4885]: W0308 20:52:01.678363 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00704ff6_696f_4687_99e0_23bf055d1bef.slice/crio-d323dda659275d06098e9af019227635ed12b00854dc16072ef396266c35f2f5 WatchSource:0}: Error finding container d323dda659275d06098e9af019227635ed12b00854dc16072ef396266c35f2f5: Status 404 returned error can't find the container with id d323dda659275d06098e9af019227635ed12b00854dc16072ef396266c35f2f5 Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.682095 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.683895 4885 reconciler_common.go:293] "Volume detached for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.756336 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-6lbrp"] Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.756627 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" podUID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerName="dnsmasq-dns" containerID="cri-o://7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899" gracePeriod=10 Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.803332 4885 generic.go:334] "Generic (PLEG): container finished" podID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerID="70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5" exitCode=0 Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.803529 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872","Type":"ContainerDied","Data":"70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5"} Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.803570 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872","Type":"ContainerDied","Data":"e0698a4264545dc8b5ff57f24a36650153833a6d5b158c71c19b171366b0d5c7"} Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.803592 4885 scope.go:117] "RemoveContainer" containerID="70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.803718 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.819382 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"00704ff6-696f-4687-99e0-23bf055d1bef","Type":"ContainerStarted","Data":"d323dda659275d06098e9af019227635ed12b00854dc16072ef396266c35f2f5"} Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.821015 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550052-jkg7s" event={"ID":"2414c1e7-ce59-4c76-865d-1a5ffa71578f","Type":"ContainerStarted","Data":"8e59bd71d608d2a19f6aba445bfc270fde040b3beeb2fa5a0759e05d0d91b133"} Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.894180 4885 scope.go:117] "RemoveContainer" containerID="7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.918341 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.952713 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.987078 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:52:01 crc kubenswrapper[4885]: E0308 20:52:01.987763 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerName="setup-container" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.987780 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerName="setup-container" Mar 08 20:52:01 crc kubenswrapper[4885]: E0308 20:52:01.987796 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerName="rabbitmq" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.987804 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerName="rabbitmq" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.988113 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" containerName="rabbitmq" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.990399 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.995589 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.995968 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.995768 4885 scope.go:117] "RemoveContainer" containerID="70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5" Mar 08 20:52:01 crc kubenswrapper[4885]: E0308 20:52:01.996577 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5\": container with ID starting with 70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5 not found: ID does not exist" containerID="70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.996613 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5"} err="failed to get container status \"70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5\": rpc error: code = NotFound desc = could not find container \"70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5\": container with ID starting with 70c7014641bf65ad649d59b467fc485d57cc061aefc6c4523ad411a3c92ce8b5 not found: ID does not exist" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.996640 4885 scope.go:117] "RemoveContainer" containerID="7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873" Mar 08 20:52:01 crc kubenswrapper[4885]: E0308 20:52:01.996875 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873\": container with ID starting with 7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873 not found: ID does not exist" containerID="7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.996897 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873"} err="failed to get container status \"7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873\": rpc error: code = NotFound desc = could not find container \"7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873\": container with ID starting with 7d630c684acb56f85591033006261af56f2285166cf169fc708f71880c082873 not found: ID does not exist" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.997407 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.997783 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 08 20:52:01 crc kubenswrapper[4885]: I0308 20:52:01.999598 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lp8r9" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.001253 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099054 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099104 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td4cx\" (UniqueName: \"kubernetes.io/projected/f0d39294-b81d-4534-b86a-35a3aea74ed7-kube-api-access-td4cx\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099136 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099318 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099345 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f0d39294-b81d-4534-b86a-35a3aea74ed7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099364 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f0d39294-b81d-4534-b86a-35a3aea74ed7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099383 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f0d39294-b81d-4534-b86a-35a3aea74ed7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099401 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.099422 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/f0d39294-b81d-4534-b86a-35a3aea74ed7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200414 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200471 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f0d39294-b81d-4534-b86a-35a3aea74ed7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200496 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f0d39294-b81d-4534-b86a-35a3aea74ed7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200519 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f0d39294-b81d-4534-b86a-35a3aea74ed7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200539 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200563 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f0d39294-b81d-4534-b86a-35a3aea74ed7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200601 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200622 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td4cx\" (UniqueName: \"kubernetes.io/projected/f0d39294-b81d-4534-b86a-35a3aea74ed7-kube-api-access-td4cx\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.200641 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.201539 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.202224 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.202720 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f0d39294-b81d-4534-b86a-35a3aea74ed7-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.202764 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f0d39294-b81d-4534-b86a-35a3aea74ed7-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.207441 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f0d39294-b81d-4534-b86a-35a3aea74ed7-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.207687 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f0d39294-b81d-4534-b86a-35a3aea74ed7-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.207822 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.207871 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5c3a41ce06382fbd79ab5375af77b75afb5583c9d1133ff7b974e34c8338b5b2/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.219048 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td4cx\" (UniqueName: \"kubernetes.io/projected/f0d39294-b81d-4534-b86a-35a3aea74ed7-kube-api-access-td4cx\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.219839 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f0d39294-b81d-4534-b86a-35a3aea74ed7-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.247532 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3f24c1be-e77c-47d5-b455-c7e5723fa71c\") pod \"rabbitmq-cell1-server-0\" (UID: \"f0d39294-b81d-4534-b86a-35a3aea74ed7\") " pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.333270 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.597658 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:52:02 crc kubenswrapper[4885]: W0308 20:52:02.650092 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf0d39294_b81d_4534_b86a_35a3aea74ed7.slice/crio-41f7dae3a127db06853738981eb55d48d6847c008aad7cda6045a1a3d97286d8 WatchSource:0}: Error finding container 41f7dae3a127db06853738981eb55d48d6847c008aad7cda6045a1a3d97286d8: Status 404 returned error can't find the container with id 41f7dae3a127db06853738981eb55d48d6847c008aad7cda6045a1a3d97286d8 Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.653556 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.708570 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7z56t\" (UniqueName: \"kubernetes.io/projected/7084d0a7-6526-4b90-93a3-e16b9d374be2-kube-api-access-7z56t\") pod \"7084d0a7-6526-4b90-93a3-e16b9d374be2\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.708787 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-config\") pod \"7084d0a7-6526-4b90-93a3-e16b9d374be2\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.708896 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-dns-svc\") pod \"7084d0a7-6526-4b90-93a3-e16b9d374be2\" (UID: \"7084d0a7-6526-4b90-93a3-e16b9d374be2\") " Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.713212 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7084d0a7-6526-4b90-93a3-e16b9d374be2-kube-api-access-7z56t" (OuterVolumeSpecName: "kube-api-access-7z56t") pod "7084d0a7-6526-4b90-93a3-e16b9d374be2" (UID: "7084d0a7-6526-4b90-93a3-e16b9d374be2"). InnerVolumeSpecName "kube-api-access-7z56t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.747799 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7084d0a7-6526-4b90-93a3-e16b9d374be2" (UID: "7084d0a7-6526-4b90-93a3-e16b9d374be2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.748970 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-config" (OuterVolumeSpecName: "config") pod "7084d0a7-6526-4b90-93a3-e16b9d374be2" (UID: "7084d0a7-6526-4b90-93a3-e16b9d374be2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.810466 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-config\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.810506 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7084d0a7-6526-4b90-93a3-e16b9d374be2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.810524 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7z56t\" (UniqueName: \"kubernetes.io/projected/7084d0a7-6526-4b90-93a3-e16b9d374be2-kube-api-access-7z56t\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.829803 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f0d39294-b81d-4534-b86a-35a3aea74ed7","Type":"ContainerStarted","Data":"41f7dae3a127db06853738981eb55d48d6847c008aad7cda6045a1a3d97286d8"} Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.831705 4885 generic.go:334] "Generic (PLEG): container finished" podID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerID="7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899" exitCode=0 Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.831793 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" event={"ID":"7084d0a7-6526-4b90-93a3-e16b9d374be2","Type":"ContainerDied","Data":"7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899"} Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.831828 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" event={"ID":"7084d0a7-6526-4b90-93a3-e16b9d374be2","Type":"ContainerDied","Data":"cefbe5124552682fd6a4537cd48b7c8e69da3ba23b252f870be87ed5d1dd02f5"} Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.831848 4885 scope.go:117] "RemoveContainer" containerID="7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.832175 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-ff89b6977-6lbrp" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.837478 4885 generic.go:334] "Generic (PLEG): container finished" podID="2414c1e7-ce59-4c76-865d-1a5ffa71578f" containerID="4a9dac92cb97fc09835d72492b83bdbd16e3d2d9b07c98a3d36966204fa55732" exitCode=0 Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.837872 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550052-jkg7s" event={"ID":"2414c1e7-ce59-4c76-865d-1a5ffa71578f","Type":"ContainerDied","Data":"4a9dac92cb97fc09835d72492b83bdbd16e3d2d9b07c98a3d36966204fa55732"} Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.858639 4885 scope.go:117] "RemoveContainer" containerID="4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.888837 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-6lbrp"] Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.898165 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-ff89b6977-6lbrp"] Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.902053 4885 scope.go:117] "RemoveContainer" containerID="7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899" Mar 08 20:52:02 crc kubenswrapper[4885]: E0308 20:52:02.902558 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899\": container with ID starting with 7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899 not found: ID does not exist" containerID="7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.902598 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899"} err="failed to get container status \"7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899\": rpc error: code = NotFound desc = could not find container \"7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899\": container with ID starting with 7c4ccc10548b3a4fe131db124af8298d29bea4ea57f0d25bf94f2733a2cb6899 not found: ID does not exist" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.902624 4885 scope.go:117] "RemoveContainer" containerID="4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd" Mar 08 20:52:02 crc kubenswrapper[4885]: E0308 20:52:02.902980 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd\": container with ID starting with 4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd not found: ID does not exist" containerID="4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd" Mar 08 20:52:02 crc kubenswrapper[4885]: I0308 20:52:02.903045 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd"} err="failed to get container status \"4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd\": rpc error: code = NotFound desc = could not find container \"4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd\": container with ID starting with 
4bdbb6065e1cf82c8b1b3d5c20d52a4f7cd4a057d599d3c8fc74fe2dd7c609cd not found: ID does not exist" Mar 08 20:52:03 crc kubenswrapper[4885]: I0308 20:52:03.383777 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872" path="/var/lib/kubelet/pods/0a3c4c5d-5d8f-4011-9ab6-94c44dfc2872/volumes" Mar 08 20:52:03 crc kubenswrapper[4885]: I0308 20:52:03.385568 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7084d0a7-6526-4b90-93a3-e16b9d374be2" path="/var/lib/kubelet/pods/7084d0a7-6526-4b90-93a3-e16b9d374be2/volumes" Mar 08 20:52:03 crc kubenswrapper[4885]: I0308 20:52:03.848025 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"00704ff6-696f-4687-99e0-23bf055d1bef","Type":"ContainerStarted","Data":"a283cbecf908e182dd51b0bfeb925825b7ba37fe1ff0de14d5fc2f94c054e7eb"} Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.391530 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550052-jkg7s" Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.441716 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ljxm\" (UniqueName: \"kubernetes.io/projected/2414c1e7-ce59-4c76-865d-1a5ffa71578f-kube-api-access-2ljxm\") pod \"2414c1e7-ce59-4c76-865d-1a5ffa71578f\" (UID: \"2414c1e7-ce59-4c76-865d-1a5ffa71578f\") " Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.448899 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2414c1e7-ce59-4c76-865d-1a5ffa71578f-kube-api-access-2ljxm" (OuterVolumeSpecName: "kube-api-access-2ljxm") pod "2414c1e7-ce59-4c76-865d-1a5ffa71578f" (UID: "2414c1e7-ce59-4c76-865d-1a5ffa71578f"). InnerVolumeSpecName "kube-api-access-2ljxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.543489 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ljxm\" (UniqueName: \"kubernetes.io/projected/2414c1e7-ce59-4c76-865d-1a5ffa71578f-kube-api-access-2ljxm\") on node \"crc\" DevicePath \"\"" Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.860541 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550052-jkg7s" event={"ID":"2414c1e7-ce59-4c76-865d-1a5ffa71578f","Type":"ContainerDied","Data":"8e59bd71d608d2a19f6aba445bfc270fde040b3beeb2fa5a0759e05d0d91b133"} Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.860566 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550052-jkg7s" Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.860590 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e59bd71d608d2a19f6aba445bfc270fde040b3beeb2fa5a0759e05d0d91b133" Mar 08 20:52:04 crc kubenswrapper[4885]: I0308 20:52:04.863918 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f0d39294-b81d-4534-b86a-35a3aea74ed7","Type":"ContainerStarted","Data":"3da51067610480309689286dd1478ea79781213cf89c7dbffabc576d71783cd4"} Mar 08 20:52:05 crc kubenswrapper[4885]: I0308 20:52:05.488113 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550046-f4q77"] Mar 08 20:52:05 crc kubenswrapper[4885]: I0308 20:52:05.516643 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550046-f4q77"] Mar 08 20:52:07 crc kubenswrapper[4885]: I0308 20:52:07.381827 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b82d1463-69f5-455a-b2bf-493366c067f7" path="/var/lib/kubelet/pods/b82d1463-69f5-455a-b2bf-493366c067f7/volumes" Mar 08 20:52:12 crc kubenswrapper[4885]: I0308 20:52:12.368831 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:52:12 crc kubenswrapper[4885]: E0308 20:52:12.369984 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:52:24 crc kubenswrapper[4885]: I0308 20:52:24.368062 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:52:24 crc kubenswrapper[4885]: E0308 20:52:24.370391 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:52:36 crc kubenswrapper[4885]: I0308 20:52:36.144008 4885 generic.go:334] "Generic (PLEG): container finished" podID="00704ff6-696f-4687-99e0-23bf055d1bef" containerID="a283cbecf908e182dd51b0bfeb925825b7ba37fe1ff0de14d5fc2f94c054e7eb" exitCode=0 Mar 08 20:52:36 crc kubenswrapper[4885]: I0308 20:52:36.144139 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"00704ff6-696f-4687-99e0-23bf055d1bef","Type":"ContainerDied","Data":"a283cbecf908e182dd51b0bfeb925825b7ba37fe1ff0de14d5fc2f94c054e7eb"} Mar 08 20:52:37 crc kubenswrapper[4885]: I0308 20:52:37.159161 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"00704ff6-696f-4687-99e0-23bf055d1bef","Type":"ContainerStarted","Data":"8a643cc17ac5b6dd9664f5ea40e5d3c3347d697976b4cea9ad7ce397ae21f250"} Mar 08 20:52:37 crc kubenswrapper[4885]: I0308 20:52:37.159745 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-server-0" Mar 08 20:52:37 crc kubenswrapper[4885]: I0308 20:52:37.203409 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.203382433 podStartE2EDuration="37.203382433s" podCreationTimestamp="2026-03-08 20:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:52:37.193605222 +0000 UTC m=+4858.589659285" watchObservedRunningTime="2026-03-08 20:52:37.203382433 +0000 UTC m=+4858.599436466" Mar 08 20:52:38 crc kubenswrapper[4885]: I0308 20:52:38.170812 4885 generic.go:334] "Generic (PLEG): container finished" podID="f0d39294-b81d-4534-b86a-35a3aea74ed7" containerID="3da51067610480309689286dd1478ea79781213cf89c7dbffabc576d71783cd4" exitCode=0 Mar 08 20:52:38 crc kubenswrapper[4885]: I0308 20:52:38.170867 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f0d39294-b81d-4534-b86a-35a3aea74ed7","Type":"ContainerDied","Data":"3da51067610480309689286dd1478ea79781213cf89c7dbffabc576d71783cd4"} Mar 08 20:52:38 crc kubenswrapper[4885]: I0308 20:52:38.368150 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:52:38 crc kubenswrapper[4885]: E0308 20:52:38.368855 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:52:39 crc kubenswrapper[4885]: I0308 20:52:39.183167 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f0d39294-b81d-4534-b86a-35a3aea74ed7","Type":"ContainerStarted","Data":"de1fa4e51a74fae8244eedc279b615a73ac695d53a2de64cd8ef58a0ec3c637e"} Mar 08 20:52:39 crc kubenswrapper[4885]: I0308 20:52:39.183517 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:39 crc kubenswrapper[4885]: I0308 20:52:39.220549 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.220515019 podStartE2EDuration="38.220515019s" podCreationTimestamp="2026-03-08 20:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:52:39.212594179 +0000 UTC m=+4860.608648292" watchObservedRunningTime="2026-03-08 20:52:39.220515019 +0000 UTC m=+4860.616569092" Mar 08 20:52:51 crc kubenswrapper[4885]: I0308 20:52:51.185218 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 08 20:52:52 crc kubenswrapper[4885]: I0308 20:52:52.336146 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 08 20:52:52 crc kubenswrapper[4885]: I0308 20:52:52.372559 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:52:52 crc kubenswrapper[4885]: E0308 20:52:52.374975 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.296540 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 08 20:52:59 crc kubenswrapper[4885]: E0308 20:52:59.298262 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerName="init" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.298288 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerName="init" Mar 08 20:52:59 crc kubenswrapper[4885]: E0308 20:52:59.298323 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2414c1e7-ce59-4c76-865d-1a5ffa71578f" containerName="oc" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.298335 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2414c1e7-ce59-4c76-865d-1a5ffa71578f" containerName="oc" Mar 08 20:52:59 crc kubenswrapper[4885]: E0308 20:52:59.298360 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerName="dnsmasq-dns" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.298375 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerName="dnsmasq-dns" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.298648 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2414c1e7-ce59-4c76-865d-1a5ffa71578f" containerName="oc" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.298676 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7084d0a7-6526-4b90-93a3-e16b9d374be2" containerName="dnsmasq-dns" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.299560 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.303079 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xp44w" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.311258 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.401830 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k68tq\" (UniqueName: \"kubernetes.io/projected/b4e6b48b-506c-43ea-8933-c1b68f82790d-kube-api-access-k68tq\") pod \"mariadb-client\" (UID: \"b4e6b48b-506c-43ea-8933-c1b68f82790d\") " pod="openstack/mariadb-client" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.504134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k68tq\" (UniqueName: \"kubernetes.io/projected/b4e6b48b-506c-43ea-8933-c1b68f82790d-kube-api-access-k68tq\") pod \"mariadb-client\" (UID: \"b4e6b48b-506c-43ea-8933-c1b68f82790d\") " pod="openstack/mariadb-client" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.541586 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k68tq\" (UniqueName: \"kubernetes.io/projected/b4e6b48b-506c-43ea-8933-c1b68f82790d-kube-api-access-k68tq\") pod \"mariadb-client\" (UID: \"b4e6b48b-506c-43ea-8933-c1b68f82790d\") " pod="openstack/mariadb-client" Mar 08 20:52:59 crc kubenswrapper[4885]: I0308 20:52:59.636996 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:53:00 crc kubenswrapper[4885]: I0308 20:53:00.302895 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:53:00 crc kubenswrapper[4885]: I0308 20:53:00.379214 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"b4e6b48b-506c-43ea-8933-c1b68f82790d","Type":"ContainerStarted","Data":"33714966dcd5a2a47e2d73b61ab2fcdb64486f77dd8de4ec2d7f86a9e962ec18"} Mar 08 20:53:01 crc kubenswrapper[4885]: I0308 20:53:01.393656 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"b4e6b48b-506c-43ea-8933-c1b68f82790d","Type":"ContainerStarted","Data":"3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a"} Mar 08 20:53:01 crc kubenswrapper[4885]: I0308 20:53:01.426306 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client" podStartSLOduration=1.924029003 podStartE2EDuration="2.426274505s" podCreationTimestamp="2026-03-08 20:52:59 +0000 UTC" firstStartedPulling="2026-03-08 20:53:00.320367736 +0000 UTC m=+4881.716421799" lastFinishedPulling="2026-03-08 20:53:00.822613238 +0000 UTC m=+4882.218667301" observedRunningTime="2026-03-08 20:53:01.417580503 +0000 UTC m=+4882.813634566" watchObservedRunningTime="2026-03-08 20:53:01.426274505 +0000 UTC m=+4882.822328538" Mar 08 20:53:03 crc kubenswrapper[4885]: E0308 20:53:03.374693 4885 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.80:36730->38.102.83.80:33667: read tcp 38.102.83.80:36730->38.102.83.80:33667: read: connection reset by peer Mar 08 20:53:03 crc kubenswrapper[4885]: I0308 20:53:03.813358 4885 scope.go:117] "RemoveContainer" containerID="28fd3b4e0daaabbfc53b90770763cfd958469e45abae3ae9fd53f9c9e8ab327b" Mar 08 20:53:07 crc kubenswrapper[4885]: I0308 20:53:07.367513 
4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:53:07 crc kubenswrapper[4885]: E0308 20:53:07.368175 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:53:16 crc kubenswrapper[4885]: I0308 20:53:16.466731 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:53:16 crc kubenswrapper[4885]: I0308 20:53:16.467651 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-client" podUID="b4e6b48b-506c-43ea-8933-c1b68f82790d" containerName="mariadb-client" containerID="cri-o://3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a" gracePeriod=30 Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.063265 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.217285 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k68tq\" (UniqueName: \"kubernetes.io/projected/b4e6b48b-506c-43ea-8933-c1b68f82790d-kube-api-access-k68tq\") pod \"b4e6b48b-506c-43ea-8933-c1b68f82790d\" (UID: \"b4e6b48b-506c-43ea-8933-c1b68f82790d\") " Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.224171 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4e6b48b-506c-43ea-8933-c1b68f82790d-kube-api-access-k68tq" (OuterVolumeSpecName: "kube-api-access-k68tq") pod "b4e6b48b-506c-43ea-8933-c1b68f82790d" (UID: "b4e6b48b-506c-43ea-8933-c1b68f82790d"). InnerVolumeSpecName "kube-api-access-k68tq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.318757 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k68tq\" (UniqueName: \"kubernetes.io/projected/b4e6b48b-506c-43ea-8933-c1b68f82790d-kube-api-access-k68tq\") on node \"crc\" DevicePath \"\"" Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.549093 4885 generic.go:334] "Generic (PLEG): container finished" podID="b4e6b48b-506c-43ea-8933-c1b68f82790d" containerID="3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a" exitCode=143 Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.549175 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"b4e6b48b-506c-43ea-8933-c1b68f82790d","Type":"ContainerDied","Data":"3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a"} Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.549224 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.549771 4885 scope.go:117] "RemoveContainer" containerID="3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a" Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.550419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"b4e6b48b-506c-43ea-8933-c1b68f82790d","Type":"ContainerDied","Data":"33714966dcd5a2a47e2d73b61ab2fcdb64486f77dd8de4ec2d7f86a9e962ec18"} Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.577577 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.577825 4885 scope.go:117] "RemoveContainer" containerID="3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a" Mar 08 20:53:17 crc kubenswrapper[4885]: E0308 20:53:17.578507 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a\": container with ID starting with 3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a not found: ID does not exist" containerID="3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a" Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.578556 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a"} err="failed to get container status \"3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a\": rpc error: code = NotFound desc = could not find container \"3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a\": container with ID starting with 3963089c00bb7bfd71237e8092a0bb779cae6422701c9c1790764bff580d805a not found: ID does not exist" Mar 08 20:53:17 crc kubenswrapper[4885]: I0308 20:53:17.583996 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:53:19 crc kubenswrapper[4885]: I0308 20:53:19.382786 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4e6b48b-506c-43ea-8933-c1b68f82790d" path="/var/lib/kubelet/pods/b4e6b48b-506c-43ea-8933-c1b68f82790d/volumes" Mar 08 20:53:20 crc kubenswrapper[4885]: I0308 20:53:20.368836 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:53:20 crc kubenswrapper[4885]: E0308 20:53:20.369520 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.673587 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-twqvq"] Mar 08 20:53:24 crc kubenswrapper[4885]: E0308 20:53:24.674759 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4e6b48b-506c-43ea-8933-c1b68f82790d" containerName="mariadb-client" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.674781 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4e6b48b-506c-43ea-8933-c1b68f82790d" 
containerName="mariadb-client" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.675077 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4e6b48b-506c-43ea-8933-c1b68f82790d" containerName="mariadb-client" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.678486 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.702182 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twqvq"] Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.850524 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-utilities\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.850974 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89zjk\" (UniqueName: \"kubernetes.io/projected/836676da-534a-42e5-b256-f7d5a2a13f22-kube-api-access-89zjk\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.851118 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-catalog-content\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.952290 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-utilities\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.952466 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89zjk\" (UniqueName: \"kubernetes.io/projected/836676da-534a-42e5-b256-f7d5a2a13f22-kube-api-access-89zjk\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.952542 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-catalog-content\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.953032 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-utilities\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.953166 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-catalog-content\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:24 crc kubenswrapper[4885]: I0308 20:53:24.975811 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89zjk\" (UniqueName: \"kubernetes.io/projected/836676da-534a-42e5-b256-f7d5a2a13f22-kube-api-access-89zjk\") pod \"community-operators-twqvq\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:25 crc kubenswrapper[4885]: I0308 20:53:25.009886 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:25 crc kubenswrapper[4885]: I0308 20:53:25.533994 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twqvq"] Mar 08 20:53:25 crc kubenswrapper[4885]: I0308 20:53:25.636644 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twqvq" event={"ID":"836676da-534a-42e5-b256-f7d5a2a13f22","Type":"ContainerStarted","Data":"31daaecae7e860fcf9c1efb90c24db9d8ac1cd2c562f0d39c509abfb2ef1b691"} Mar 08 20:53:26 crc kubenswrapper[4885]: I0308 20:53:26.650402 4885 generic.go:334] "Generic (PLEG): container finished" podID="836676da-534a-42e5-b256-f7d5a2a13f22" containerID="4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2" exitCode=0 Mar 08 20:53:26 crc kubenswrapper[4885]: I0308 20:53:26.650474 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twqvq" event={"ID":"836676da-534a-42e5-b256-f7d5a2a13f22","Type":"ContainerDied","Data":"4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2"} Mar 08 20:53:27 crc kubenswrapper[4885]: I0308 20:53:27.659005 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twqvq" event={"ID":"836676da-534a-42e5-b256-f7d5a2a13f22","Type":"ContainerStarted","Data":"29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c"} Mar 08 20:53:28 crc kubenswrapper[4885]: I0308 20:53:28.671726 4885 generic.go:334] "Generic (PLEG): container finished" podID="836676da-534a-42e5-b256-f7d5a2a13f22" containerID="29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c" exitCode=0 Mar 08 20:53:28 crc kubenswrapper[4885]: I0308 20:53:28.671831 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twqvq" event={"ID":"836676da-534a-42e5-b256-f7d5a2a13f22","Type":"ContainerDied","Data":"29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c"} Mar 08 20:53:29 crc kubenswrapper[4885]: I0308 20:53:29.685963 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twqvq" event={"ID":"836676da-534a-42e5-b256-f7d5a2a13f22","Type":"ContainerStarted","Data":"41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f"} Mar 08 20:53:29 crc kubenswrapper[4885]: I0308 20:53:29.717653 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-twqvq" podStartSLOduration=3.268438641 podStartE2EDuration="5.71762489s" podCreationTimestamp="2026-03-08 20:53:24 +0000 UTC" firstStartedPulling="2026-03-08 20:53:26.653287789 +0000 UTC m=+4908.049341842" 
lastFinishedPulling="2026-03-08 20:53:29.102474028 +0000 UTC m=+4910.498528091" observedRunningTime="2026-03-08 20:53:29.709558995 +0000 UTC m=+4911.105613028" watchObservedRunningTime="2026-03-08 20:53:29.71762489 +0000 UTC m=+4911.113678943" Mar 08 20:53:32 crc kubenswrapper[4885]: I0308 20:53:32.367886 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:53:32 crc kubenswrapper[4885]: E0308 20:53:32.368909 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:53:35 crc kubenswrapper[4885]: I0308 20:53:35.010759 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:35 crc kubenswrapper[4885]: I0308 20:53:35.011231 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:35 crc kubenswrapper[4885]: I0308 20:53:35.071910 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:35 crc kubenswrapper[4885]: I0308 20:53:35.811705 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:35 crc kubenswrapper[4885]: I0308 20:53:35.875899 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-twqvq"] Mar 08 20:53:37 crc kubenswrapper[4885]: I0308 20:53:37.771421 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-twqvq" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="registry-server" containerID="cri-o://41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f" gracePeriod=2 Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.743091 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.782328 4885 generic.go:334] "Generic (PLEG): container finished" podID="836676da-534a-42e5-b256-f7d5a2a13f22" containerID="41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f" exitCode=0 Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.782388 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twqvq" event={"ID":"836676da-534a-42e5-b256-f7d5a2a13f22","Type":"ContainerDied","Data":"41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f"} Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.782427 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twqvq" event={"ID":"836676da-534a-42e5-b256-f7d5a2a13f22","Type":"ContainerDied","Data":"31daaecae7e860fcf9c1efb90c24db9d8ac1cd2c562f0d39c509abfb2ef1b691"} Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.782454 4885 scope.go:117] "RemoveContainer" containerID="41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.782635 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twqvq" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.792554 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-catalog-content\") pod \"836676da-534a-42e5-b256-f7d5a2a13f22\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.792937 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-utilities\") pod \"836676da-534a-42e5-b256-f7d5a2a13f22\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.793014 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89zjk\" (UniqueName: \"kubernetes.io/projected/836676da-534a-42e5-b256-f7d5a2a13f22-kube-api-access-89zjk\") pod \"836676da-534a-42e5-b256-f7d5a2a13f22\" (UID: \"836676da-534a-42e5-b256-f7d5a2a13f22\") " Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.794145 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-utilities" (OuterVolumeSpecName: "utilities") pod "836676da-534a-42e5-b256-f7d5a2a13f22" (UID: "836676da-534a-42e5-b256-f7d5a2a13f22"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.798344 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/836676da-534a-42e5-b256-f7d5a2a13f22-kube-api-access-89zjk" (OuterVolumeSpecName: "kube-api-access-89zjk") pod "836676da-534a-42e5-b256-f7d5a2a13f22" (UID: "836676da-534a-42e5-b256-f7d5a2a13f22"). InnerVolumeSpecName "kube-api-access-89zjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.806339 4885 scope.go:117] "RemoveContainer" containerID="29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.850302 4885 scope.go:117] "RemoveContainer" containerID="4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.859491 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "836676da-534a-42e5-b256-f7d5a2a13f22" (UID: "836676da-534a-42e5-b256-f7d5a2a13f22"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.885023 4885 scope.go:117] "RemoveContainer" containerID="41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f" Mar 08 20:53:38 crc kubenswrapper[4885]: E0308 20:53:38.885385 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f\": container with ID starting with 41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f not found: ID does not exist" containerID="41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.885431 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f"} err="failed to get container status \"41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f\": rpc error: code = NotFound desc = could not find container \"41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f\": container with ID starting with 41770166f5a8f3333fd27783249eb856b2093371069bd1cf2d0b8b5e754a952f not found: ID does not exist" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.885448 4885 scope.go:117] "RemoveContainer" containerID="29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c" Mar 08 20:53:38 crc kubenswrapper[4885]: E0308 20:53:38.885655 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c\": container with ID starting with 29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c not found: ID does not exist" containerID="29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.885681 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c"} err="failed to get container status \"29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c\": rpc error: code = NotFound desc = could not find container \"29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c\": container with ID starting with 29b838c26ca7b190c3bd7a5602988eee401527cf2708f46c260f5f6d1c12e76c not found: ID does not exist" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.885692 4885 scope.go:117] "RemoveContainer" containerID="4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2" Mar 08 20:53:38 crc kubenswrapper[4885]: 
E0308 20:53:38.886058 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2\": container with ID starting with 4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2 not found: ID does not exist" containerID="4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.886128 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2"} err="failed to get container status \"4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2\": rpc error: code = NotFound desc = could not find container \"4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2\": container with ID starting with 4449ed507f52d5a3381cafa2643788ef752fdb05b13269d1c8e018731f1655a2 not found: ID does not exist" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.894829 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.894881 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/836676da-534a-42e5-b256-f7d5a2a13f22-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:53:38 crc kubenswrapper[4885]: I0308 20:53:38.894903 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89zjk\" (UniqueName: \"kubernetes.io/projected/836676da-534a-42e5-b256-f7d5a2a13f22-kube-api-access-89zjk\") on node \"crc\" DevicePath \"\"" Mar 08 20:53:39 crc kubenswrapper[4885]: I0308 20:53:39.135528 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-twqvq"] Mar 08 20:53:39 crc kubenswrapper[4885]: I0308 20:53:39.146040 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-twqvq"] Mar 08 20:53:39 crc kubenswrapper[4885]: I0308 20:53:39.389581 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" path="/var/lib/kubelet/pods/836676da-534a-42e5-b256-f7d5a2a13f22/volumes" Mar 08 20:53:44 crc kubenswrapper[4885]: I0308 20:53:44.368978 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:53:44 crc kubenswrapper[4885]: E0308 20:53:44.370251 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:53:58 crc kubenswrapper[4885]: I0308 20:53:58.368911 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:53:58 crc kubenswrapper[4885]: E0308 20:53:58.370071 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.167791 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550054-wsncs"] Mar 08 20:54:00 crc kubenswrapper[4885]: E0308 20:54:00.168530 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="registry-server" Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.168559 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="registry-server" Mar 08 20:54:00 crc kubenswrapper[4885]: E0308 20:54:00.168590 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="extract-utilities" Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.168603 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="extract-utilities" Mar 08 20:54:00 crc kubenswrapper[4885]: E0308 20:54:00.168633 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="extract-content" Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.168646 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="extract-content" Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.169050 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="836676da-534a-42e5-b256-f7d5a2a13f22" containerName="registry-server" Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.169862 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550054-wsncs" Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.173217 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.173278 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.177685 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.181735 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550054-wsncs"] Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.362177 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xntkj\" (UniqueName: \"kubernetes.io/projected/66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b-kube-api-access-xntkj\") pod \"auto-csr-approver-29550054-wsncs\" (UID: \"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b\") " pod="openshift-infra/auto-csr-approver-29550054-wsncs" Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.465819 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xntkj\" (UniqueName: \"kubernetes.io/projected/66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b-kube-api-access-xntkj\") pod \"auto-csr-approver-29550054-wsncs\" (UID: \"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b\") " pod="openshift-infra/auto-csr-approver-29550054-wsncs" Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.499271 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xntkj\" (UniqueName: \"kubernetes.io/projected/66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b-kube-api-access-xntkj\") pod \"auto-csr-approver-29550054-wsncs\" (UID: \"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b\") " pod="openshift-infra/auto-csr-approver-29550054-wsncs" Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.501647 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550054-wsncs" Mar 08 20:54:00 crc kubenswrapper[4885]: I0308 20:54:00.849129 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550054-wsncs"] Mar 08 20:54:01 crc kubenswrapper[4885]: I0308 20:54:01.025076 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550054-wsncs" event={"ID":"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b","Type":"ContainerStarted","Data":"4e30f734ff88492b94c1d6e2b3541ab710b8bf4b9d5f10b38edb420b2bf64bc1"} Mar 08 20:54:03 crc kubenswrapper[4885]: I0308 20:54:03.048218 4885 generic.go:334] "Generic (PLEG): container finished" podID="66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b" containerID="9b23f86db419001dec3042d5f280866857f260d2b86edbe13a17fd8cd9ba2fd4" exitCode=0 Mar 08 20:54:03 crc kubenswrapper[4885]: I0308 20:54:03.048310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550054-wsncs" event={"ID":"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b","Type":"ContainerDied","Data":"9b23f86db419001dec3042d5f280866857f260d2b86edbe13a17fd8cd9ba2fd4"} Mar 08 20:54:03 crc kubenswrapper[4885]: I0308 20:54:03.920572 4885 scope.go:117] "RemoveContainer" containerID="71660241cb857dd4a39450381f9b1b87218e1ce546c14f61658581a4f1a6ae9d" Mar 08 20:54:04 crc kubenswrapper[4885]: I0308 20:54:04.494473 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550054-wsncs" Mar 08 20:54:04 crc kubenswrapper[4885]: I0308 20:54:04.658150 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xntkj\" (UniqueName: \"kubernetes.io/projected/66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b-kube-api-access-xntkj\") pod \"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b\" (UID: \"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b\") " Mar 08 20:54:04 crc kubenswrapper[4885]: I0308 20:54:04.664029 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b-kube-api-access-xntkj" (OuterVolumeSpecName: "kube-api-access-xntkj") pod "66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b" (UID: "66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b"). InnerVolumeSpecName "kube-api-access-xntkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:54:04 crc kubenswrapper[4885]: I0308 20:54:04.760650 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xntkj\" (UniqueName: \"kubernetes.io/projected/66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b-kube-api-access-xntkj\") on node \"crc\" DevicePath \"\"" Mar 08 20:54:05 crc kubenswrapper[4885]: I0308 20:54:05.082905 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550054-wsncs" event={"ID":"66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b","Type":"ContainerDied","Data":"4e30f734ff88492b94c1d6e2b3541ab710b8bf4b9d5f10b38edb420b2bf64bc1"} Mar 08 20:54:05 crc kubenswrapper[4885]: I0308 20:54:05.083008 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e30f734ff88492b94c1d6e2b3541ab710b8bf4b9d5f10b38edb420b2bf64bc1" Mar 08 20:54:05 crc kubenswrapper[4885]: I0308 20:54:05.083018 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550054-wsncs" Mar 08 20:54:05 crc kubenswrapper[4885]: I0308 20:54:05.598304 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550048-bs8p7"] Mar 08 20:54:05 crc kubenswrapper[4885]: I0308 20:54:05.608165 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550048-bs8p7"] Mar 08 20:54:07 crc kubenswrapper[4885]: I0308 20:54:07.386098 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cc070c5-7e69-4caa-82a5-b21b8fa66256" path="/var/lib/kubelet/pods/8cc070c5-7e69-4caa-82a5-b21b8fa66256/volumes" Mar 08 20:54:11 crc kubenswrapper[4885]: I0308 20:54:11.369675 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:54:12 crc kubenswrapper[4885]: I0308 20:54:12.158440 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"b63b0ab95208c6fa0889efffa8bab4195db0658e5a1af71b9988f5d8b91fa038"} Mar 08 20:55:04 crc kubenswrapper[4885]: I0308 20:55:04.026183 4885 scope.go:117] "RemoveContainer" containerID="4768620fbc443f87f40deff915eceaf069ee28b8e25c0efabb4228990e81cee6" Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.164996 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550056-kpvwx"] Mar 08 20:56:00 crc kubenswrapper[4885]: E0308 20:56:00.166023 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b" containerName="oc" Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.166041 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b" containerName="oc" Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.166233 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b" containerName="oc" Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.166798 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550056-kpvwx" Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.171793 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.172623 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.174297 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.184763 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550056-kpvwx"] Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.274696 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndbx9\" (UniqueName: \"kubernetes.io/projected/97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2-kube-api-access-ndbx9\") pod \"auto-csr-approver-29550056-kpvwx\" (UID: \"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2\") " pod="openshift-infra/auto-csr-approver-29550056-kpvwx" Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.378207 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndbx9\" (UniqueName: \"kubernetes.io/projected/97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2-kube-api-access-ndbx9\") pod \"auto-csr-approver-29550056-kpvwx\" (UID: \"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2\") " pod="openshift-infra/auto-csr-approver-29550056-kpvwx" Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.407709 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndbx9\" (UniqueName: \"kubernetes.io/projected/97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2-kube-api-access-ndbx9\") pod \"auto-csr-approver-29550056-kpvwx\" (UID: \"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2\") " pod="openshift-infra/auto-csr-approver-29550056-kpvwx" Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.490362 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550056-kpvwx" Mar 08 20:56:00 crc kubenswrapper[4885]: I0308 20:56:00.774310 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550056-kpvwx"] Mar 08 20:56:00 crc kubenswrapper[4885]: W0308 20:56:00.778374 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97098aa1_1dc7_4efc_b2a2_0c0a97ae36f2.slice/crio-cbc45294638fc08f7cb3dd0fa22e35827672fcc9b5a9a562040cddcdd00058e4 WatchSource:0}: Error finding container cbc45294638fc08f7cb3dd0fa22e35827672fcc9b5a9a562040cddcdd00058e4: Status 404 returned error can't find the container with id cbc45294638fc08f7cb3dd0fa22e35827672fcc9b5a9a562040cddcdd00058e4 Mar 08 20:56:01 crc kubenswrapper[4885]: I0308 20:56:01.229430 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550056-kpvwx" event={"ID":"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2","Type":"ContainerStarted","Data":"cbc45294638fc08f7cb3dd0fa22e35827672fcc9b5a9a562040cddcdd00058e4"} Mar 08 20:56:02 crc kubenswrapper[4885]: I0308 20:56:02.242320 4885 generic.go:334] "Generic (PLEG): container finished" podID="97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2" containerID="06ba2614b2073ae88c1afd46a3629242eb9b0dfca6cc39c42c6f2b45e68e1af1" exitCode=0 Mar 08 20:56:02 crc kubenswrapper[4885]: I0308 20:56:02.242425 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550056-kpvwx" event={"ID":"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2","Type":"ContainerDied","Data":"06ba2614b2073ae88c1afd46a3629242eb9b0dfca6cc39c42c6f2b45e68e1af1"} Mar 08 20:56:03 crc kubenswrapper[4885]: I0308 20:56:03.647347 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550056-kpvwx" Mar 08 20:56:03 crc kubenswrapper[4885]: I0308 20:56:03.732697 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndbx9\" (UniqueName: \"kubernetes.io/projected/97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2-kube-api-access-ndbx9\") pod \"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2\" (UID: \"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2\") " Mar 08 20:56:03 crc kubenswrapper[4885]: I0308 20:56:03.739728 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2-kube-api-access-ndbx9" (OuterVolumeSpecName: "kube-api-access-ndbx9") pod "97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2" (UID: "97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2"). InnerVolumeSpecName "kube-api-access-ndbx9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:56:03 crc kubenswrapper[4885]: I0308 20:56:03.834400 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndbx9\" (UniqueName: \"kubernetes.io/projected/97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2-kube-api-access-ndbx9\") on node \"crc\" DevicePath \"\"" Mar 08 20:56:04 crc kubenswrapper[4885]: I0308 20:56:04.258631 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550056-kpvwx" event={"ID":"97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2","Type":"ContainerDied","Data":"cbc45294638fc08f7cb3dd0fa22e35827672fcc9b5a9a562040cddcdd00058e4"} Mar 08 20:56:04 crc kubenswrapper[4885]: I0308 20:56:04.258665 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbc45294638fc08f7cb3dd0fa22e35827672fcc9b5a9a562040cddcdd00058e4" Mar 08 20:56:04 crc kubenswrapper[4885]: I0308 20:56:04.258978 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550056-kpvwx" Mar 08 20:56:04 crc kubenswrapper[4885]: I0308 20:56:04.739897 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550050-xrlpj"] Mar 08 20:56:04 crc kubenswrapper[4885]: I0308 20:56:04.757318 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550050-xrlpj"] Mar 08 20:56:05 crc kubenswrapper[4885]: I0308 20:56:05.383748 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c364e354-f542-45ec-9322-125db18eb928" path="/var/lib/kubelet/pods/c364e354-f542-45ec-9322-125db18eb928/volumes" Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.526534 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n5bmd"] Mar 08 20:56:11 crc kubenswrapper[4885]: E0308 20:56:11.527624 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2" containerName="oc" Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.527645 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2" containerName="oc" Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.527950 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2" containerName="oc" Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.530240 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.546290 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n5bmd"] Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.671314 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-catalog-content\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.671415 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-utilities\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.671440 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh9jf\" (UniqueName: \"kubernetes.io/projected/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-kube-api-access-nh9jf\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.772780 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-utilities\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.772831 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh9jf\" (UniqueName: \"kubernetes.io/projected/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-kube-api-access-nh9jf\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.772976 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-catalog-content\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.773381 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-utilities\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.773470 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-catalog-content\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.810942 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nh9jf\" (UniqueName: \"kubernetes.io/projected/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-kube-api-access-nh9jf\") pod \"redhat-operators-n5bmd\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:11 crc kubenswrapper[4885]: I0308 20:56:11.880731 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:12 crc kubenswrapper[4885]: I0308 20:56:12.314683 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n5bmd"] Mar 08 20:56:13 crc kubenswrapper[4885]: I0308 20:56:13.337829 4885 generic.go:334] "Generic (PLEG): container finished" podID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerID="736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a" exitCode=0 Mar 08 20:56:13 crc kubenswrapper[4885]: I0308 20:56:13.337958 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5bmd" event={"ID":"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e","Type":"ContainerDied","Data":"736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a"} Mar 08 20:56:13 crc kubenswrapper[4885]: I0308 20:56:13.338081 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5bmd" event={"ID":"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e","Type":"ContainerStarted","Data":"4f37e648886bd997e7152ec15c819b52ba5354bfc7c1ad348fdee032439da0bf"} Mar 08 20:56:14 crc kubenswrapper[4885]: I0308 20:56:14.349035 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5bmd" event={"ID":"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e","Type":"ContainerStarted","Data":"8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350"} Mar 08 20:56:15 crc kubenswrapper[4885]: I0308 20:56:15.362767 4885 generic.go:334] "Generic (PLEG): container finished" podID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerID="8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350" exitCode=0 Mar 08 20:56:15 crc kubenswrapper[4885]: I0308 20:56:15.363228 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5bmd" event={"ID":"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e","Type":"ContainerDied","Data":"8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350"} Mar 08 20:56:16 crc kubenswrapper[4885]: I0308 20:56:16.373648 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5bmd" event={"ID":"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e","Type":"ContainerStarted","Data":"a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16"} Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.035104 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n5bmd" podStartSLOduration=5.410986352 podStartE2EDuration="8.035084233s" podCreationTimestamp="2026-03-08 20:56:11 +0000 UTC" firstStartedPulling="2026-03-08 20:56:13.342144691 +0000 UTC m=+5074.738198744" lastFinishedPulling="2026-03-08 20:56:15.966242572 +0000 UTC m=+5077.362296625" observedRunningTime="2026-03-08 20:56:16.412076739 +0000 UTC m=+5077.808130772" watchObservedRunningTime="2026-03-08 20:56:19.035084233 +0000 UTC m=+5080.431138256" Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.040143 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6mx7n"] Mar 08 20:56:19 crc 
kubenswrapper[4885]: I0308 20:56:19.042329 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.054584 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6mx7n"] Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.223077 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29kkf\" (UniqueName: \"kubernetes.io/projected/12ebdbbf-4422-4e98-acc5-cca08fcb3444-kube-api-access-29kkf\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.223288 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-catalog-content\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.223448 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-utilities\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.324533 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-utilities\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.324673 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29kkf\" (UniqueName: \"kubernetes.io/projected/12ebdbbf-4422-4e98-acc5-cca08fcb3444-kube-api-access-29kkf\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.324792 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-catalog-content\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.325245 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-utilities\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.325406 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-catalog-content\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n" Mar 
08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.343505 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29kkf\" (UniqueName: \"kubernetes.io/projected/12ebdbbf-4422-4e98-acc5-cca08fcb3444-kube-api-access-29kkf\") pod \"certified-operators-6mx7n\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.363988 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:19 crc kubenswrapper[4885]: I0308 20:56:19.880244 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6mx7n"] Mar 08 20:56:20 crc kubenswrapper[4885]: I0308 20:56:20.418760 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mx7n" event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerStarted","Data":"a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b"} Mar 08 20:56:20 crc kubenswrapper[4885]: I0308 20:56:20.419184 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mx7n" event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerStarted","Data":"0da2db770322b4cd9942fcab3f1443894aab6a4200b01789887cd5d5ab8a0923"} Mar 08 20:56:21 crc kubenswrapper[4885]: I0308 20:56:21.433302 4885 generic.go:334] "Generic (PLEG): container finished" podID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerID="a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b" exitCode=0 Mar 08 20:56:21 crc kubenswrapper[4885]: I0308 20:56:21.433389 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mx7n" event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerDied","Data":"a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b"} Mar 08 20:56:21 crc kubenswrapper[4885]: I0308 20:56:21.881761 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:21 crc kubenswrapper[4885]: I0308 20:56:21.883516 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:22 crc kubenswrapper[4885]: I0308 20:56:22.446630 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mx7n" event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerStarted","Data":"7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634"} Mar 08 20:56:22 crc kubenswrapper[4885]: I0308 20:56:22.965890 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-n5bmd" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="registry-server" probeResult="failure" output=< Mar 08 20:56:22 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 20:56:22 crc kubenswrapper[4885]: > Mar 08 20:56:23 crc kubenswrapper[4885]: I0308 20:56:23.477372 4885 generic.go:334] "Generic (PLEG): container finished" podID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerID="7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634" exitCode=0 Mar 08 20:56:23 crc kubenswrapper[4885]: I0308 20:56:23.477435 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mx7n" 
event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerDied","Data":"7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634"} Mar 08 20:56:24 crc kubenswrapper[4885]: I0308 20:56:24.490633 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mx7n" event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerStarted","Data":"d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4"} Mar 08 20:56:24 crc kubenswrapper[4885]: I0308 20:56:24.519323 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6mx7n" podStartSLOduration=3.083350574 podStartE2EDuration="5.519302029s" podCreationTimestamp="2026-03-08 20:56:19 +0000 UTC" firstStartedPulling="2026-03-08 20:56:21.436115425 +0000 UTC m=+5082.832169478" lastFinishedPulling="2026-03-08 20:56:23.87206687 +0000 UTC m=+5085.268120933" observedRunningTime="2026-03-08 20:56:24.51558267 +0000 UTC m=+5085.911636723" watchObservedRunningTime="2026-03-08 20:56:24.519302029 +0000 UTC m=+5085.915356052" Mar 08 20:56:29 crc kubenswrapper[4885]: I0308 20:56:29.364741 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:29 crc kubenswrapper[4885]: I0308 20:56:29.365352 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:29 crc kubenswrapper[4885]: I0308 20:56:29.440781 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:29 crc kubenswrapper[4885]: I0308 20:56:29.611546 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:29 crc kubenswrapper[4885]: I0308 20:56:29.690599 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6mx7n"] Mar 08 20:56:31 crc kubenswrapper[4885]: I0308 20:56:31.556031 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6mx7n" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerName="registry-server" containerID="cri-o://d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4" gracePeriod=2 Mar 08 20:56:31 crc kubenswrapper[4885]: I0308 20:56:31.948175 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.014281 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.051389 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.159763 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29kkf\" (UniqueName: \"kubernetes.io/projected/12ebdbbf-4422-4e98-acc5-cca08fcb3444-kube-api-access-29kkf\") pod \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.160041 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-utilities\") pod \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.160786 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-catalog-content\") pod \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\" (UID: \"12ebdbbf-4422-4e98-acc5-cca08fcb3444\") " Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.161022 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-utilities" (OuterVolumeSpecName: "utilities") pod "12ebdbbf-4422-4e98-acc5-cca08fcb3444" (UID: "12ebdbbf-4422-4e98-acc5-cca08fcb3444"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.161471 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.165342 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12ebdbbf-4422-4e98-acc5-cca08fcb3444-kube-api-access-29kkf" (OuterVolumeSpecName: "kube-api-access-29kkf") pod "12ebdbbf-4422-4e98-acc5-cca08fcb3444" (UID: "12ebdbbf-4422-4e98-acc5-cca08fcb3444"). InnerVolumeSpecName "kube-api-access-29kkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.220344 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "12ebdbbf-4422-4e98-acc5-cca08fcb3444" (UID: "12ebdbbf-4422-4e98-acc5-cca08fcb3444"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.263072 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29kkf\" (UniqueName: \"kubernetes.io/projected/12ebdbbf-4422-4e98-acc5-cca08fcb3444-kube-api-access-29kkf\") on node \"crc\" DevicePath \"\"" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.263116 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/12ebdbbf-4422-4e98-acc5-cca08fcb3444-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.567674 4885 generic.go:334] "Generic (PLEG): container finished" podID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerID="d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4" exitCode=0 Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.567828 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6mx7n" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.567854 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mx7n" event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerDied","Data":"d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4"} Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.567956 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6mx7n" event={"ID":"12ebdbbf-4422-4e98-acc5-cca08fcb3444","Type":"ContainerDied","Data":"0da2db770322b4cd9942fcab3f1443894aab6a4200b01789887cd5d5ab8a0923"} Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.567990 4885 scope.go:117] "RemoveContainer" containerID="d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.609389 4885 scope.go:117] "RemoveContainer" containerID="7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.640640 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6mx7n"] Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.657619 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6mx7n"] Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.660193 4885 scope.go:117] "RemoveContainer" containerID="a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.708860 4885 scope.go:117] "RemoveContainer" containerID="d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4" Mar 08 20:56:32 crc kubenswrapper[4885]: E0308 20:56:32.711631 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4\": container with ID starting with d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4 not found: ID does not exist" containerID="d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.711688 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4"} err="failed to get container status 
\"d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4\": rpc error: code = NotFound desc = could not find container \"d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4\": container with ID starting with d98d413b438d54525a72e59b7155550d034d3bacf00efa8a18397c7c799147c4 not found: ID does not exist" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.711720 4885 scope.go:117] "RemoveContainer" containerID="7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634" Mar 08 20:56:32 crc kubenswrapper[4885]: E0308 20:56:32.712424 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634\": container with ID starting with 7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634 not found: ID does not exist" containerID="7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.712491 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634"} err="failed to get container status \"7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634\": rpc error: code = NotFound desc = could not find container \"7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634\": container with ID starting with 7e7e35ed64b25708fa2001fed0e82f28f68e02c92a5c42837c150bda50097634 not found: ID does not exist" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.712533 4885 scope.go:117] "RemoveContainer" containerID="a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b" Mar 08 20:56:32 crc kubenswrapper[4885]: E0308 20:56:32.713162 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b\": container with ID starting with a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b not found: ID does not exist" containerID="a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.713205 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b"} err="failed to get container status \"a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b\": rpc error: code = NotFound desc = could not find container \"a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b\": container with ID starting with a38f510d529a6c58d2df186226f088f4ed82accfca11a11bfaf380699265141b not found: ID does not exist" Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.818423 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:56:32 crc kubenswrapper[4885]: I0308 20:56:32.818502 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 
20:56:33 crc kubenswrapper[4885]: I0308 20:56:33.385155 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" path="/var/lib/kubelet/pods/12ebdbbf-4422-4e98-acc5-cca08fcb3444/volumes" Mar 08 20:56:33 crc kubenswrapper[4885]: I0308 20:56:33.894165 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n5bmd"] Mar 08 20:56:33 crc kubenswrapper[4885]: I0308 20:56:33.894588 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n5bmd" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="registry-server" containerID="cri-o://a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16" gracePeriod=2 Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.379841 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.497705 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-utilities\") pod \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.497889 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh9jf\" (UniqueName: \"kubernetes.io/projected/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-kube-api-access-nh9jf\") pod \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.498004 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-catalog-content\") pod \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\" (UID: \"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e\") " Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.498554 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-utilities" (OuterVolumeSpecName: "utilities") pod "956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" (UID: "956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.501517 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-kube-api-access-nh9jf" (OuterVolumeSpecName: "kube-api-access-nh9jf") pod "956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" (UID: "956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e"). InnerVolumeSpecName "kube-api-access-nh9jf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.586134 4885 generic.go:334] "Generic (PLEG): container finished" podID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerID="a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16" exitCode=0 Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.586185 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5bmd" event={"ID":"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e","Type":"ContainerDied","Data":"a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16"} Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.586210 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n5bmd" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.586237 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n5bmd" event={"ID":"956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e","Type":"ContainerDied","Data":"4f37e648886bd997e7152ec15c819b52ba5354bfc7c1ad348fdee032439da0bf"} Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.586256 4885 scope.go:117] "RemoveContainer" containerID="a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.599893 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.599934 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh9jf\" (UniqueName: \"kubernetes.io/projected/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-kube-api-access-nh9jf\") on node \"crc\" DevicePath \"\"" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.605966 4885 scope.go:117] "RemoveContainer" containerID="8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.625552 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" (UID: "956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.641468 4885 scope.go:117] "RemoveContainer" containerID="736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.666755 4885 scope.go:117] "RemoveContainer" containerID="a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16" Mar 08 20:56:34 crc kubenswrapper[4885]: E0308 20:56:34.667250 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16\": container with ID starting with a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16 not found: ID does not exist" containerID="a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.667288 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16"} err="failed to get container status \"a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16\": rpc error: code = NotFound desc = could not find container \"a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16\": container with ID starting with a09eca83dc906a33181103bff9c1878eb4d999bb8d2046bac895892190264d16 not found: ID does not exist" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.667313 4885 scope.go:117] "RemoveContainer" containerID="8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350" Mar 08 20:56:34 crc kubenswrapper[4885]: E0308 20:56:34.667801 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350\": container with ID starting with 8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350 not found: ID does not exist" containerID="8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.667870 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350"} err="failed to get container status \"8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350\": rpc error: code = NotFound desc = could not find container \"8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350\": container with ID starting with 8d077fcfa0fd946c830ec8276c622994768bd2bf21c66e83a36bc016b4336350 not found: ID does not exist" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.667958 4885 scope.go:117] "RemoveContainer" containerID="736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a" Mar 08 20:56:34 crc kubenswrapper[4885]: E0308 20:56:34.668386 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a\": container with ID starting with 736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a not found: ID does not exist" containerID="736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.668427 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a"} err="failed to get container status \"736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a\": rpc error: code = NotFound desc = could not find container \"736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a\": container with ID starting with 736e16745163c74cb256232a661b35370f9f024da75539a82804a9bde6a5924a not found: ID does not exist" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.702058 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.937151 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n5bmd"] Mar 08 20:56:34 crc kubenswrapper[4885]: I0308 20:56:34.947942 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n5bmd"] Mar 08 20:56:35 crc kubenswrapper[4885]: I0308 20:56:35.384386 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" path="/var/lib/kubelet/pods/956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e/volumes" Mar 08 20:57:02 crc kubenswrapper[4885]: I0308 20:57:02.818650 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:57:02 crc kubenswrapper[4885]: I0308 20:57:02.820679 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:57:04 crc kubenswrapper[4885]: I0308 20:57:04.160170 4885 scope.go:117] "RemoveContainer" containerID="00728729b1a54767f7bf5ead12746c7b79c9d4ef7c28991788ed27e082f5ef87" Mar 08 20:57:32 crc kubenswrapper[4885]: I0308 20:57:32.818768 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 20:57:32 crc kubenswrapper[4885]: I0308 20:57:32.819417 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 20:57:32 crc kubenswrapper[4885]: I0308 20:57:32.819502 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 20:57:32 crc kubenswrapper[4885]: I0308 20:57:32.820703 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b63b0ab95208c6fa0889efffa8bab4195db0658e5a1af71b9988f5d8b91fa038"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 20:57:32 crc kubenswrapper[4885]: I0308 20:57:32.820834 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://b63b0ab95208c6fa0889efffa8bab4195db0658e5a1af71b9988f5d8b91fa038" gracePeriod=600 Mar 08 20:57:33 crc kubenswrapper[4885]: I0308 20:57:33.116435 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="b63b0ab95208c6fa0889efffa8bab4195db0658e5a1af71b9988f5d8b91fa038" exitCode=0 Mar 08 20:57:33 crc kubenswrapper[4885]: I0308 20:57:33.116510 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"b63b0ab95208c6fa0889efffa8bab4195db0658e5a1af71b9988f5d8b91fa038"} Mar 08 20:57:33 crc kubenswrapper[4885]: I0308 20:57:33.116559 4885 scope.go:117] "RemoveContainer" containerID="0c6030e1b9921e4eb0853c1f95f32469063a55675f5d6d7b96fdb6b8a7c4d3a6" Mar 08 20:57:34 crc kubenswrapper[4885]: I0308 20:57:34.134040 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080"} Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.179844 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550058-z6lnn"] Mar 08 20:58:00 crc kubenswrapper[4885]: E0308 20:58:00.181344 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="extract-utilities" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.181410 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="extract-utilities" Mar 08 20:58:00 crc kubenswrapper[4885]: E0308 20:58:00.181453 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="registry-server" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.181475 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="registry-server" Mar 08 20:58:00 crc kubenswrapper[4885]: E0308 20:58:00.181497 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="extract-content" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.181515 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="extract-content" Mar 08 20:58:00 crc kubenswrapper[4885]: E0308 20:58:00.181561 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerName="extract-utilities" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.181578 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerName="extract-utilities" Mar 08 20:58:00 crc kubenswrapper[4885]: E0308 20:58:00.181609 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" 
containerName="registry-server" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.181629 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerName="registry-server" Mar 08 20:58:00 crc kubenswrapper[4885]: E0308 20:58:00.181649 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerName="extract-content" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.181668 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerName="extract-content" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.182094 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="956e8ead-9d7e-4d7b-aa80-4e1f3e0bad1e" containerName="registry-server" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.182135 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="12ebdbbf-4422-4e98-acc5-cca08fcb3444" containerName="registry-server" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.182821 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.191092 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.194449 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.195286 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.200224 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550058-z6lnn"] Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.229456 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kmvj\" (UniqueName: \"kubernetes.io/projected/95f14f7f-4dec-4d9d-a320-7a5c927d4983-kube-api-access-2kmvj\") pod \"auto-csr-approver-29550058-z6lnn\" (UID: \"95f14f7f-4dec-4d9d-a320-7a5c927d4983\") " pod="openshift-infra/auto-csr-approver-29550058-z6lnn" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.331345 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kmvj\" (UniqueName: \"kubernetes.io/projected/95f14f7f-4dec-4d9d-a320-7a5c927d4983-kube-api-access-2kmvj\") pod \"auto-csr-approver-29550058-z6lnn\" (UID: \"95f14f7f-4dec-4d9d-a320-7a5c927d4983\") " pod="openshift-infra/auto-csr-approver-29550058-z6lnn" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.358207 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kmvj\" (UniqueName: \"kubernetes.io/projected/95f14f7f-4dec-4d9d-a320-7a5c927d4983-kube-api-access-2kmvj\") pod \"auto-csr-approver-29550058-z6lnn\" (UID: \"95f14f7f-4dec-4d9d-a320-7a5c927d4983\") " pod="openshift-infra/auto-csr-approver-29550058-z6lnn" Mar 08 20:58:00 crc kubenswrapper[4885]: I0308 20:58:00.509785 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" Mar 08 20:58:01 crc kubenswrapper[4885]: I0308 20:58:01.001092 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550058-z6lnn"] Mar 08 20:58:01 crc kubenswrapper[4885]: I0308 20:58:01.172035 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 20:58:01 crc kubenswrapper[4885]: I0308 20:58:01.387809 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" event={"ID":"95f14f7f-4dec-4d9d-a320-7a5c927d4983","Type":"ContainerStarted","Data":"10b28d86c25bcf38b02ed90d52c136f2c9819b6432c2748afb6d02e50c08f796"} Mar 08 20:58:02 crc kubenswrapper[4885]: I0308 20:58:02.397650 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" event={"ID":"95f14f7f-4dec-4d9d-a320-7a5c927d4983","Type":"ContainerStarted","Data":"21b8174ae95621e7d89055b3e4716d5a83a8f7fb3dd103300c6b0dc26e415bb4"} Mar 08 20:58:02 crc kubenswrapper[4885]: I0308 20:58:02.418430 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" podStartSLOduration=1.598702472 podStartE2EDuration="2.418404699s" podCreationTimestamp="2026-03-08 20:58:00 +0000 UTC" firstStartedPulling="2026-03-08 20:58:01.171752288 +0000 UTC m=+5182.567806311" lastFinishedPulling="2026-03-08 20:58:01.991454475 +0000 UTC m=+5183.387508538" observedRunningTime="2026-03-08 20:58:02.414269099 +0000 UTC m=+5183.810323132" watchObservedRunningTime="2026-03-08 20:58:02.418404699 +0000 UTC m=+5183.814458762" Mar 08 20:58:03 crc kubenswrapper[4885]: I0308 20:58:03.406486 4885 generic.go:334] "Generic (PLEG): container finished" podID="95f14f7f-4dec-4d9d-a320-7a5c927d4983" containerID="21b8174ae95621e7d89055b3e4716d5a83a8f7fb3dd103300c6b0dc26e415bb4" exitCode=0 Mar 08 20:58:03 crc kubenswrapper[4885]: I0308 20:58:03.406531 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" event={"ID":"95f14f7f-4dec-4d9d-a320-7a5c927d4983","Type":"ContainerDied","Data":"21b8174ae95621e7d89055b3e4716d5a83a8f7fb3dd103300c6b0dc26e415bb4"} Mar 08 20:58:04 crc kubenswrapper[4885]: I0308 20:58:04.296357 4885 scope.go:117] "RemoveContainer" containerID="320469e7a132de2b538abe239f05e1a393daef2ccce9780de0687281e071e2ef" Mar 08 20:58:04 crc kubenswrapper[4885]: I0308 20:58:04.750970 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" Mar 08 20:58:04 crc kubenswrapper[4885]: I0308 20:58:04.827280 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kmvj\" (UniqueName: \"kubernetes.io/projected/95f14f7f-4dec-4d9d-a320-7a5c927d4983-kube-api-access-2kmvj\") pod \"95f14f7f-4dec-4d9d-a320-7a5c927d4983\" (UID: \"95f14f7f-4dec-4d9d-a320-7a5c927d4983\") " Mar 08 20:58:04 crc kubenswrapper[4885]: I0308 20:58:04.835684 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95f14f7f-4dec-4d9d-a320-7a5c927d4983-kube-api-access-2kmvj" (OuterVolumeSpecName: "kube-api-access-2kmvj") pod "95f14f7f-4dec-4d9d-a320-7a5c927d4983" (UID: "95f14f7f-4dec-4d9d-a320-7a5c927d4983"). InnerVolumeSpecName "kube-api-access-2kmvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:58:04 crc kubenswrapper[4885]: I0308 20:58:04.930715 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kmvj\" (UniqueName: \"kubernetes.io/projected/95f14f7f-4dec-4d9d-a320-7a5c927d4983-kube-api-access-2kmvj\") on node \"crc\" DevicePath \"\"" Mar 08 20:58:05 crc kubenswrapper[4885]: I0308 20:58:05.430264 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" event={"ID":"95f14f7f-4dec-4d9d-a320-7a5c927d4983","Type":"ContainerDied","Data":"10b28d86c25bcf38b02ed90d52c136f2c9819b6432c2748afb6d02e50c08f796"} Mar 08 20:58:05 crc kubenswrapper[4885]: I0308 20:58:05.430320 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10b28d86c25bcf38b02ed90d52c136f2c9819b6432c2748afb6d02e50c08f796" Mar 08 20:58:05 crc kubenswrapper[4885]: I0308 20:58:05.430396 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550058-z6lnn" Mar 08 20:58:05 crc kubenswrapper[4885]: I0308 20:58:05.499661 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550052-jkg7s"] Mar 08 20:58:05 crc kubenswrapper[4885]: I0308 20:58:05.510302 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550052-jkg7s"] Mar 08 20:58:07 crc kubenswrapper[4885]: I0308 20:58:07.394064 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2414c1e7-ce59-4c76-865d-1a5ffa71578f" path="/var/lib/kubelet/pods/2414c1e7-ce59-4c76-865d-1a5ffa71578f/volumes" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.468606 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Mar 08 20:58:09 crc kubenswrapper[4885]: E0308 20:58:09.469335 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f14f7f-4dec-4d9d-a320-7a5c927d4983" containerName="oc" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.469352 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f14f7f-4dec-4d9d-a320-7a5c927d4983" containerName="oc" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.469660 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="95f14f7f-4dec-4d9d-a320-7a5c927d4983" containerName="oc" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.470270 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.472790 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xp44w" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.476136 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.509779 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scrbk\" (UniqueName: \"kubernetes.io/projected/1a10ccbd-e30c-478f-84a4-c869a8cd0924-kube-api-access-scrbk\") pod \"mariadb-copy-data\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.509840 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\") pod \"mariadb-copy-data\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.611648 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scrbk\" (UniqueName: \"kubernetes.io/projected/1a10ccbd-e30c-478f-84a4-c869a8cd0924-kube-api-access-scrbk\") pod \"mariadb-copy-data\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.612103 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\") pod \"mariadb-copy-data\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.615833 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.615894 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\") pod \"mariadb-copy-data\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/acd8297510132edeb9d5328b0d06a30d4f8877acc00b01d77a9c3ca4476f150c/globalmount\"" pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.640461 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scrbk\" (UniqueName: \"kubernetes.io/projected/1a10ccbd-e30c-478f-84a4-c869a8cd0924-kube-api-access-scrbk\") pod \"mariadb-copy-data\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.658676 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\") pod \"mariadb-copy-data\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " pod="openstack/mariadb-copy-data" Mar 08 20:58:09 crc kubenswrapper[4885]: I0308 20:58:09.797011 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 08 20:58:10 crc kubenswrapper[4885]: I0308 20:58:10.135534 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Mar 08 20:58:10 crc kubenswrapper[4885]: W0308 20:58:10.147598 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a10ccbd_e30c_478f_84a4_c869a8cd0924.slice/crio-035005254ed1ada4ad851bbbcd646012f92c499c388cf96db3c6e6fbc8bfb688 WatchSource:0}: Error finding container 035005254ed1ada4ad851bbbcd646012f92c499c388cf96db3c6e6fbc8bfb688: Status 404 returned error can't find the container with id 035005254ed1ada4ad851bbbcd646012f92c499c388cf96db3c6e6fbc8bfb688 Mar 08 20:58:10 crc kubenswrapper[4885]: I0308 20:58:10.474137 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"1a10ccbd-e30c-478f-84a4-c869a8cd0924","Type":"ContainerStarted","Data":"ece949f19629c28be550e718314e58a7cadf1e2c5464306896ed9854368d9a69"} Mar 08 20:58:10 crc kubenswrapper[4885]: I0308 20:58:10.474223 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"1a10ccbd-e30c-478f-84a4-c869a8cd0924","Type":"ContainerStarted","Data":"035005254ed1ada4ad851bbbcd646012f92c499c388cf96db3c6e6fbc8bfb688"} Mar 08 20:58:10 crc kubenswrapper[4885]: I0308 20:58:10.504035 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.50400708 podStartE2EDuration="2.50400708s" podCreationTimestamp="2026-03-08 20:58:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:58:10.500329812 +0000 UTC m=+5191.896383865" watchObservedRunningTime="2026-03-08 20:58:10.50400708 +0000 UTC m=+5191.900061133" Mar 08 20:58:13 crc kubenswrapper[4885]: I0308 20:58:13.693266 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:13 crc 
kubenswrapper[4885]: I0308 20:58:13.694665 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:13 crc kubenswrapper[4885]: I0308 20:58:13.710810 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:13 crc kubenswrapper[4885]: I0308 20:58:13.887605 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwsz7\" (UniqueName: \"kubernetes.io/projected/ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f-kube-api-access-hwsz7\") pod \"mariadb-client\" (UID: \"ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f\") " pod="openstack/mariadb-client" Mar 08 20:58:13 crc kubenswrapper[4885]: I0308 20:58:13.989451 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwsz7\" (UniqueName: \"kubernetes.io/projected/ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f-kube-api-access-hwsz7\") pod \"mariadb-client\" (UID: \"ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f\") " pod="openstack/mariadb-client" Mar 08 20:58:14 crc kubenswrapper[4885]: I0308 20:58:14.361947 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwsz7\" (UniqueName: \"kubernetes.io/projected/ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f-kube-api-access-hwsz7\") pod \"mariadb-client\" (UID: \"ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f\") " pod="openstack/mariadb-client" Mar 08 20:58:14 crc kubenswrapper[4885]: I0308 20:58:14.659405 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:15 crc kubenswrapper[4885]: I0308 20:58:15.205174 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:15 crc kubenswrapper[4885]: W0308 20:58:15.211173 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffbeea9e_d5fb_42aa_8ad9_a687fc50c96f.slice/crio-90150457faeb182e5ccb269e6333eb0fa773cc31eb2e3ee37ea0bf490081de89 WatchSource:0}: Error finding container 90150457faeb182e5ccb269e6333eb0fa773cc31eb2e3ee37ea0bf490081de89: Status 404 returned error can't find the container with id 90150457faeb182e5ccb269e6333eb0fa773cc31eb2e3ee37ea0bf490081de89 Mar 08 20:58:15 crc kubenswrapper[4885]: I0308 20:58:15.528526 4885 generic.go:334] "Generic (PLEG): container finished" podID="ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f" containerID="cb3b0cf9a33a47c0e77a72a5c33129d359ae195a6b6e13ab819d24d148ae74bb" exitCode=0 Mar 08 20:58:15 crc kubenswrapper[4885]: I0308 20:58:15.528586 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f","Type":"ContainerDied","Data":"cb3b0cf9a33a47c0e77a72a5c33129d359ae195a6b6e13ab819d24d148ae74bb"} Mar 08 20:58:15 crc kubenswrapper[4885]: I0308 20:58:15.528629 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f","Type":"ContainerStarted","Data":"90150457faeb182e5ccb269e6333eb0fa773cc31eb2e3ee37ea0bf490081de89"} Mar 08 20:58:16 crc kubenswrapper[4885]: I0308 20:58:16.933979 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:16 crc kubenswrapper[4885]: I0308 20:58:16.959330 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f/mariadb-client/0.log" Mar 08 20:58:16 crc kubenswrapper[4885]: I0308 20:58:16.989062 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:16 crc kubenswrapper[4885]: I0308 20:58:16.995392 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.042835 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwsz7\" (UniqueName: \"kubernetes.io/projected/ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f-kube-api-access-hwsz7\") pod \"ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f\" (UID: \"ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f\") " Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.051102 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f-kube-api-access-hwsz7" (OuterVolumeSpecName: "kube-api-access-hwsz7") pod "ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f" (UID: "ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f"). InnerVolumeSpecName "kube-api-access-hwsz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.114458 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:17 crc kubenswrapper[4885]: E0308 20:58:17.115009 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f" containerName="mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.115042 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f" containerName="mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.115290 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f" containerName="mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.116106 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.145612 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.151912 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwsz7\" (UniqueName: \"kubernetes.io/projected/ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f-kube-api-access-hwsz7\") on node \"crc\" DevicePath \"\"" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.253280 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vx74\" (UniqueName: \"kubernetes.io/projected/030a880e-43ba-49b0-a593-248f9c58df16-kube-api-access-8vx74\") pod \"mariadb-client\" (UID: \"030a880e-43ba-49b0-a593-248f9c58df16\") " pod="openstack/mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.354749 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vx74\" (UniqueName: \"kubernetes.io/projected/030a880e-43ba-49b0-a593-248f9c58df16-kube-api-access-8vx74\") pod \"mariadb-client\" (UID: \"030a880e-43ba-49b0-a593-248f9c58df16\") " pod="openstack/mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.378501 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f" path="/var/lib/kubelet/pods/ffbeea9e-d5fb-42aa-8ad9-a687fc50c96f/volumes" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.380310 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vx74\" (UniqueName: \"kubernetes.io/projected/030a880e-43ba-49b0-a593-248f9c58df16-kube-api-access-8vx74\") pod \"mariadb-client\" (UID: \"030a880e-43ba-49b0-a593-248f9c58df16\") " pod="openstack/mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.485625 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.543538 4885 scope.go:117] "RemoveContainer" containerID="cb3b0cf9a33a47c0e77a72a5c33129d359ae195a6b6e13ab819d24d148ae74bb" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.543646 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:17 crc kubenswrapper[4885]: I0308 20:58:17.912384 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:17 crc kubenswrapper[4885]: W0308 20:58:17.925175 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod030a880e_43ba_49b0_a593_248f9c58df16.slice/crio-dea38899acf042e904b04c9a685602651f109b76aefcda3b6e5e833251f9ed23 WatchSource:0}: Error finding container dea38899acf042e904b04c9a685602651f109b76aefcda3b6e5e833251f9ed23: Status 404 returned error can't find the container with id dea38899acf042e904b04c9a685602651f109b76aefcda3b6e5e833251f9ed23 Mar 08 20:58:18 crc kubenswrapper[4885]: I0308 20:58:18.553004 4885 generic.go:334] "Generic (PLEG): container finished" podID="030a880e-43ba-49b0-a593-248f9c58df16" containerID="7d907d9b611e54cf97509bd0480731a1868d207218e0f55feb360b3b591d95c2" exitCode=0 Mar 08 20:58:18 crc kubenswrapper[4885]: I0308 20:58:18.553055 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"030a880e-43ba-49b0-a593-248f9c58df16","Type":"ContainerDied","Data":"7d907d9b611e54cf97509bd0480731a1868d207218e0f55feb360b3b591d95c2"} Mar 08 20:58:18 crc kubenswrapper[4885]: I0308 20:58:18.553302 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"030a880e-43ba-49b0-a593-248f9c58df16","Type":"ContainerStarted","Data":"dea38899acf042e904b04c9a685602651f109b76aefcda3b6e5e833251f9ed23"} Mar 08 20:58:19 crc kubenswrapper[4885]: I0308 20:58:19.968470 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:19 crc kubenswrapper[4885]: I0308 20:58:19.993266 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_030a880e-43ba-49b0-a593-248f9c58df16/mariadb-client/0.log" Mar 08 20:58:20 crc kubenswrapper[4885]: I0308 20:58:20.007584 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vx74\" (UniqueName: \"kubernetes.io/projected/030a880e-43ba-49b0-a593-248f9c58df16-kube-api-access-8vx74\") pod \"030a880e-43ba-49b0-a593-248f9c58df16\" (UID: \"030a880e-43ba-49b0-a593-248f9c58df16\") " Mar 08 20:58:20 crc kubenswrapper[4885]: I0308 20:58:20.016347 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/030a880e-43ba-49b0-a593-248f9c58df16-kube-api-access-8vx74" (OuterVolumeSpecName: "kube-api-access-8vx74") pod "030a880e-43ba-49b0-a593-248f9c58df16" (UID: "030a880e-43ba-49b0-a593-248f9c58df16"). InnerVolumeSpecName "kube-api-access-8vx74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:58:20 crc kubenswrapper[4885]: I0308 20:58:20.031828 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:20 crc kubenswrapper[4885]: I0308 20:58:20.043470 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Mar 08 20:58:20 crc kubenswrapper[4885]: I0308 20:58:20.109136 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vx74\" (UniqueName: \"kubernetes.io/projected/030a880e-43ba-49b0-a593-248f9c58df16-kube-api-access-8vx74\") on node \"crc\" DevicePath \"\"" Mar 08 20:58:20 crc kubenswrapper[4885]: I0308 20:58:20.574900 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dea38899acf042e904b04c9a685602651f109b76aefcda3b6e5e833251f9ed23" Mar 08 20:58:20 crc kubenswrapper[4885]: I0308 20:58:20.575091 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Mar 08 20:58:21 crc kubenswrapper[4885]: I0308 20:58:21.386418 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="030a880e-43ba-49b0-a593-248f9c58df16" path="/var/lib/kubelet/pods/030a880e-43ba-49b0-a593-248f9c58df16/volumes" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.213419 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 20:58:59 crc kubenswrapper[4885]: E0308 20:58:59.216637 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="030a880e-43ba-49b0-a593-248f9c58df16" containerName="mariadb-client" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.216891 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="030a880e-43ba-49b0-a593-248f9c58df16" containerName="mariadb-client" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.218843 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="030a880e-43ba-49b0-a593-248f9c58df16" containerName="mariadb-client" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.220689 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.229286 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.230448 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-h588k" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.237855 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.245617 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.266345 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.267831 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.284795 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.286471 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.303068 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.316473 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.319669 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.319825 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.319911 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hn2k\" (UniqueName: \"kubernetes.io/projected/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-kube-api-access-5hn2k\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.319950 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.320218 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.320293 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-config\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421480 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421550 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11852e05-e4cd-4884-b382-035694906263-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421608 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cdde225-3478-4566-9019-df846ce962fb-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421655 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rtrp\" (UniqueName: \"kubernetes.io/projected/11852e05-e4cd-4884-b382-035694906263-kube-api-access-4rtrp\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421800 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421843 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11852e05-e4cd-4884-b382-035694906263-config\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421884 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1cdde225-3478-4566-9019-df846ce962fb-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.421965 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422020 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltn22\" (UniqueName: \"kubernetes.io/projected/1cdde225-3478-4566-9019-df846ce962fb-kube-api-access-ltn22\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422069 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11852e05-e4cd-4884-b382-035694906263-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422112 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11852e05-e4cd-4884-b382-035694906263-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422316 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422409 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hn2k\" (UniqueName: \"kubernetes.io/projected/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-kube-api-access-5hn2k\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422460 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422562 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cdde225-3478-4566-9019-df846ce962fb-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422652 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-config\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422695 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.422735 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cdde225-3478-4566-9019-df846ce962fb-config\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.424267 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.424312 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-config\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.424668 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 
20:58:59.429895 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.429984 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ca9173663b5e8869f2a1d06ba2f0d2643b686ec230901091c64e10437141f124/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.439695 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.447139 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.449650 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.454702 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-8wnqg" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.455108 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.455343 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.464043 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hn2k\" (UniqueName: \"kubernetes.io/projected/2c0a1292-7594-49d3-b3f0-2e1a6aa004e2-kube-api-access-5hn2k\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.473693 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.476817 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.486410 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.490078 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.512325 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-50b4e568-d9c1-4ae2-8e15-c5948a9ad0a7\") pod \"ovsdbserver-sb-0\" (UID: \"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2\") " pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.513000 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.519140 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.523914 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltn22\" (UniqueName: \"kubernetes.io/projected/1cdde225-3478-4566-9019-df846ce962fb-kube-api-access-ltn22\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.523985 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11852e05-e4cd-4884-b382-035694906263-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524003 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11852e05-e4cd-4884-b382-035694906263-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524039 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd9461e-0196-4eaf-a733-44340b19d354-config\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524061 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524110 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524132 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cdde225-3478-4566-9019-df846ce962fb-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524150 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebd9461e-0196-4eaf-a733-44340b19d354-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524174 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cdde225-3478-4566-9019-df846ce962fb-config\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524190 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb9d4\" (UniqueName: \"kubernetes.io/projected/ebd9461e-0196-4eaf-a733-44340b19d354-kube-api-access-sb9d4\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524222 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11852e05-e4cd-4884-b382-035694906263-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524260 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cdde225-3478-4566-9019-df846ce962fb-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524282 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rtrp\" (UniqueName: \"kubernetes.io/projected/11852e05-e4cd-4884-b382-035694906263-kube-api-access-4rtrp\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524308 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebd9461e-0196-4eaf-a733-44340b19d354-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524328 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd9461e-0196-4eaf-a733-44340b19d354-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524357 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524375 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11852e05-e4cd-4884-b382-035694906263-config\") pod 
\"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524398 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1cdde225-3478-4566-9019-df846ce962fb-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.524798 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1cdde225-3478-4566-9019-df846ce962fb-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.525457 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/11852e05-e4cd-4884-b382-035694906263-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.525705 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.526473 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cdde225-3478-4566-9019-df846ce962fb-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.526676 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1cdde225-3478-4566-9019-df846ce962fb-config\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.529491 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11852e05-e4cd-4884-b382-035694906263-config\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.530423 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/11852e05-e4cd-4884-b382-035694906263-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.536563 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.536592 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8ebd3a6eda502377f605d32caed44b01abe458065c6f1cd0a759ff1bb8cc7eb5/globalmount\"" pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.536934 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cdde225-3478-4566-9019-df846ce962fb-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.540042 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.540062 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6b8422d04854d1c8065197f1c0893c01a2f78fb0cd1fc2b94e00e55b33d539b5/globalmount\"" pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.541487 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11852e05-e4cd-4884-b382-035694906263-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.545200 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rtrp\" (UniqueName: \"kubernetes.io/projected/11852e05-e4cd-4884-b382-035694906263-kube-api-access-4rtrp\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.546333 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltn22\" (UniqueName: \"kubernetes.io/projected/1cdde225-3478-4566-9019-df846ce962fb-kube-api-access-ltn22\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.559392 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.566428 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-efc19fdb-fb0a-4bdc-b7dd-551700d80d2a\") pod \"ovsdbserver-sb-1\" (UID: \"1cdde225-3478-4566-9019-df846ce962fb\") " pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.567119 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b592becd-3b5f-4322-ba0d-2c55140a0322\") pod \"ovsdbserver-sb-2\" (UID: \"11852e05-e4cd-4884-b382-035694906263\") " pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.586512 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.604396 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.625626 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eca16e2-6962-4cad-9cbb-23d33af9c10a-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626017 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd9461e-0196-4eaf-a733-44340b19d354-config\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626168 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626223 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626281 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebd9461e-0196-4eaf-a733-44340b19d354-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626316 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d605915b-24f4-45ec-bb13-7e7097bb288b-config\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626382 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb9d4\" (UniqueName: \"kubernetes.io/projected/ebd9461e-0196-4eaf-a733-44340b19d354-kube-api-access-sb9d4\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626478 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcrpr\" (UniqueName: \"kubernetes.io/projected/4eca16e2-6962-4cad-9cbb-23d33af9c10a-kube-api-access-pcrpr\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626541 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d605915b-24f4-45ec-bb13-7e7097bb288b-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626590 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eca16e2-6962-4cad-9cbb-23d33af9c10a-config\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.626824 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ebd9461e-0196-4eaf-a733-44340b19d354-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.627704 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd9461e-0196-4eaf-a733-44340b19d354-config\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.628551 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.628591 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f5ed38bef82799b6681aaf26cc928b27424b3c057ebb6776ac2ea2fccf1a63e7/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629345 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d605915b-24f4-45ec-bb13-7e7097bb288b-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629438 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629495 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmrh6\" (UniqueName: \"kubernetes.io/projected/d605915b-24f4-45ec-bb13-7e7097bb288b-kube-api-access-mmrh6\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629530 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d605915b-24f4-45ec-bb13-7e7097bb288b-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629586 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebd9461e-0196-4eaf-a733-44340b19d354-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629634 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd9461e-0196-4eaf-a733-44340b19d354-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629670 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eca16e2-6962-4cad-9cbb-23d33af9c10a-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.629707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4eca16e2-6962-4cad-9cbb-23d33af9c10a-ovsdb-rundir\") 
pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.631690 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ebd9461e-0196-4eaf-a733-44340b19d354-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.634735 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebd9461e-0196-4eaf-a733-44340b19d354-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.647228 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb9d4\" (UniqueName: \"kubernetes.io/projected/ebd9461e-0196-4eaf-a733-44340b19d354-kube-api-access-sb9d4\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.684618 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9d605aec-81c7-4deb-b4fc-065e179878b9\") pod \"ovsdbserver-nb-0\" (UID: \"ebd9461e-0196-4eaf-a733-44340b19d354\") " pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730666 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730720 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d605915b-24f4-45ec-bb13-7e7097bb288b-config\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730767 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcrpr\" (UniqueName: \"kubernetes.io/projected/4eca16e2-6962-4cad-9cbb-23d33af9c10a-kube-api-access-pcrpr\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730783 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d605915b-24f4-45ec-bb13-7e7097bb288b-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730807 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eca16e2-6962-4cad-9cbb-23d33af9c10a-config\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730833 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d605915b-24f4-45ec-bb13-7e7097bb288b-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730859 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmrh6\" (UniqueName: \"kubernetes.io/projected/d605915b-24f4-45ec-bb13-7e7097bb288b-kube-api-access-mmrh6\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730916 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d605915b-24f4-45ec-bb13-7e7097bb288b-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.730982 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eca16e2-6962-4cad-9cbb-23d33af9c10a-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.731001 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4eca16e2-6962-4cad-9cbb-23d33af9c10a-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.731029 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eca16e2-6962-4cad-9cbb-23d33af9c10a-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.731752 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d605915b-24f4-45ec-bb13-7e7097bb288b-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.732361 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4eca16e2-6962-4cad-9cbb-23d33af9c10a-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.732701 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d605915b-24f4-45ec-bb13-7e7097bb288b-config\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 
20:58:59.733775 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d605915b-24f4-45ec-bb13-7e7097bb288b-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.734289 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4eca16e2-6962-4cad-9cbb-23d33af9c10a-config\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.735511 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.735537 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/be4d4bdf9879c17893725f2f238c9378caa9301312b00496f3cf5ea61073dfc0/globalmount\"" pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.735813 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.735863 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/633348a338d920df1dcb058c2c84bb4f6e6614f10dd6ecfc5cc803d00153f8ed/globalmount\"" pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.737581 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4eca16e2-6962-4cad-9cbb-23d33af9c10a-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.749164 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4eca16e2-6962-4cad-9cbb-23d33af9c10a-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.753060 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d605915b-24f4-45ec-bb13-7e7097bb288b-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.755810 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmrh6\" (UniqueName: \"kubernetes.io/projected/d605915b-24f4-45ec-bb13-7e7097bb288b-kube-api-access-mmrh6\") pod \"ovsdbserver-nb-1\" (UID: 
\"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.756694 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcrpr\" (UniqueName: \"kubernetes.io/projected/4eca16e2-6962-4cad-9cbb-23d33af9c10a-kube-api-access-pcrpr\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.780611 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bb8fd6cc-60a7-4710-ab01-b3806fd1ddba\") pod \"ovsdbserver-nb-2\" (UID: \"4eca16e2-6962-4cad-9cbb-23d33af9c10a\") " pod="openstack/ovsdbserver-nb-2" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.781331 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d9defed5-f309-4d8c-bf1e-9bf74a95be15\") pod \"ovsdbserver-nb-1\" (UID: \"d605915b-24f4-45ec-bb13-7e7097bb288b\") " pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.926109 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.948775 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Mar 08 20:58:59 crc kubenswrapper[4885]: I0308 20:58:59.961023 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-2" Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.111341 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 08 20:59:00 crc kubenswrapper[4885]: W0308 20:59:00.129215 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c0a1292_7594_49d3_b3f0_2e1a6aa004e2.slice/crio-9fc62a49c76a5bcd0763a24f25d11d140521d54d58deed2f7929cba63a25c8b2 WatchSource:0}: Error finding container 9fc62a49c76a5bcd0763a24f25d11d140521d54d58deed2f7929cba63a25c8b2: Status 404 returned error can't find the container with id 9fc62a49c76a5bcd0763a24f25d11d140521d54d58deed2f7929cba63a25c8b2 Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.201793 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.455194 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.574232 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.957314 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ebd9461e-0196-4eaf-a733-44340b19d354","Type":"ContainerStarted","Data":"1dade7de7daa8c5b89bebaa909d702ebb223d1d31a52106ba20c6947908e5d83"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.957684 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ebd9461e-0196-4eaf-a733-44340b19d354","Type":"ContainerStarted","Data":"059118c9ef699bb0b280b108a95e8510ec7e80cc6e933f0f02eec053a1e5d826"} Mar 08 20:59:00 crc 
kubenswrapper[4885]: I0308 20:59:00.957701 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"ebd9461e-0196-4eaf-a733-44340b19d354","Type":"ContainerStarted","Data":"cccdd8e1056c26fbb563e37aaf0672d94926523626c320b579eed1aaafddfc5c"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.960034 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"4eca16e2-6962-4cad-9cbb-23d33af9c10a","Type":"ContainerStarted","Data":"5bacc20730bf2d7aa32126ec4c9edafec9f84aae11176f80c5cfba1ec377c323"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.960081 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"4eca16e2-6962-4cad-9cbb-23d33af9c10a","Type":"ContainerStarted","Data":"3ee9a18d842fd2f2b2f73d5ecc8f53993bee1e2028582d670b18f9918d61d8f5"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.960095 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"4eca16e2-6962-4cad-9cbb-23d33af9c10a","Type":"ContainerStarted","Data":"2e8c03152d595674364d783b3c1eaa57676fb9c7422c0d37f82b7633d05b93e6"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.962050 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2","Type":"ContainerStarted","Data":"f1e1b98f69ad7a962faf3c5f57c68ef4c90eb0c0d61884d215b9f2fb78b69ca1"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.962078 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2","Type":"ContainerStarted","Data":"3e82c8cba2d8ea79bd3e21615834f39903a3f11fc300d281ec2ff5877a33c43e"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.962091 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2c0a1292-7594-49d3-b3f0-2e1a6aa004e2","Type":"ContainerStarted","Data":"9fc62a49c76a5bcd0763a24f25d11d140521d54d58deed2f7929cba63a25c8b2"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.964033 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"1cdde225-3478-4566-9019-df846ce962fb","Type":"ContainerStarted","Data":"ab22ce67a252b13b4cd81306e4f70041e4fc1d065efff6149eef4d8dcd870c68"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.964061 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"1cdde225-3478-4566-9019-df846ce962fb","Type":"ContainerStarted","Data":"b6feceebc499df4d23791119af9e17693560f82576bfa59023005f97c477a95d"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.964073 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"1cdde225-3478-4566-9019-df846ce962fb","Type":"ContainerStarted","Data":"b37832bdb20284f645c1f15ef8bca5e51a0a279a99b0a6a9f355cb43ca221e4d"} Mar 08 20:59:00 crc kubenswrapper[4885]: I0308 20:59:00.981960 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=2.981915751 podStartE2EDuration="2.981915751s" podCreationTimestamp="2026-03-08 20:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:00.97515165 +0000 UTC m=+5242.371205673" watchObservedRunningTime="2026-03-08 20:59:00.981915751 +0000 UTC 
m=+5242.377969794" Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.000711 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.000688231 podStartE2EDuration="3.000688231s" podCreationTimestamp="2026-03-08 20:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:00.994311122 +0000 UTC m=+5242.390365155" watchObservedRunningTime="2026-03-08 20:59:01.000688231 +0000 UTC m=+5242.396742264" Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.019295 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.019272457 podStartE2EDuration="3.019272457s" podCreationTimestamp="2026-03-08 20:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:01.018004943 +0000 UTC m=+5242.414058976" watchObservedRunningTime="2026-03-08 20:59:01.019272457 +0000 UTC m=+5242.415326490" Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.048505 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.048480726 podStartE2EDuration="3.048480726s" podCreationTimestamp="2026-03-08 20:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:01.044974522 +0000 UTC m=+5242.441028575" watchObservedRunningTime="2026-03-08 20:59:01.048480726 +0000 UTC m=+5242.444534769" Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.104025 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.231069 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Mar 08 20:59:01 crc kubenswrapper[4885]: W0308 20:59:01.239607 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd605915b_24f4_45ec_bb13_7e7097bb288b.slice/crio-4374e61b737331da82fd399301a9db4b10fff3750d07b2c5deb3d8b0a2b40a2e WatchSource:0}: Error finding container 4374e61b737331da82fd399301a9db4b10fff3750d07b2c5deb3d8b0a2b40a2e: Status 404 returned error can't find the container with id 4374e61b737331da82fd399301a9db4b10fff3750d07b2c5deb3d8b0a2b40a2e Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.977377 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"d605915b-24f4-45ec-bb13-7e7097bb288b","Type":"ContainerStarted","Data":"2653ab3aa7af4c7cb424010346b54c092a232b9945f7a88f950dea7374140593"} Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.977708 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"d605915b-24f4-45ec-bb13-7e7097bb288b","Type":"ContainerStarted","Data":"8eb1bbf1deaa7131fbd32b22b9dea7d99848880570be4d9d2e0a5ec34156adfc"} Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.977719 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"d605915b-24f4-45ec-bb13-7e7097bb288b","Type":"ContainerStarted","Data":"4374e61b737331da82fd399301a9db4b10fff3750d07b2c5deb3d8b0a2b40a2e"} Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.979975 4885 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"11852e05-e4cd-4884-b382-035694906263","Type":"ContainerStarted","Data":"ee90ca4eb708fa670d4ede507e4dcbb3d4af9ac401b0877e603c6b4428f5034d"} Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.980133 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"11852e05-e4cd-4884-b382-035694906263","Type":"ContainerStarted","Data":"7b7e53d4b24c99f965824abec94febd3afad0c358834f19eaf04ccb79a91cfae"} Mar 08 20:59:01 crc kubenswrapper[4885]: I0308 20:59:01.980172 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"11852e05-e4cd-4884-b382-035694906263","Type":"ContainerStarted","Data":"6e74e98ff97e97200a83ea01eb663ed85dc6711b32db471352dd506f4d7751bb"} Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.009609 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=4.009562623 podStartE2EDuration="4.009562623s" podCreationTimestamp="2026-03-08 20:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:01.993592518 +0000 UTC m=+5243.389646541" watchObservedRunningTime="2026-03-08 20:59:02.009562623 +0000 UTC m=+5243.405616686" Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.039449 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=4.039382678 podStartE2EDuration="4.039382678s" podCreationTimestamp="2026-03-08 20:58:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:02.019131148 +0000 UTC m=+5243.415185261" watchObservedRunningTime="2026-03-08 20:59:02.039382678 +0000 UTC m=+5243.435436741" Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.560376 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.586884 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.605248 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.926798 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.949190 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Mar 08 20:59:02 crc kubenswrapper[4885]: I0308 20:59:02.961687 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Mar 08 20:59:04 crc kubenswrapper[4885]: I0308 20:59:04.369427 4885 scope.go:117] "RemoveContainer" containerID="4a9dac92cb97fc09835d72492b83bdbd16e3d2d9b07c98a3d36966204fa55732" Mar 08 20:59:04 crc kubenswrapper[4885]: I0308 20:59:04.560094 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 08 20:59:04 crc kubenswrapper[4885]: I0308 20:59:04.587525 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Mar 08 20:59:04 crc kubenswrapper[4885]: I0308 20:59:04.604854 4885 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Mar 08 20:59:04 crc kubenswrapper[4885]: I0308 20:59:04.926458 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 08 20:59:04 crc kubenswrapper[4885]: I0308 20:59:04.949869 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Mar 08 20:59:04 crc kubenswrapper[4885]: I0308 20:59:04.961633 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.442464 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hq8r2"] Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.445259 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.454872 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hq8r2"] Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.636563 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-catalog-content\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.636770 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-utilities\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.636840 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnqrf\" (UniqueName: \"kubernetes.io/projected/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-kube-api-access-wnqrf\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.739061 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-catalog-content\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.739212 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-catalog-content\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.739304 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-utilities\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc 
kubenswrapper[4885]: I0308 20:59:05.739348 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnqrf\" (UniqueName: \"kubernetes.io/projected/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-kube-api-access-wnqrf\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.739626 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-utilities\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.776897 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnqrf\" (UniqueName: \"kubernetes.io/projected/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-kube-api-access-wnqrf\") pod \"redhat-marketplace-hq8r2\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.799942 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.800039 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.802807 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.835974 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.846638 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.964578 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 08 20:59:05 crc kubenswrapper[4885]: I0308 20:59:05.994599 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.033415 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.059097 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.074637 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.085181 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.118102 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.130591 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.133122 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d88bcf99f-q2r8n"] Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.190540 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.194965 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d88bcf99f-q2r8n"] Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.212536 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.341957 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d88bcf99f-q2r8n"] Mar 08 20:59:06 crc kubenswrapper[4885]: E0308 20:59:06.342570 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-nmcd4 ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" podUID="d8c57abb-ec77-4ac1-9ca5-913f466c13ab" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.365812 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-dns-svc\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.365908 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-ovsdbserver-sb\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.365986 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmcd4\" (UniqueName: \"kubernetes.io/projected/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-kube-api-access-nmcd4\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.366021 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-config\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.371881 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c977fd9ff-cg2sc"] Mar 08 
20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.373680 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.376592 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.382206 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c977fd9ff-cg2sc"] Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.471490 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-ovsdbserver-sb\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.471780 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-nb\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.471814 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-sb\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.471831 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2cnj\" (UniqueName: \"kubernetes.io/projected/97015d9c-53b3-463a-8953-0c5338fbaefe-kube-api-access-d2cnj\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.471873 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-config\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.472008 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmcd4\" (UniqueName: \"kubernetes.io/projected/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-kube-api-access-nmcd4\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.472035 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-config\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.472088 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-dns-svc\") pod 
\"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.472201 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-dns-svc\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.476595 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-config\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.476607 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-dns-svc\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.476662 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-ovsdbserver-sb\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.491101 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmcd4\" (UniqueName: \"kubernetes.io/projected/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-kube-api-access-nmcd4\") pod \"dnsmasq-dns-d88bcf99f-q2r8n\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.573851 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-nb\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.573905 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-sb\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.573939 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2cnj\" (UniqueName: \"kubernetes.io/projected/97015d9c-53b3-463a-8953-0c5338fbaefe-kube-api-access-d2cnj\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.573981 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-config\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 
20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.574021 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-dns-svc\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.574763 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-nb\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.574804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-dns-svc\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.575069 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-sb\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.575242 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-config\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.594881 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2cnj\" (UniqueName: \"kubernetes.io/projected/97015d9c-53b3-463a-8953-0c5338fbaefe-kube-api-access-d2cnj\") pod \"dnsmasq-dns-c977fd9ff-cg2sc\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.656807 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hq8r2"] Mar 08 20:59:06 crc kubenswrapper[4885]: I0308 20:59:06.695703 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.028117 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerID="7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d" exitCode=0 Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.028177 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hq8r2" event={"ID":"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d","Type":"ContainerDied","Data":"7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d"} Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.028815 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hq8r2" event={"ID":"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d","Type":"ContainerStarted","Data":"988413fd6813a018a052c35557173c8058b1b56cf0dc5288e4ec2c652d437f89"} Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.029628 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.044695 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.080654 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-dns-svc\") pod \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.080705 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmcd4\" (UniqueName: \"kubernetes.io/projected/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-kube-api-access-nmcd4\") pod \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.080787 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-ovsdbserver-sb\") pod \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.080817 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-config\") pod \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\" (UID: \"d8c57abb-ec77-4ac1-9ca5-913f466c13ab\") " Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.081598 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8c57abb-ec77-4ac1-9ca5-913f466c13ab" (UID: "d8c57abb-ec77-4ac1-9ca5-913f466c13ab"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.083879 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-config" (OuterVolumeSpecName: "config") pod "d8c57abb-ec77-4ac1-9ca5-913f466c13ab" (UID: "d8c57abb-ec77-4ac1-9ca5-913f466c13ab"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.084292 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8c57abb-ec77-4ac1-9ca5-913f466c13ab" (UID: "d8c57abb-ec77-4ac1-9ca5-913f466c13ab"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.089566 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-kube-api-access-nmcd4" (OuterVolumeSpecName: "kube-api-access-nmcd4") pod "d8c57abb-ec77-4ac1-9ca5-913f466c13ab" (UID: "d8c57abb-ec77-4ac1-9ca5-913f466c13ab"). InnerVolumeSpecName "kube-api-access-nmcd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.185823 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.185901 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmcd4\" (UniqueName: \"kubernetes.io/projected/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-kube-api-access-nmcd4\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.185970 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.185991 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8c57abb-ec77-4ac1-9ca5-913f466c13ab-config\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:07 crc kubenswrapper[4885]: W0308 20:59:07.200569 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97015d9c_53b3_463a_8953_0c5338fbaefe.slice/crio-9baaed9043ae4e717157055effd88e4c50c4eb535de4ba919caa286fe87c1640 WatchSource:0}: Error finding container 9baaed9043ae4e717157055effd88e4c50c4eb535de4ba919caa286fe87c1640: Status 404 returned error can't find the container with id 9baaed9043ae4e717157055effd88e4c50c4eb535de4ba919caa286fe87c1640 Mar 08 20:59:07 crc kubenswrapper[4885]: I0308 20:59:07.200742 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c977fd9ff-cg2sc"] Mar 08 20:59:08 crc kubenswrapper[4885]: I0308 20:59:08.039888 4885 generic.go:334] "Generic (PLEG): container finished" podID="97015d9c-53b3-463a-8953-0c5338fbaefe" containerID="478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c" exitCode=0 Mar 08 20:59:08 crc kubenswrapper[4885]: I0308 20:59:08.039974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" event={"ID":"97015d9c-53b3-463a-8953-0c5338fbaefe","Type":"ContainerDied","Data":"478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c"} Mar 08 20:59:08 crc kubenswrapper[4885]: I0308 20:59:08.040210 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d88bcf99f-q2r8n" Mar 08 20:59:08 crc kubenswrapper[4885]: I0308 20:59:08.040243 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" event={"ID":"97015d9c-53b3-463a-8953-0c5338fbaefe","Type":"ContainerStarted","Data":"9baaed9043ae4e717157055effd88e4c50c4eb535de4ba919caa286fe87c1640"} Mar 08 20:59:08 crc kubenswrapper[4885]: I0308 20:59:08.269887 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d88bcf99f-q2r8n"] Mar 08 20:59:08 crc kubenswrapper[4885]: I0308 20:59:08.275658 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d88bcf99f-q2r8n"] Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.054538 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" event={"ID":"97015d9c-53b3-463a-8953-0c5338fbaefe","Type":"ContainerStarted","Data":"4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4"} Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.061114 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.062669 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerID="c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d" exitCode=0 Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.062714 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hq8r2" event={"ID":"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d","Type":"ContainerDied","Data":"c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d"} Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.089675 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" podStartSLOduration=3.089651524 podStartE2EDuration="3.089651524s" podCreationTimestamp="2026-03-08 20:59:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:09.083576741 +0000 UTC m=+5250.479630804" watchObservedRunningTime="2026-03-08 20:59:09.089651524 +0000 UTC m=+5250.485705557" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.141060 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.142085 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.147220 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.156533 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.220853 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.221009 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.221111 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmb5r\" (UniqueName: \"kubernetes.io/projected/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-kube-api-access-fmb5r\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.323901 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmb5r\" (UniqueName: \"kubernetes.io/projected/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-kube-api-access-fmb5r\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.324080 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.324130 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.329452 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.329720 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0d13b9839d6363de7430b2dc3885042ff5afd37c33060049b87006ee21f82e9a/globalmount\"" pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.335432 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.360500 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmb5r\" (UniqueName: \"kubernetes.io/projected/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-kube-api-access-fmb5r\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.391772 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8c57abb-ec77-4ac1-9ca5-913f466c13ab" path="/var/lib/kubelet/pods/d8c57abb-ec77-4ac1-9ca5-913f466c13ab/volumes" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.396099 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\") pod \"ovn-copy-data\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " pod="openstack/ovn-copy-data" Mar 08 20:59:09 crc kubenswrapper[4885]: I0308 20:59:09.479507 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 08 20:59:10 crc kubenswrapper[4885]: I0308 20:59:10.075433 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hq8r2" event={"ID":"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d","Type":"ContainerStarted","Data":"3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483"} Mar 08 20:59:10 crc kubenswrapper[4885]: I0308 20:59:10.097832 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Mar 08 20:59:10 crc kubenswrapper[4885]: I0308 20:59:10.106514 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hq8r2" podStartSLOduration=2.552663489 podStartE2EDuration="5.106494307s" podCreationTimestamp="2026-03-08 20:59:05 +0000 UTC" firstStartedPulling="2026-03-08 20:59:07.0306497 +0000 UTC m=+5248.426703733" lastFinishedPulling="2026-03-08 20:59:09.584480528 +0000 UTC m=+5250.980534551" observedRunningTime="2026-03-08 20:59:10.106283862 +0000 UTC m=+5251.502337885" watchObservedRunningTime="2026-03-08 20:59:10.106494307 +0000 UTC m=+5251.502548330" Mar 08 20:59:11 crc kubenswrapper[4885]: I0308 20:59:11.083109 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a086771f-d0fc-4265-b8ba-a414a7f6c7d0","Type":"ContainerStarted","Data":"fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19"} Mar 08 20:59:11 crc kubenswrapper[4885]: I0308 20:59:11.083668 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a086771f-d0fc-4265-b8ba-a414a7f6c7d0","Type":"ContainerStarted","Data":"2b2a6f955da79537fe6939ae44e2e8e65e67c9ab78da8f292a75babe5150e678"} Mar 08 20:59:11 crc kubenswrapper[4885]: I0308 20:59:11.105616 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.562767313 podStartE2EDuration="3.105590998s" podCreationTimestamp="2026-03-08 20:59:08 +0000 UTC" firstStartedPulling="2026-03-08 20:59:10.110672799 +0000 UTC m=+5251.506726842" lastFinishedPulling="2026-03-08 20:59:10.653496504 +0000 UTC m=+5252.049550527" observedRunningTime="2026-03-08 20:59:11.09815977 +0000 UTC m=+5252.494213833" watchObservedRunningTime="2026-03-08 20:59:11.105590998 +0000 UTC m=+5252.501645031" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.075323 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.076061 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.139089 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.217299 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.396980 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hq8r2"] Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.697977 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.772484 4885 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-z8kjz"] Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.946904 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.949138 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.951399 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.959301 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.959957 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-hg5dp" Mar 08 20:59:16 crc kubenswrapper[4885]: I0308 20:59:16.960499 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.073490 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.073549 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-config\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.073569 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.073675 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm9hp\" (UniqueName: \"kubernetes.io/projected/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-kube-api-access-sm9hp\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.073758 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-scripts\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.134137 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" podUID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerName="dnsmasq-dns" containerID="cri-o://04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5" gracePeriod=10 Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.175099 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-scripts\") pod \"ovn-northd-0\" 
(UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.175196 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.175233 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-config\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.175274 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.175337 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm9hp\" (UniqueName: \"kubernetes.io/projected/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-kube-api-access-sm9hp\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.175816 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.176289 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-scripts\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.176689 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-config\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.180678 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.190182 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm9hp\" (UniqueName: \"kubernetes.io/projected/3e81fc01-0a65-4956-9ba5-26ec5f7c25c9-kube-api-access-sm9hp\") pod \"ovn-northd-0\" (UID: \"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9\") " pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.272193 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.827433 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.918069 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-config\") pod \"8e1559f2-4966-4752-8c07-aea40781bbd3\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.918191 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vx9b\" (UniqueName: \"kubernetes.io/projected/8e1559f2-4966-4752-8c07-aea40781bbd3-kube-api-access-9vx9b\") pod \"8e1559f2-4966-4752-8c07-aea40781bbd3\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.918208 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-dns-svc\") pod \"8e1559f2-4966-4752-8c07-aea40781bbd3\" (UID: \"8e1559f2-4966-4752-8c07-aea40781bbd3\") " Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.924457 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e1559f2-4966-4752-8c07-aea40781bbd3-kube-api-access-9vx9b" (OuterVolumeSpecName: "kube-api-access-9vx9b") pod "8e1559f2-4966-4752-8c07-aea40781bbd3" (UID: "8e1559f2-4966-4752-8c07-aea40781bbd3"). InnerVolumeSpecName "kube-api-access-9vx9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.938110 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 08 20:59:17 crc kubenswrapper[4885]: W0308 20:59:17.947410 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e81fc01_0a65_4956_9ba5_26ec5f7c25c9.slice/crio-d6bea0b2762f79a29c7555bfeee7055fd2e1a2ca27b7943744bcf42b5c7efb7c WatchSource:0}: Error finding container d6bea0b2762f79a29c7555bfeee7055fd2e1a2ca27b7943744bcf42b5c7efb7c: Status 404 returned error can't find the container with id d6bea0b2762f79a29c7555bfeee7055fd2e1a2ca27b7943744bcf42b5c7efb7c Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.964914 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8e1559f2-4966-4752-8c07-aea40781bbd3" (UID: "8e1559f2-4966-4752-8c07-aea40781bbd3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:17 crc kubenswrapper[4885]: I0308 20:59:17.981112 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-config" (OuterVolumeSpecName: "config") pod "8e1559f2-4966-4752-8c07-aea40781bbd3" (UID: "8e1559f2-4966-4752-8c07-aea40781bbd3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.020115 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vx9b\" (UniqueName: \"kubernetes.io/projected/8e1559f2-4966-4752-8c07-aea40781bbd3-kube-api-access-9vx9b\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.020143 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.020152 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8e1559f2-4966-4752-8c07-aea40781bbd3-config\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.144676 4885 generic.go:334] "Generic (PLEG): container finished" podID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerID="04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5" exitCode=0 Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.144781 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.144798 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" event={"ID":"8e1559f2-4966-4752-8c07-aea40781bbd3","Type":"ContainerDied","Data":"04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5"} Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.144875 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66d5bf7c87-z8kjz" event={"ID":"8e1559f2-4966-4752-8c07-aea40781bbd3","Type":"ContainerDied","Data":"e38cb66edab29ad6cc751ddfec546724fea6786800c820b585d732bbfd60f672"} Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.144895 4885 scope.go:117] "RemoveContainer" containerID="04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.147179 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9","Type":"ContainerStarted","Data":"e8bc72a3b9172c08f37114013e31d81e70218a0ddae0d38248d1660be4df7772"} Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.147216 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9","Type":"ContainerStarted","Data":"d6bea0b2762f79a29c7555bfeee7055fd2e1a2ca27b7943744bcf42b5c7efb7c"} Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.147442 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hq8r2" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="registry-server" containerID="cri-o://3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483" gracePeriod=2 Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.177186 4885 scope.go:117] "RemoveContainer" containerID="42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.184606 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66d5bf7c87-z8kjz"] Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.192419 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-66d5bf7c87-z8kjz"] Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.203224 4885 scope.go:117] "RemoveContainer" containerID="04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5" Mar 08 20:59:18 crc kubenswrapper[4885]: E0308 20:59:18.203554 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5\": container with ID starting with 04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5 not found: ID does not exist" containerID="04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.203582 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5"} err="failed to get container status \"04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5\": rpc error: code = NotFound desc = could not find container \"04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5\": container with ID starting with 04bd2ba6e007e1045a26f19d0ea7b27886442451086297dfccb3566ee56264e5 not found: ID does not exist" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.203605 4885 scope.go:117] "RemoveContainer" containerID="42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43" Mar 08 20:59:18 crc kubenswrapper[4885]: E0308 20:59:18.203781 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43\": container with ID starting with 42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43 not found: ID does not exist" containerID="42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.203800 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43"} err="failed to get container status \"42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43\": rpc error: code = NotFound desc = could not find container \"42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43\": container with ID starting with 42623606bb4fd42c32f36c81f869206ed1dad43fe839130d5ee57247a1716b43 not found: ID does not exist" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.593231 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.630791 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-utilities\") pod \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.630841 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-catalog-content\") pod \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.630979 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnqrf\" (UniqueName: \"kubernetes.io/projected/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-kube-api-access-wnqrf\") pod \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\" (UID: \"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d\") " Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.632988 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-utilities" (OuterVolumeSpecName: "utilities") pod "cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" (UID: "cf890fa2-1612-4dbd-b85e-6f304b9ccd6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.641226 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-kube-api-access-wnqrf" (OuterVolumeSpecName: "kube-api-access-wnqrf") pod "cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" (UID: "cf890fa2-1612-4dbd-b85e-6f304b9ccd6d"). InnerVolumeSpecName "kube-api-access-wnqrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.672701 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" (UID: "cf890fa2-1612-4dbd-b85e-6f304b9ccd6d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.733096 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnqrf\" (UniqueName: \"kubernetes.io/projected/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-kube-api-access-wnqrf\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.733402 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:18 crc kubenswrapper[4885]: I0308 20:59:18.733445 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.158693 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"3e81fc01-0a65-4956-9ba5-26ec5f7c25c9","Type":"ContainerStarted","Data":"662d70db5585e2c725bc484b19e622e88e9a75dd69b9c6a2613a47dbbb6e9f00"} Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.159095 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.162765 4885 generic.go:334] "Generic (PLEG): container finished" podID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerID="3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483" exitCode=0 Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.162851 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hq8r2" event={"ID":"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d","Type":"ContainerDied","Data":"3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483"} Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.162970 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hq8r2" event={"ID":"cf890fa2-1612-4dbd-b85e-6f304b9ccd6d","Type":"ContainerDied","Data":"988413fd6813a018a052c35557173c8058b1b56cf0dc5288e4ec2c652d437f89"} Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.162874 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hq8r2" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.163014 4885 scope.go:117] "RemoveContainer" containerID="3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.203621 4885 scope.go:117] "RemoveContainer" containerID="c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.210376 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.210344291 podStartE2EDuration="3.210344291s" podCreationTimestamp="2026-03-08 20:59:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:19.188299533 +0000 UTC m=+5260.584353596" watchObservedRunningTime="2026-03-08 20:59:19.210344291 +0000 UTC m=+5260.606398324" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.229156 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hq8r2"] Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.235726 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hq8r2"] Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.240107 4885 scope.go:117] "RemoveContainer" containerID="7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.263815 4885 scope.go:117] "RemoveContainer" containerID="3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483" Mar 08 20:59:19 crc kubenswrapper[4885]: E0308 20:59:19.267491 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483\": container with ID starting with 3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483 not found: ID does not exist" containerID="3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.267521 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483"} err="failed to get container status \"3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483\": rpc error: code = NotFound desc = could not find container \"3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483\": container with ID starting with 3ad21d91284b6e616366b7de6188bf12069810f999666525534e93508a5cb483 not found: ID does not exist" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.267540 4885 scope.go:117] "RemoveContainer" containerID="c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d" Mar 08 20:59:19 crc kubenswrapper[4885]: E0308 20:59:19.268240 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d\": container with ID starting with c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d not found: ID does not exist" containerID="c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.268302 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d"} err="failed to get container status \"c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d\": rpc error: code = NotFound desc = could not find container \"c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d\": container with ID starting with c77d46ff5eca4d0504fb14fbc55c38e5849f3aea0d2a4bec4f5c6a9abfb8a11d not found: ID does not exist" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.268346 4885 scope.go:117] "RemoveContainer" containerID="7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d" Mar 08 20:59:19 crc kubenswrapper[4885]: E0308 20:59:19.268912 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d\": container with ID starting with 7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d not found: ID does not exist" containerID="7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.268945 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d"} err="failed to get container status \"7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d\": rpc error: code = NotFound desc = could not find container \"7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d\": container with ID starting with 7173a8341ff4d63e73a5926e761f41a758852fb5b743e0f9904d5d7b5cde2c7d not found: ID does not exist" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.388669 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e1559f2-4966-4752-8c07-aea40781bbd3" path="/var/lib/kubelet/pods/8e1559f2-4966-4752-8c07-aea40781bbd3/volumes" Mar 08 20:59:19 crc kubenswrapper[4885]: I0308 20:59:19.389875 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" path="/var/lib/kubelet/pods/cf890fa2-1612-4dbd-b85e-6f304b9ccd6d/volumes" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.171376 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-dkqb5"] Mar 08 20:59:22 crc kubenswrapper[4885]: E0308 20:59:22.172067 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="extract-content" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172080 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="extract-content" Mar 08 20:59:22 crc kubenswrapper[4885]: E0308 20:59:22.172100 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="registry-server" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172106 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="registry-server" Mar 08 20:59:22 crc kubenswrapper[4885]: E0308 20:59:22.172120 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerName="dnsmasq-dns" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172126 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerName="dnsmasq-dns" Mar 08 
20:59:22 crc kubenswrapper[4885]: E0308 20:59:22.172136 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="extract-utilities" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172142 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="extract-utilities" Mar 08 20:59:22 crc kubenswrapper[4885]: E0308 20:59:22.172156 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerName="init" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172162 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerName="init" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172317 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e1559f2-4966-4752-8c07-aea40781bbd3" containerName="dnsmasq-dns" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172332 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf890fa2-1612-4dbd-b85e-6f304b9ccd6d" containerName="registry-server" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.172786 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.209500 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxbs4\" (UniqueName: \"kubernetes.io/projected/2af0fb78-1571-4090-a0e4-009deb2915a5-kube-api-access-nxbs4\") pod \"keystone-db-create-dkqb5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") " pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.209607 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af0fb78-1571-4090-a0e4-009deb2915a5-operator-scripts\") pod \"keystone-db-create-dkqb5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") " pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.215226 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dkqb5"] Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.264502 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c81b-account-create-update-sqjkr"] Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.265534 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c81b-account-create-update-sqjkr" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.267983 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.270384 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c81b-account-create-update-sqjkr"] Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.311740 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxbs4\" (UniqueName: \"kubernetes.io/projected/2af0fb78-1571-4090-a0e4-009deb2915a5-kube-api-access-nxbs4\") pod \"keystone-db-create-dkqb5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") " pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.311779 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af0fb78-1571-4090-a0e4-009deb2915a5-operator-scripts\") pod \"keystone-db-create-dkqb5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") " pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.312458 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af0fb78-1571-4090-a0e4-009deb2915a5-operator-scripts\") pod \"keystone-db-create-dkqb5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") " pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.329910 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxbs4\" (UniqueName: \"kubernetes.io/projected/2af0fb78-1571-4090-a0e4-009deb2915a5-kube-api-access-nxbs4\") pod \"keystone-db-create-dkqb5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") " pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.415060 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k8d6\" (UniqueName: \"kubernetes.io/projected/dadcfb24-7e2e-42d4-b4da-4567105c11ad-kube-api-access-9k8d6\") pod \"keystone-c81b-account-create-update-sqjkr\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") " pod="openstack/keystone-c81b-account-create-update-sqjkr" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.415633 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadcfb24-7e2e-42d4-b4da-4567105c11ad-operator-scripts\") pod \"keystone-c81b-account-create-update-sqjkr\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") " pod="openstack/keystone-c81b-account-create-update-sqjkr" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.516556 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k8d6\" (UniqueName: \"kubernetes.io/projected/dadcfb24-7e2e-42d4-b4da-4567105c11ad-kube-api-access-9k8d6\") pod \"keystone-c81b-account-create-update-sqjkr\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") " pod="openstack/keystone-c81b-account-create-update-sqjkr" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.516649 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadcfb24-7e2e-42d4-b4da-4567105c11ad-operator-scripts\") pod 
\"keystone-c81b-account-create-update-sqjkr\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") " pod="openstack/keystone-c81b-account-create-update-sqjkr" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.517469 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadcfb24-7e2e-42d4-b4da-4567105c11ad-operator-scripts\") pod \"keystone-c81b-account-create-update-sqjkr\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") " pod="openstack/keystone-c81b-account-create-update-sqjkr" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.522722 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.544714 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k8d6\" (UniqueName: \"kubernetes.io/projected/dadcfb24-7e2e-42d4-b4da-4567105c11ad-kube-api-access-9k8d6\") pod \"keystone-c81b-account-create-update-sqjkr\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") " pod="openstack/keystone-c81b-account-create-update-sqjkr" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.586535 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c81b-account-create-update-sqjkr" Mar 08 20:59:22 crc kubenswrapper[4885]: I0308 20:59:22.813019 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dkqb5"] Mar 08 20:59:22 crc kubenswrapper[4885]: W0308 20:59:22.817629 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2af0fb78_1571_4090_a0e4_009deb2915a5.slice/crio-96743c725d026a8093869af69b694bcbf09c5122eddabfaa3d0dfb797414159a WatchSource:0}: Error finding container 96743c725d026a8093869af69b694bcbf09c5122eddabfaa3d0dfb797414159a: Status 404 returned error can't find the container with id 96743c725d026a8093869af69b694bcbf09c5122eddabfaa3d0dfb797414159a Mar 08 20:59:23 crc kubenswrapper[4885]: I0308 20:59:23.123649 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c81b-account-create-update-sqjkr"] Mar 08 20:59:23 crc kubenswrapper[4885]: W0308 20:59:23.126258 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddadcfb24_7e2e_42d4_b4da_4567105c11ad.slice/crio-a9f36b9ec155fdb626e40bd59f93b121945dc3d36a8c18de2ef17dc4f395e957 WatchSource:0}: Error finding container a9f36b9ec155fdb626e40bd59f93b121945dc3d36a8c18de2ef17dc4f395e957: Status 404 returned error can't find the container with id a9f36b9ec155fdb626e40bd59f93b121945dc3d36a8c18de2ef17dc4f395e957 Mar 08 20:59:23 crc kubenswrapper[4885]: I0308 20:59:23.206628 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c81b-account-create-update-sqjkr" event={"ID":"dadcfb24-7e2e-42d4-b4da-4567105c11ad","Type":"ContainerStarted","Data":"a9f36b9ec155fdb626e40bd59f93b121945dc3d36a8c18de2ef17dc4f395e957"} Mar 08 20:59:23 crc kubenswrapper[4885]: I0308 20:59:23.208477 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dkqb5" event={"ID":"2af0fb78-1571-4090-a0e4-009deb2915a5","Type":"ContainerStarted","Data":"2a92dad5038281e9a909493af60718e80efbed4b0a30aad4e3ed0e4f55630488"} Mar 08 20:59:23 crc kubenswrapper[4885]: I0308 20:59:23.208518 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-dkqb5" event={"ID":"2af0fb78-1571-4090-a0e4-009deb2915a5","Type":"ContainerStarted","Data":"96743c725d026a8093869af69b694bcbf09c5122eddabfaa3d0dfb797414159a"} Mar 08 20:59:23 crc kubenswrapper[4885]: I0308 20:59:23.221548 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-dkqb5" podStartSLOduration=1.22153053 podStartE2EDuration="1.22153053s" podCreationTimestamp="2026-03-08 20:59:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:23.220623125 +0000 UTC m=+5264.616677148" watchObservedRunningTime="2026-03-08 20:59:23.22153053 +0000 UTC m=+5264.617584553" Mar 08 20:59:24 crc kubenswrapper[4885]: I0308 20:59:24.221863 4885 generic.go:334] "Generic (PLEG): container finished" podID="2af0fb78-1571-4090-a0e4-009deb2915a5" containerID="2a92dad5038281e9a909493af60718e80efbed4b0a30aad4e3ed0e4f55630488" exitCode=0 Mar 08 20:59:24 crc kubenswrapper[4885]: I0308 20:59:24.222008 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dkqb5" event={"ID":"2af0fb78-1571-4090-a0e4-009deb2915a5","Type":"ContainerDied","Data":"2a92dad5038281e9a909493af60718e80efbed4b0a30aad4e3ed0e4f55630488"} Mar 08 20:59:24 crc kubenswrapper[4885]: I0308 20:59:24.224825 4885 generic.go:334] "Generic (PLEG): container finished" podID="dadcfb24-7e2e-42d4-b4da-4567105c11ad" containerID="b2ba1b445c0bfbdc509da995c43b1467221966fc77b2d2c35df9edb0c74ad904" exitCode=0 Mar 08 20:59:24 crc kubenswrapper[4885]: I0308 20:59:24.224895 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c81b-account-create-update-sqjkr" event={"ID":"dadcfb24-7e2e-42d4-b4da-4567105c11ad","Type":"ContainerDied","Data":"b2ba1b445c0bfbdc509da995c43b1467221966fc77b2d2c35df9edb0c74ad904"} Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.751855 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c81b-account-create-update-sqjkr" Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.763682 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.783650 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af0fb78-1571-4090-a0e4-009deb2915a5-operator-scripts\") pod \"2af0fb78-1571-4090-a0e4-009deb2915a5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") " Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.783771 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxbs4\" (UniqueName: \"kubernetes.io/projected/2af0fb78-1571-4090-a0e4-009deb2915a5-kube-api-access-nxbs4\") pod \"2af0fb78-1571-4090-a0e4-009deb2915a5\" (UID: \"2af0fb78-1571-4090-a0e4-009deb2915a5\") " Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.783806 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k8d6\" (UniqueName: \"kubernetes.io/projected/dadcfb24-7e2e-42d4-b4da-4567105c11ad-kube-api-access-9k8d6\") pod \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") " Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.783874 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadcfb24-7e2e-42d4-b4da-4567105c11ad-operator-scripts\") pod \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\" (UID: \"dadcfb24-7e2e-42d4-b4da-4567105c11ad\") " Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.784530 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af0fb78-1571-4090-a0e4-009deb2915a5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2af0fb78-1571-4090-a0e4-009deb2915a5" (UID: "2af0fb78-1571-4090-a0e4-009deb2915a5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.784680 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dadcfb24-7e2e-42d4-b4da-4567105c11ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dadcfb24-7e2e-42d4-b4da-4567105c11ad" (UID: "dadcfb24-7e2e-42d4-b4da-4567105c11ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.790707 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af0fb78-1571-4090-a0e4-009deb2915a5-kube-api-access-nxbs4" (OuterVolumeSpecName: "kube-api-access-nxbs4") pod "2af0fb78-1571-4090-a0e4-009deb2915a5" (UID: "2af0fb78-1571-4090-a0e4-009deb2915a5"). InnerVolumeSpecName "kube-api-access-nxbs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.795404 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dadcfb24-7e2e-42d4-b4da-4567105c11ad-kube-api-access-9k8d6" (OuterVolumeSpecName: "kube-api-access-9k8d6") pod "dadcfb24-7e2e-42d4-b4da-4567105c11ad" (UID: "dadcfb24-7e2e-42d4-b4da-4567105c11ad"). InnerVolumeSpecName "kube-api-access-9k8d6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.885560 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af0fb78-1571-4090-a0e4-009deb2915a5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.885606 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxbs4\" (UniqueName: \"kubernetes.io/projected/2af0fb78-1571-4090-a0e4-009deb2915a5-kube-api-access-nxbs4\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.885618 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k8d6\" (UniqueName: \"kubernetes.io/projected/dadcfb24-7e2e-42d4-b4da-4567105c11ad-kube-api-access-9k8d6\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:25 crc kubenswrapper[4885]: I0308 20:59:25.885631 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dadcfb24-7e2e-42d4-b4da-4567105c11ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:26 crc kubenswrapper[4885]: I0308 20:59:26.253200 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dkqb5" event={"ID":"2af0fb78-1571-4090-a0e4-009deb2915a5","Type":"ContainerDied","Data":"96743c725d026a8093869af69b694bcbf09c5122eddabfaa3d0dfb797414159a"} Mar 08 20:59:26 crc kubenswrapper[4885]: I0308 20:59:26.253267 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96743c725d026a8093869af69b694bcbf09c5122eddabfaa3d0dfb797414159a" Mar 08 20:59:26 crc kubenswrapper[4885]: I0308 20:59:26.253343 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-dkqb5" Mar 08 20:59:26 crc kubenswrapper[4885]: I0308 20:59:26.256887 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c81b-account-create-update-sqjkr" event={"ID":"dadcfb24-7e2e-42d4-b4da-4567105c11ad","Type":"ContainerDied","Data":"a9f36b9ec155fdb626e40bd59f93b121945dc3d36a8c18de2ef17dc4f395e957"} Mar 08 20:59:26 crc kubenswrapper[4885]: I0308 20:59:26.256976 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9f36b9ec155fdb626e40bd59f93b121945dc3d36a8c18de2ef17dc4f395e957" Mar 08 20:59:26 crc kubenswrapper[4885]: I0308 20:59:26.257049 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c81b-account-create-update-sqjkr" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.737892 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-nbq5w"] Mar 08 20:59:27 crc kubenswrapper[4885]: E0308 20:59:27.738510 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dadcfb24-7e2e-42d4-b4da-4567105c11ad" containerName="mariadb-account-create-update" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.738523 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadcfb24-7e2e-42d4-b4da-4567105c11ad" containerName="mariadb-account-create-update" Mar 08 20:59:27 crc kubenswrapper[4885]: E0308 20:59:27.738544 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af0fb78-1571-4090-a0e4-009deb2915a5" containerName="mariadb-database-create" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.738553 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af0fb78-1571-4090-a0e4-009deb2915a5" containerName="mariadb-database-create" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.738701 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af0fb78-1571-4090-a0e4-009deb2915a5" containerName="mariadb-database-create" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.738721 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dadcfb24-7e2e-42d4-b4da-4567105c11ad" containerName="mariadb-account-create-update" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.739252 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-nbq5w" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.741847 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.742079 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.742404 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pcbss" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.743522 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.752929 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-nbq5w"] Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.824362 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82tsl\" (UniqueName: \"kubernetes.io/projected/a21d9a63-6439-41e2-915d-9ffa3d014a30-kube-api-access-82tsl\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.824467 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-combined-ca-bundle\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.824501 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-config-data\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.926207 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82tsl\" (UniqueName: \"kubernetes.io/projected/a21d9a63-6439-41e2-915d-9ffa3d014a30-kube-api-access-82tsl\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.926349 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-combined-ca-bundle\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.926384 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-config-data\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.931104 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-combined-ca-bundle\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.931679 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-config-data\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w" Mar 08 20:59:27 crc kubenswrapper[4885]: I0308 20:59:27.943472 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82tsl\" (UniqueName: \"kubernetes.io/projected/a21d9a63-6439-41e2-915d-9ffa3d014a30-kube-api-access-82tsl\") pod \"keystone-db-sync-nbq5w\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " pod="openstack/keystone-db-sync-nbq5w" Mar 08 20:59:28 crc kubenswrapper[4885]: I0308 20:59:28.062513 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-nbq5w" Mar 08 20:59:28 crc kubenswrapper[4885]: I0308 20:59:28.591802 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-nbq5w"] Mar 08 20:59:28 crc kubenswrapper[4885]: W0308 20:59:28.597735 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda21d9a63_6439_41e2_915d_9ffa3d014a30.slice/crio-e7e89ac9417027dd7976b27bac2deeb804b087dc579ab7cd4ce0c4e3b2925377 WatchSource:0}: Error finding container e7e89ac9417027dd7976b27bac2deeb804b087dc579ab7cd4ce0c4e3b2925377: Status 404 returned error can't find the container with id e7e89ac9417027dd7976b27bac2deeb804b087dc579ab7cd4ce0c4e3b2925377 Mar 08 20:59:29 crc kubenswrapper[4885]: I0308 20:59:29.287691 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nbq5w" event={"ID":"a21d9a63-6439-41e2-915d-9ffa3d014a30","Type":"ContainerStarted","Data":"e7e89ac9417027dd7976b27bac2deeb804b087dc579ab7cd4ce0c4e3b2925377"} Mar 08 20:59:30 crc kubenswrapper[4885]: I0308 20:59:30.298697 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nbq5w" event={"ID":"a21d9a63-6439-41e2-915d-9ffa3d014a30","Type":"ContainerStarted","Data":"3210dc1871dd8ae46bd14950976c866628de438227db3aa55b84daa5b1afb3d6"} Mar 08 20:59:30 crc kubenswrapper[4885]: I0308 20:59:30.331359 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-nbq5w" podStartSLOduration=3.331333552 podStartE2EDuration="3.331333552s" podCreationTimestamp="2026-03-08 20:59:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:30.323256637 +0000 UTC m=+5271.719310700" watchObservedRunningTime="2026-03-08 20:59:30.331333552 +0000 UTC m=+5271.727387575" Mar 08 20:59:31 crc kubenswrapper[4885]: I0308 20:59:31.314043 4885 generic.go:334] "Generic (PLEG): container finished" podID="a21d9a63-6439-41e2-915d-9ffa3d014a30" containerID="3210dc1871dd8ae46bd14950976c866628de438227db3aa55b84daa5b1afb3d6" exitCode=0 Mar 08 20:59:31 crc kubenswrapper[4885]: I0308 20:59:31.314123 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nbq5w" event={"ID":"a21d9a63-6439-41e2-915d-9ffa3d014a30","Type":"ContainerDied","Data":"3210dc1871dd8ae46bd14950976c866628de438227db3aa55b84daa5b1afb3d6"} Mar 08 20:59:32 crc kubenswrapper[4885]: I0308 20:59:32.782653 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-nbq5w" Mar 08 20:59:32 crc kubenswrapper[4885]: I0308 20:59:32.930621 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-config-data\") pod \"a21d9a63-6439-41e2-915d-9ffa3d014a30\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " Mar 08 20:59:32 crc kubenswrapper[4885]: I0308 20:59:32.931210 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-combined-ca-bundle\") pod \"a21d9a63-6439-41e2-915d-9ffa3d014a30\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " Mar 08 20:59:32 crc kubenswrapper[4885]: I0308 20:59:32.931420 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82tsl\" (UniqueName: \"kubernetes.io/projected/a21d9a63-6439-41e2-915d-9ffa3d014a30-kube-api-access-82tsl\") pod \"a21d9a63-6439-41e2-915d-9ffa3d014a30\" (UID: \"a21d9a63-6439-41e2-915d-9ffa3d014a30\") " Mar 08 20:59:32 crc kubenswrapper[4885]: I0308 20:59:32.942129 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a21d9a63-6439-41e2-915d-9ffa3d014a30-kube-api-access-82tsl" (OuterVolumeSpecName: "kube-api-access-82tsl") pod "a21d9a63-6439-41e2-915d-9ffa3d014a30" (UID: "a21d9a63-6439-41e2-915d-9ffa3d014a30"). InnerVolumeSpecName "kube-api-access-82tsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:59:32 crc kubenswrapper[4885]: I0308 20:59:32.978792 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a21d9a63-6439-41e2-915d-9ffa3d014a30" (UID: "a21d9a63-6439-41e2-915d-9ffa3d014a30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:33 crc kubenswrapper[4885]: I0308 20:59:33.007271 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-config-data" (OuterVolumeSpecName: "config-data") pod "a21d9a63-6439-41e2-915d-9ffa3d014a30" (UID: "a21d9a63-6439-41e2-915d-9ffa3d014a30"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:33 crc kubenswrapper[4885]: I0308 20:59:33.033366 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:33 crc kubenswrapper[4885]: I0308 20:59:33.033406 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82tsl\" (UniqueName: \"kubernetes.io/projected/a21d9a63-6439-41e2-915d-9ffa3d014a30-kube-api-access-82tsl\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:33 crc kubenswrapper[4885]: I0308 20:59:33.033420 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a21d9a63-6439-41e2-915d-9ffa3d014a30-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:33 crc kubenswrapper[4885]: I0308 20:59:33.336594 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nbq5w" event={"ID":"a21d9a63-6439-41e2-915d-9ffa3d014a30","Type":"ContainerDied","Data":"e7e89ac9417027dd7976b27bac2deeb804b087dc579ab7cd4ce0c4e3b2925377"} Mar 08 20:59:33 crc kubenswrapper[4885]: I0308 20:59:33.336639 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7e89ac9417027dd7976b27bac2deeb804b087dc579ab7cd4ce0c4e3b2925377" Mar 08 20:59:33 crc kubenswrapper[4885]: I0308 20:59:33.336722 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-nbq5w" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.068397 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dt79q"] Mar 08 20:59:34 crc kubenswrapper[4885]: E0308 20:59:34.068787 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a21d9a63-6439-41e2-915d-9ffa3d014a30" containerName="keystone-db-sync" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.068801 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a21d9a63-6439-41e2-915d-9ffa3d014a30" containerName="keystone-db-sync" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.068999 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a21d9a63-6439-41e2-915d-9ffa3d014a30" containerName="keystone-db-sync" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.069517 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.073246 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pcbss" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.073597 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.073737 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.077290 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.077582 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.125095 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65c5b744dc-fp6n9"] Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.126455 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.130597 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dt79q"] Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.138974 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c5b744dc-fp6n9"] Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.152179 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-nb\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.152242 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2d7q\" (UniqueName: \"kubernetes.io/projected/c4efef13-d123-4300-b581-2a9a52de6d1b-kube-api-access-g2d7q\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.152299 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-scripts\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.152327 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-config\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.152373 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-dns-svc\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 
20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.152432 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-config-data\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.152495 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-sb\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.253818 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-config-data\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.254137 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-combined-ca-bundle\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.254288 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-sb\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.254987 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-nb\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.255181 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2d7q\" (UniqueName: \"kubernetes.io/projected/c4efef13-d123-4300-b581-2a9a52de6d1b-kube-api-access-g2d7q\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.255353 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-fernet-keys\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.255515 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-scripts\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 
20:59:34.255651 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-config\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.255839 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-credential-keys\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.256016 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-dns-svc\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.256146 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmq7z\" (UniqueName: \"kubernetes.io/projected/8816a3aa-9268-4201-9ad0-bc816fdaba11-kube-api-access-kmq7z\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.256285 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-sb\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.256411 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-nb\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.257800 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-config-data\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.259292 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-dns-svc\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.259307 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-config\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.260082 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-scripts\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.274078 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2d7q\" (UniqueName: \"kubernetes.io/projected/c4efef13-d123-4300-b581-2a9a52de6d1b-kube-api-access-g2d7q\") pod \"dnsmasq-dns-65c5b744dc-fp6n9\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.358848 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-credential-keys\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.359218 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmq7z\" (UniqueName: \"kubernetes.io/projected/8816a3aa-9268-4201-9ad0-bc816fdaba11-kube-api-access-kmq7z\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.359313 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-combined-ca-bundle\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.359390 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-fernet-keys\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.363772 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-fernet-keys\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.364270 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-combined-ca-bundle\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.364280 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-credential-keys\") pod \"keystone-bootstrap-dt79q\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.392234 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmq7z\" (UniqueName: \"kubernetes.io/projected/8816a3aa-9268-4201-9ad0-bc816fdaba11-kube-api-access-kmq7z\") pod \"keystone-bootstrap-dt79q\" (UID: 
\"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.443266 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.689516 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:34 crc kubenswrapper[4885]: I0308 20:59:34.921705 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65c5b744dc-fp6n9"] Mar 08 20:59:34 crc kubenswrapper[4885]: W0308 20:59:34.928592 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4efef13_d123_4300_b581_2a9a52de6d1b.slice/crio-951c5b88e4e526df079bda44e005142fe20c5e1f10cb2f30511bef46f3a2b1e7 WatchSource:0}: Error finding container 951c5b88e4e526df079bda44e005142fe20c5e1f10cb2f30511bef46f3a2b1e7: Status 404 returned error can't find the container with id 951c5b88e4e526df079bda44e005142fe20c5e1f10cb2f30511bef46f3a2b1e7 Mar 08 20:59:35 crc kubenswrapper[4885]: I0308 20:59:35.165710 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dt79q"] Mar 08 20:59:35 crc kubenswrapper[4885]: I0308 20:59:35.354823 4885 generic.go:334] "Generic (PLEG): container finished" podID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerID="d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9" exitCode=0 Mar 08 20:59:35 crc kubenswrapper[4885]: I0308 20:59:35.354899 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" event={"ID":"c4efef13-d123-4300-b581-2a9a52de6d1b","Type":"ContainerDied","Data":"d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9"} Mar 08 20:59:35 crc kubenswrapper[4885]: I0308 20:59:35.355232 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" event={"ID":"c4efef13-d123-4300-b581-2a9a52de6d1b","Type":"ContainerStarted","Data":"951c5b88e4e526df079bda44e005142fe20c5e1f10cb2f30511bef46f3a2b1e7"} Mar 08 20:59:35 crc kubenswrapper[4885]: I0308 20:59:35.357307 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dt79q" event={"ID":"8816a3aa-9268-4201-9ad0-bc816fdaba11","Type":"ContainerStarted","Data":"b2a6d60ac180e7237cc570d9e3ce07878fe370c6ed8d4f85797a8e711a06d792"} Mar 08 20:59:36 crc kubenswrapper[4885]: I0308 20:59:36.368570 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dt79q" event={"ID":"8816a3aa-9268-4201-9ad0-bc816fdaba11","Type":"ContainerStarted","Data":"93dc9dbb2536460c751fb5259c99f80a1281e51794443a335faf96ba42cb4c59"} Mar 08 20:59:36 crc kubenswrapper[4885]: I0308 20:59:36.371447 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" event={"ID":"c4efef13-d123-4300-b581-2a9a52de6d1b","Type":"ContainerStarted","Data":"c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0"} Mar 08 20:59:36 crc kubenswrapper[4885]: I0308 20:59:36.371828 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:36 crc kubenswrapper[4885]: I0308 20:59:36.396756 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dt79q" podStartSLOduration=2.396731454 podStartE2EDuration="2.396731454s" 
podCreationTimestamp="2026-03-08 20:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:36.393835167 +0000 UTC m=+5277.789889200" watchObservedRunningTime="2026-03-08 20:59:36.396731454 +0000 UTC m=+5277.792785477" Mar 08 20:59:36 crc kubenswrapper[4885]: I0308 20:59:36.424501 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" podStartSLOduration=2.424479934 podStartE2EDuration="2.424479934s" podCreationTimestamp="2026-03-08 20:59:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:36.42136522 +0000 UTC m=+5277.817419243" watchObservedRunningTime="2026-03-08 20:59:36.424479934 +0000 UTC m=+5277.820533957" Mar 08 20:59:37 crc kubenswrapper[4885]: I0308 20:59:37.385218 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 08 20:59:39 crc kubenswrapper[4885]: I0308 20:59:39.403255 4885 generic.go:334] "Generic (PLEG): container finished" podID="8816a3aa-9268-4201-9ad0-bc816fdaba11" containerID="93dc9dbb2536460c751fb5259c99f80a1281e51794443a335faf96ba42cb4c59" exitCode=0 Mar 08 20:59:39 crc kubenswrapper[4885]: I0308 20:59:39.403378 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dt79q" event={"ID":"8816a3aa-9268-4201-9ad0-bc816fdaba11","Type":"ContainerDied","Data":"93dc9dbb2536460c751fb5259c99f80a1281e51794443a335faf96ba42cb4c59"} Mar 08 20:59:40 crc kubenswrapper[4885]: I0308 20:59:40.942063 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.079897 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmq7z\" (UniqueName: \"kubernetes.io/projected/8816a3aa-9268-4201-9ad0-bc816fdaba11-kube-api-access-kmq7z\") pod \"8816a3aa-9268-4201-9ad0-bc816fdaba11\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.080371 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-config-data\") pod \"8816a3aa-9268-4201-9ad0-bc816fdaba11\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.080452 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-fernet-keys\") pod \"8816a3aa-9268-4201-9ad0-bc816fdaba11\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.080563 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-combined-ca-bundle\") pod \"8816a3aa-9268-4201-9ad0-bc816fdaba11\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.080609 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-credential-keys\") pod \"8816a3aa-9268-4201-9ad0-bc816fdaba11\" (UID: 
\"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.080652 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-scripts\") pod \"8816a3aa-9268-4201-9ad0-bc816fdaba11\" (UID: \"8816a3aa-9268-4201-9ad0-bc816fdaba11\") " Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.086536 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8816a3aa-9268-4201-9ad0-bc816fdaba11" (UID: "8816a3aa-9268-4201-9ad0-bc816fdaba11"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.087331 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-scripts" (OuterVolumeSpecName: "scripts") pod "8816a3aa-9268-4201-9ad0-bc816fdaba11" (UID: "8816a3aa-9268-4201-9ad0-bc816fdaba11"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.087254 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8816a3aa-9268-4201-9ad0-bc816fdaba11-kube-api-access-kmq7z" (OuterVolumeSpecName: "kube-api-access-kmq7z") pod "8816a3aa-9268-4201-9ad0-bc816fdaba11" (UID: "8816a3aa-9268-4201-9ad0-bc816fdaba11"). InnerVolumeSpecName "kube-api-access-kmq7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.089395 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8816a3aa-9268-4201-9ad0-bc816fdaba11" (UID: "8816a3aa-9268-4201-9ad0-bc816fdaba11"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.107577 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-config-data" (OuterVolumeSpecName: "config-data") pod "8816a3aa-9268-4201-9ad0-bc816fdaba11" (UID: "8816a3aa-9268-4201-9ad0-bc816fdaba11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.108514 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8816a3aa-9268-4201-9ad0-bc816fdaba11" (UID: "8816a3aa-9268-4201-9ad0-bc816fdaba11"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.183662 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.183695 4885 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.183705 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.183714 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmq7z\" (UniqueName: \"kubernetes.io/projected/8816a3aa-9268-4201-9ad0-bc816fdaba11-kube-api-access-kmq7z\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.183724 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.183733 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8816a3aa-9268-4201-9ad0-bc816fdaba11-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.430911 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dt79q" event={"ID":"8816a3aa-9268-4201-9ad0-bc816fdaba11","Type":"ContainerDied","Data":"b2a6d60ac180e7237cc570d9e3ce07878fe370c6ed8d4f85797a8e711a06d792"} Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.431002 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2a6d60ac180e7237cc570d9e3ce07878fe370c6ed8d4f85797a8e711a06d792" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.431039 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dt79q" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.528589 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dt79q"] Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.542401 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dt79q"] Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.627854 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-bqsgq"] Mar 08 20:59:41 crc kubenswrapper[4885]: E0308 20:59:41.628303 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8816a3aa-9268-4201-9ad0-bc816fdaba11" containerName="keystone-bootstrap" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.628325 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8816a3aa-9268-4201-9ad0-bc816fdaba11" containerName="keystone-bootstrap" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.628503 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8816a3aa-9268-4201-9ad0-bc816fdaba11" containerName="keystone-bootstrap" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.629165 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.638081 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bqsgq"] Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.669598 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.670008 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.670243 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.670540 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pcbss" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.671128 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.693961 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gxzs\" (UniqueName: \"kubernetes.io/projected/8bd0921d-5173-43dd-ac53-0ec3417dce77-kube-api-access-8gxzs\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.694216 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-scripts\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.694425 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-fernet-keys\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.694512 
4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-combined-ca-bundle\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.694634 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-config-data\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.694696 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-credential-keys\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.796353 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-fernet-keys\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.796426 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-combined-ca-bundle\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.796489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-config-data\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.796521 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-credential-keys\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.796595 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gxzs\" (UniqueName: \"kubernetes.io/projected/8bd0921d-5173-43dd-ac53-0ec3417dce77-kube-api-access-8gxzs\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.796646 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-scripts\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.802467 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-scripts\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.802804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-combined-ca-bundle\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.806967 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-credential-keys\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.814624 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-fernet-keys\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.815208 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-config-data\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.818076 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gxzs\" (UniqueName: \"kubernetes.io/projected/8bd0921d-5173-43dd-ac53-0ec3417dce77-kube-api-access-8gxzs\") pod \"keystone-bootstrap-bqsgq\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:41 crc kubenswrapper[4885]: I0308 20:59:41.993818 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:42 crc kubenswrapper[4885]: I0308 20:59:42.923484 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-bqsgq"] Mar 08 20:59:43 crc kubenswrapper[4885]: I0308 20:59:43.388412 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8816a3aa-9268-4201-9ad0-bc816fdaba11" path="/var/lib/kubelet/pods/8816a3aa-9268-4201-9ad0-bc816fdaba11/volumes" Mar 08 20:59:43 crc kubenswrapper[4885]: I0308 20:59:43.449754 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqsgq" event={"ID":"8bd0921d-5173-43dd-ac53-0ec3417dce77","Type":"ContainerStarted","Data":"de07ff485f289c819ea06e6137cf2c359f8f6dec75a1a6a503e4f8a88ac8bac6"} Mar 08 20:59:43 crc kubenswrapper[4885]: I0308 20:59:43.450103 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqsgq" event={"ID":"8bd0921d-5173-43dd-ac53-0ec3417dce77","Type":"ContainerStarted","Data":"44c7f5c32ca6bb4cbe52759db36b680e8277e291b16a7ce4f458e870d81e9f1c"} Mar 08 20:59:43 crc kubenswrapper[4885]: I0308 20:59:43.474944 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-bqsgq" podStartSLOduration=2.474907733 podStartE2EDuration="2.474907733s" podCreationTimestamp="2026-03-08 20:59:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:43.469540029 +0000 UTC m=+5284.865594062" watchObservedRunningTime="2026-03-08 20:59:43.474907733 +0000 UTC m=+5284.870961776" Mar 08 20:59:44 crc kubenswrapper[4885]: I0308 20:59:44.444095 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 20:59:44 crc kubenswrapper[4885]: I0308 20:59:44.524139 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c977fd9ff-cg2sc"] Mar 08 20:59:44 crc kubenswrapper[4885]: I0308 20:59:44.524451 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" podUID="97015d9c-53b3-463a-8953-0c5338fbaefe" containerName="dnsmasq-dns" containerID="cri-o://4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4" gracePeriod=10 Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.011512 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.056357 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-config\") pod \"97015d9c-53b3-463a-8953-0c5338fbaefe\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.056410 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-dns-svc\") pod \"97015d9c-53b3-463a-8953-0c5338fbaefe\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.056451 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-sb\") pod \"97015d9c-53b3-463a-8953-0c5338fbaefe\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.056515 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2cnj\" (UniqueName: \"kubernetes.io/projected/97015d9c-53b3-463a-8953-0c5338fbaefe-kube-api-access-d2cnj\") pod \"97015d9c-53b3-463a-8953-0c5338fbaefe\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.056543 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-nb\") pod \"97015d9c-53b3-463a-8953-0c5338fbaefe\" (UID: \"97015d9c-53b3-463a-8953-0c5338fbaefe\") " Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.064093 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97015d9c-53b3-463a-8953-0c5338fbaefe-kube-api-access-d2cnj" (OuterVolumeSpecName: "kube-api-access-d2cnj") pod "97015d9c-53b3-463a-8953-0c5338fbaefe" (UID: "97015d9c-53b3-463a-8953-0c5338fbaefe"). InnerVolumeSpecName "kube-api-access-d2cnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.095192 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "97015d9c-53b3-463a-8953-0c5338fbaefe" (UID: "97015d9c-53b3-463a-8953-0c5338fbaefe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.100066 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "97015d9c-53b3-463a-8953-0c5338fbaefe" (UID: "97015d9c-53b3-463a-8953-0c5338fbaefe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.110368 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-config" (OuterVolumeSpecName: "config") pod "97015d9c-53b3-463a-8953-0c5338fbaefe" (UID: "97015d9c-53b3-463a-8953-0c5338fbaefe"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.112631 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "97015d9c-53b3-463a-8953-0c5338fbaefe" (UID: "97015d9c-53b3-463a-8953-0c5338fbaefe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.157697 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.157723 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2cnj\" (UniqueName: \"kubernetes.io/projected/97015d9c-53b3-463a-8953-0c5338fbaefe-kube-api-access-d2cnj\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.157735 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.157757 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-config\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.157767 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97015d9c-53b3-463a-8953-0c5338fbaefe-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.468376 4885 generic.go:334] "Generic (PLEG): container finished" podID="97015d9c-53b3-463a-8953-0c5338fbaefe" containerID="4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4" exitCode=0 Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.468477 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" event={"ID":"97015d9c-53b3-463a-8953-0c5338fbaefe","Type":"ContainerDied","Data":"4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4"} Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.468796 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" event={"ID":"97015d9c-53b3-463a-8953-0c5338fbaefe","Type":"ContainerDied","Data":"9baaed9043ae4e717157055effd88e4c50c4eb535de4ba919caa286fe87c1640"} Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.468831 4885 scope.go:117] "RemoveContainer" containerID="4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.468568 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c977fd9ff-cg2sc" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.504140 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c977fd9ff-cg2sc"] Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.518114 4885 scope.go:117] "RemoveContainer" containerID="478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.521597 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c977fd9ff-cg2sc"] Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.542139 4885 scope.go:117] "RemoveContainer" containerID="4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4" Mar 08 20:59:45 crc kubenswrapper[4885]: E0308 20:59:45.542454 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4\": container with ID starting with 4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4 not found: ID does not exist" containerID="4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.542486 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4"} err="failed to get container status \"4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4\": rpc error: code = NotFound desc = could not find container \"4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4\": container with ID starting with 4eaebb8a99b035af07106501e04a701ae2c578182c9bf29985cdc7ca99860de4 not found: ID does not exist" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.542507 4885 scope.go:117] "RemoveContainer" containerID="478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c" Mar 08 20:59:45 crc kubenswrapper[4885]: E0308 20:59:45.542880 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c\": container with ID starting with 478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c not found: ID does not exist" containerID="478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c" Mar 08 20:59:45 crc kubenswrapper[4885]: I0308 20:59:45.542905 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c"} err="failed to get container status \"478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c\": rpc error: code = NotFound desc = could not find container \"478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c\": container with ID starting with 478414728b0451428a96f17377d4b32f72e03a9cee34bbe2caff620732f8dc1c not found: ID does not exist" Mar 08 20:59:46 crc kubenswrapper[4885]: I0308 20:59:46.483529 4885 generic.go:334] "Generic (PLEG): container finished" podID="8bd0921d-5173-43dd-ac53-0ec3417dce77" containerID="de07ff485f289c819ea06e6137cf2c359f8f6dec75a1a6a503e4f8a88ac8bac6" exitCode=0 Mar 08 20:59:46 crc kubenswrapper[4885]: I0308 20:59:46.483578 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqsgq" 
event={"ID":"8bd0921d-5173-43dd-ac53-0ec3417dce77","Type":"ContainerDied","Data":"de07ff485f289c819ea06e6137cf2c359f8f6dec75a1a6a503e4f8a88ac8bac6"} Mar 08 20:59:47 crc kubenswrapper[4885]: I0308 20:59:47.386912 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97015d9c-53b3-463a-8953-0c5338fbaefe" path="/var/lib/kubelet/pods/97015d9c-53b3-463a-8953-0c5338fbaefe/volumes" Mar 08 20:59:47 crc kubenswrapper[4885]: I0308 20:59:47.859573 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.010776 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-fernet-keys\") pod \"8bd0921d-5173-43dd-ac53-0ec3417dce77\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.010844 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-scripts\") pod \"8bd0921d-5173-43dd-ac53-0ec3417dce77\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.010876 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-combined-ca-bundle\") pod \"8bd0921d-5173-43dd-ac53-0ec3417dce77\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.011014 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-config-data\") pod \"8bd0921d-5173-43dd-ac53-0ec3417dce77\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.011055 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8gxzs\" (UniqueName: \"kubernetes.io/projected/8bd0921d-5173-43dd-ac53-0ec3417dce77-kube-api-access-8gxzs\") pod \"8bd0921d-5173-43dd-ac53-0ec3417dce77\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.011091 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-credential-keys\") pod \"8bd0921d-5173-43dd-ac53-0ec3417dce77\" (UID: \"8bd0921d-5173-43dd-ac53-0ec3417dce77\") " Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.017802 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-scripts" (OuterVolumeSpecName: "scripts") pod "8bd0921d-5173-43dd-ac53-0ec3417dce77" (UID: "8bd0921d-5173-43dd-ac53-0ec3417dce77"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.018989 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bd0921d-5173-43dd-ac53-0ec3417dce77-kube-api-access-8gxzs" (OuterVolumeSpecName: "kube-api-access-8gxzs") pod "8bd0921d-5173-43dd-ac53-0ec3417dce77" (UID: "8bd0921d-5173-43dd-ac53-0ec3417dce77"). InnerVolumeSpecName "kube-api-access-8gxzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.019160 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "8bd0921d-5173-43dd-ac53-0ec3417dce77" (UID: "8bd0921d-5173-43dd-ac53-0ec3417dce77"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.019407 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8bd0921d-5173-43dd-ac53-0ec3417dce77" (UID: "8bd0921d-5173-43dd-ac53-0ec3417dce77"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.039109 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bd0921d-5173-43dd-ac53-0ec3417dce77" (UID: "8bd0921d-5173-43dd-ac53-0ec3417dce77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.055582 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-config-data" (OuterVolumeSpecName: "config-data") pod "8bd0921d-5173-43dd-ac53-0ec3417dce77" (UID: "8bd0921d-5173-43dd-ac53-0ec3417dce77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.112863 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.112909 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8gxzs\" (UniqueName: \"kubernetes.io/projected/8bd0921d-5173-43dd-ac53-0ec3417dce77-kube-api-access-8gxzs\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.112940 4885 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.112952 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.112962 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.112971 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd0921d-5173-43dd-ac53-0ec3417dce77-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.504178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-bqsgq" 
event={"ID":"8bd0921d-5173-43dd-ac53-0ec3417dce77","Type":"ContainerDied","Data":"44c7f5c32ca6bb4cbe52759db36b680e8277e291b16a7ce4f458e870d81e9f1c"} Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.504233 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44c7f5c32ca6bb4cbe52759db36b680e8277e291b16a7ce4f458e870d81e9f1c" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.504395 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-bqsgq" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.618442 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-868b8c986d-gxm79"] Mar 08 20:59:48 crc kubenswrapper[4885]: E0308 20:59:48.619057 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97015d9c-53b3-463a-8953-0c5338fbaefe" containerName="init" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.619080 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="97015d9c-53b3-463a-8953-0c5338fbaefe" containerName="init" Mar 08 20:59:48 crc kubenswrapper[4885]: E0308 20:59:48.619120 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97015d9c-53b3-463a-8953-0c5338fbaefe" containerName="dnsmasq-dns" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.619130 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="97015d9c-53b3-463a-8953-0c5338fbaefe" containerName="dnsmasq-dns" Mar 08 20:59:48 crc kubenswrapper[4885]: E0308 20:59:48.619148 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bd0921d-5173-43dd-ac53-0ec3417dce77" containerName="keystone-bootstrap" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.619158 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd0921d-5173-43dd-ac53-0ec3417dce77" containerName="keystone-bootstrap" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.619403 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bd0921d-5173-43dd-ac53-0ec3417dce77" containerName="keystone-bootstrap" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.619427 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="97015d9c-53b3-463a-8953-0c5338fbaefe" containerName="dnsmasq-dns" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.620585 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.623819 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.626768 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.627163 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.627386 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-pcbss" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.631711 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-868b8c986d-gxm79"] Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.722879 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-config-data\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.723025 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-combined-ca-bundle\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.723062 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-credential-keys\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.723092 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-scripts\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.723712 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4sws\" (UniqueName: \"kubernetes.io/projected/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-kube-api-access-b4sws\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.723783 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-fernet-keys\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.825212 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-combined-ca-bundle\") pod 
\"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.825284 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-credential-keys\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.825312 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-scripts\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.825359 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4sws\" (UniqueName: \"kubernetes.io/projected/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-kube-api-access-b4sws\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.825395 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-fernet-keys\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.825450 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-config-data\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.829628 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-credential-keys\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.829852 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-config-data\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.830804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-combined-ca-bundle\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.832785 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-scripts\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 
20:59:48.833718 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-fernet-keys\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.845273 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4sws\" (UniqueName: \"kubernetes.io/projected/65bf82e2-5440-45b2-b1ff-1f6998ce46f8-kube-api-access-b4sws\") pod \"keystone-868b8c986d-gxm79\" (UID: \"65bf82e2-5440-45b2-b1ff-1f6998ce46f8\") " pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:48 crc kubenswrapper[4885]: I0308 20:59:48.945283 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:49 crc kubenswrapper[4885]: I0308 20:59:49.209764 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-868b8c986d-gxm79"] Mar 08 20:59:49 crc kubenswrapper[4885]: I0308 20:59:49.522681 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-868b8c986d-gxm79" event={"ID":"65bf82e2-5440-45b2-b1ff-1f6998ce46f8","Type":"ContainerStarted","Data":"104827b6c72c1a8e7cd022fab280bd3e172f1884d754f2d955aad0d9afa3f9eb"} Mar 08 20:59:49 crc kubenswrapper[4885]: I0308 20:59:49.522964 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-868b8c986d-gxm79" event={"ID":"65bf82e2-5440-45b2-b1ff-1f6998ce46f8","Type":"ContainerStarted","Data":"e72c2714ba37458d0847554e1cc8423328a7dc1970d1cb54b19ab93f33fc6e3e"} Mar 08 20:59:49 crc kubenswrapper[4885]: I0308 20:59:49.522985 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-868b8c986d-gxm79" Mar 08 20:59:49 crc kubenswrapper[4885]: I0308 20:59:49.540812 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-868b8c986d-gxm79" podStartSLOduration=1.540795559 podStartE2EDuration="1.540795559s" podCreationTimestamp="2026-03-08 20:59:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 20:59:49.539384261 +0000 UTC m=+5290.935438284" watchObservedRunningTime="2026-03-08 20:59:49.540795559 +0000 UTC m=+5290.936849582" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.149264 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550060-ztwmh"] Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.150663 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.154131 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.155494 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.156038 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.163831 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"] Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.166369 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.170287 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.170780 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.174676 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550060-ztwmh"] Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.186323 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"] Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.345501 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxjc7\" (UniqueName: \"kubernetes.io/projected/a0100b61-a97f-40b6-b8fd-91499667f3d9-kube-api-access-wxjc7\") pod \"auto-csr-approver-29550060-ztwmh\" (UID: \"a0100b61-a97f-40b6-b8fd-91499667f3d9\") " pod="openshift-infra/auto-csr-approver-29550060-ztwmh" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.345592 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwdhf\" (UniqueName: \"kubernetes.io/projected/62c3bcc1-5dd6-411d-8030-a152617aa0a3-kube-api-access-hwdhf\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.345906 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62c3bcc1-5dd6-411d-8030-a152617aa0a3-config-volume\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.345999 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62c3bcc1-5dd6-411d-8030-a152617aa0a3-secret-volume\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" Mar 
08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.447671 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxjc7\" (UniqueName: \"kubernetes.io/projected/a0100b61-a97f-40b6-b8fd-91499667f3d9-kube-api-access-wxjc7\") pod \"auto-csr-approver-29550060-ztwmh\" (UID: \"a0100b61-a97f-40b6-b8fd-91499667f3d9\") " pod="openshift-infra/auto-csr-approver-29550060-ztwmh" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.447797 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwdhf\" (UniqueName: \"kubernetes.io/projected/62c3bcc1-5dd6-411d-8030-a152617aa0a3-kube-api-access-hwdhf\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.447960 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62c3bcc1-5dd6-411d-8030-a152617aa0a3-config-volume\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.448004 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62c3bcc1-5dd6-411d-8030-a152617aa0a3-secret-volume\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.449737 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62c3bcc1-5dd6-411d-8030-a152617aa0a3-config-volume\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.459395 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62c3bcc1-5dd6-411d-8030-a152617aa0a3-secret-volume\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.468263 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxjc7\" (UniqueName: \"kubernetes.io/projected/a0100b61-a97f-40b6-b8fd-91499667f3d9-kube-api-access-wxjc7\") pod \"auto-csr-approver-29550060-ztwmh\" (UID: \"a0100b61-a97f-40b6-b8fd-91499667f3d9\") " pod="openshift-infra/auto-csr-approver-29550060-ztwmh" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.477266 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwdhf\" (UniqueName: \"kubernetes.io/projected/62c3bcc1-5dd6-411d-8030-a152617aa0a3-kube-api-access-hwdhf\") pod \"collect-profiles-29550060-4d829\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.496765 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" Mar 08 21:00:00 crc kubenswrapper[4885]: I0308 21:00:00.504549 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" Mar 08 21:00:01 crc kubenswrapper[4885]: I0308 21:00:01.078383 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550060-ztwmh"] Mar 08 21:00:01 crc kubenswrapper[4885]: I0308 21:00:01.133228 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"] Mar 08 21:00:01 crc kubenswrapper[4885]: W0308 21:00:01.139813 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62c3bcc1_5dd6_411d_8030_a152617aa0a3.slice/crio-204b01726dbd3b56013b46b3a6e70491c2aaf33156b1d8429a412b4c2b7e81fc WatchSource:0}: Error finding container 204b01726dbd3b56013b46b3a6e70491c2aaf33156b1d8429a412b4c2b7e81fc: Status 404 returned error can't find the container with id 204b01726dbd3b56013b46b3a6e70491c2aaf33156b1d8429a412b4c2b7e81fc Mar 08 21:00:01 crc kubenswrapper[4885]: I0308 21:00:01.668780 4885 generic.go:334] "Generic (PLEG): container finished" podID="62c3bcc1-5dd6-411d-8030-a152617aa0a3" containerID="e07b444034fa8d1cd5c5dd9ad29413db942d04d6d0dccd4d4f03d228986183ea" exitCode=0 Mar 08 21:00:01 crc kubenswrapper[4885]: I0308 21:00:01.668848 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" event={"ID":"62c3bcc1-5dd6-411d-8030-a152617aa0a3","Type":"ContainerDied","Data":"e07b444034fa8d1cd5c5dd9ad29413db942d04d6d0dccd4d4f03d228986183ea"} Mar 08 21:00:01 crc kubenswrapper[4885]: I0308 21:00:01.669392 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" event={"ID":"62c3bcc1-5dd6-411d-8030-a152617aa0a3","Type":"ContainerStarted","Data":"204b01726dbd3b56013b46b3a6e70491c2aaf33156b1d8429a412b4c2b7e81fc"} Mar 08 21:00:01 crc kubenswrapper[4885]: I0308 21:00:01.671973 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" event={"ID":"a0100b61-a97f-40b6-b8fd-91499667f3d9","Type":"ContainerStarted","Data":"299d8e0735bd0570edd7a482b1c465695e0b3e1b085d547f6d5575811d194fae"} Mar 08 21:00:02 crc kubenswrapper[4885]: I0308 21:00:02.818696 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:00:02 crc kubenswrapper[4885]: I0308 21:00:02.819207 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.042906 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.197386 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwdhf\" (UniqueName: \"kubernetes.io/projected/62c3bcc1-5dd6-411d-8030-a152617aa0a3-kube-api-access-hwdhf\") pod \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.197590 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62c3bcc1-5dd6-411d-8030-a152617aa0a3-secret-volume\") pod \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.197634 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62c3bcc1-5dd6-411d-8030-a152617aa0a3-config-volume\") pod \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\" (UID: \"62c3bcc1-5dd6-411d-8030-a152617aa0a3\") " Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.199215 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62c3bcc1-5dd6-411d-8030-a152617aa0a3-config-volume" (OuterVolumeSpecName: "config-volume") pod "62c3bcc1-5dd6-411d-8030-a152617aa0a3" (UID: "62c3bcc1-5dd6-411d-8030-a152617aa0a3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.205213 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62c3bcc1-5dd6-411d-8030-a152617aa0a3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "62c3bcc1-5dd6-411d-8030-a152617aa0a3" (UID: "62c3bcc1-5dd6-411d-8030-a152617aa0a3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.205832 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62c3bcc1-5dd6-411d-8030-a152617aa0a3-kube-api-access-hwdhf" (OuterVolumeSpecName: "kube-api-access-hwdhf") pod "62c3bcc1-5dd6-411d-8030-a152617aa0a3" (UID: "62c3bcc1-5dd6-411d-8030-a152617aa0a3"). InnerVolumeSpecName "kube-api-access-hwdhf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.300485 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwdhf\" (UniqueName: \"kubernetes.io/projected/62c3bcc1-5dd6-411d-8030-a152617aa0a3-kube-api-access-hwdhf\") on node \"crc\" DevicePath \"\"" Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.300600 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/62c3bcc1-5dd6-411d-8030-a152617aa0a3-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.300654 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/62c3bcc1-5dd6-411d-8030-a152617aa0a3-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.691717 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" event={"ID":"62c3bcc1-5dd6-411d-8030-a152617aa0a3","Type":"ContainerDied","Data":"204b01726dbd3b56013b46b3a6e70491c2aaf33156b1d8429a412b4c2b7e81fc"} Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.691829 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="204b01726dbd3b56013b46b3a6e70491c2aaf33156b1d8429a412b4c2b7e81fc" Mar 08 21:00:03 crc kubenswrapper[4885]: I0308 21:00:03.691851 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829" Mar 08 21:00:04 crc kubenswrapper[4885]: I0308 21:00:04.132789 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb"] Mar 08 21:00:04 crc kubenswrapper[4885]: I0308 21:00:04.145196 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550015-5rkqb"] Mar 08 21:00:04 crc kubenswrapper[4885]: I0308 21:00:04.701959 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" event={"ID":"a0100b61-a97f-40b6-b8fd-91499667f3d9","Type":"ContainerStarted","Data":"c0d8f0f8a4a0c8ee7bf5279891872ae208bc8c52a779f98dff22752b9bff60d5"} Mar 08 21:00:04 crc kubenswrapper[4885]: I0308 21:00:04.726586 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" podStartSLOduration=1.6144750829999999 podStartE2EDuration="4.726559516s" podCreationTimestamp="2026-03-08 21:00:00 +0000 UTC" firstStartedPulling="2026-03-08 21:00:01.091533939 +0000 UTC m=+5302.487587992" lastFinishedPulling="2026-03-08 21:00:04.203618412 +0000 UTC m=+5305.599672425" observedRunningTime="2026-03-08 21:00:04.71881085 +0000 UTC m=+5306.114864903" watchObservedRunningTime="2026-03-08 21:00:04.726559516 +0000 UTC m=+5306.122613579" Mar 08 21:00:05 crc kubenswrapper[4885]: I0308 21:00:05.385586 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19d8979b-517e-4b02-8f5a-ead2361596ea" path="/var/lib/kubelet/pods/19d8979b-517e-4b02-8f5a-ead2361596ea/volumes" Mar 08 21:00:05 crc kubenswrapper[4885]: I0308 21:00:05.713298 4885 generic.go:334] "Generic (PLEG): container finished" podID="a0100b61-a97f-40b6-b8fd-91499667f3d9" containerID="c0d8f0f8a4a0c8ee7bf5279891872ae208bc8c52a779f98dff22752b9bff60d5" exitCode=0 Mar 08 21:00:05 crc kubenswrapper[4885]: I0308 
21:00:05.713352 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" event={"ID":"a0100b61-a97f-40b6-b8fd-91499667f3d9","Type":"ContainerDied","Data":"c0d8f0f8a4a0c8ee7bf5279891872ae208bc8c52a779f98dff22752b9bff60d5"} Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.109743 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.269457 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxjc7\" (UniqueName: \"kubernetes.io/projected/a0100b61-a97f-40b6-b8fd-91499667f3d9-kube-api-access-wxjc7\") pod \"a0100b61-a97f-40b6-b8fd-91499667f3d9\" (UID: \"a0100b61-a97f-40b6-b8fd-91499667f3d9\") " Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.276030 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0100b61-a97f-40b6-b8fd-91499667f3d9-kube-api-access-wxjc7" (OuterVolumeSpecName: "kube-api-access-wxjc7") pod "a0100b61-a97f-40b6-b8fd-91499667f3d9" (UID: "a0100b61-a97f-40b6-b8fd-91499667f3d9"). InnerVolumeSpecName "kube-api-access-wxjc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.371872 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxjc7\" (UniqueName: \"kubernetes.io/projected/a0100b61-a97f-40b6-b8fd-91499667f3d9-kube-api-access-wxjc7\") on node \"crc\" DevicePath \"\"" Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.738575 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" event={"ID":"a0100b61-a97f-40b6-b8fd-91499667f3d9","Type":"ContainerDied","Data":"299d8e0735bd0570edd7a482b1c465695e0b3e1b085d547f6d5575811d194fae"} Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.739082 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="299d8e0735bd0570edd7a482b1c465695e0b3e1b085d547f6d5575811d194fae" Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.738713 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550060-ztwmh" Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.806443 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550054-wsncs"] Mar 08 21:00:07 crc kubenswrapper[4885]: I0308 21:00:07.815490 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550054-wsncs"] Mar 08 21:00:09 crc kubenswrapper[4885]: I0308 21:00:09.386121 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b" path="/var/lib/kubelet/pods/66fb03cd-a32f-4aa6-90dc-a0d2b2e9265b/volumes" Mar 08 21:00:20 crc kubenswrapper[4885]: I0308 21:00:20.382157 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-868b8c986d-gxm79" Mar 08 21:00:21 crc kubenswrapper[4885]: I0308 21:00:21.822630 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-66zgf" podUID="d5136d34-82a8-47c5-9d7d-09e0206587e8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:00:21 crc kubenswrapper[4885]: I0308 21:00:21.860587 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-hq28v" podUID="dca42faa-df32-44b5-99e8-109120aa36a1" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.151123 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.151868 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62c3bcc1-5dd6-411d-8030-a152617aa0a3" containerName="collect-profiles" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.151917 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="62c3bcc1-5dd6-411d-8030-a152617aa0a3" containerName="collect-profiles" Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.151994 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0100b61-a97f-40b6-b8fd-91499667f3d9" containerName="oc" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.152012 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0100b61-a97f-40b6-b8fd-91499667f3d9" containerName="oc" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.152443 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0100b61-a97f-40b6-b8fd-91499667f3d9" containerName="oc" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.152496 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="62c3bcc1-5dd6-411d-8030-a152617aa0a3" containerName="collect-profiles" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.153718 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.156730 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.158017 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-q5rnq" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.164347 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.169214 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.179886 4885 reflector.go:158] "Unhandled Error" err="object-\"openstack\"/\"openstack-config\": Failed to watch *v1.ConfigMap: unknown (get configmaps)" logger="UnhandledError" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.185369 4885 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"243135a1-f055-4b63-b640-6f751ce8bd08\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T21:00:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T21:00:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T21:00:22Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-08T21:00:22Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/podified-antelope-centos9/openstack-openstackclient@sha256:2b0a455c844ca790160c48aed1aaf8bc69ceb4b9ed4a4fa1717114e6e2e2fda9\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cxzhn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-08T21:00:22Z\\\"}}\" for pod \"openstack\"/\"openstackclient\": pods \"openstackclient\" not found" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.186493 
4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.187039 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cxzhn openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[kube-api-access-cxzhn openstack-config openstack-config-secret]: context canceled" pod="openstack/openstackclient" podUID="243135a1-f055-4b63-b640-6f751ce8bd08" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.199774 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.219497 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.220725 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.230050 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="243135a1-f055-4b63-b640-6f751ce8bd08" podUID="f7e4501e-3805-4590-b759-f520d3f98787" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.232939 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.247076 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config-secret\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.247134 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxzhn\" (UniqueName: \"kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.247208 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.349096 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.349174 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config-secret\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.349203 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.349228 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2tgf\" (UniqueName: \"kubernetes.io/projected/f7e4501e-3805-4590-b759-f520d3f98787-kube-api-access-v2tgf\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.349249 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxzhn\" (UniqueName: \"kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.349301 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config-secret\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.350041 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.352361 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cxzhn for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (243135a1-f055-4b63-b640-6f751ce8bd08) does not match the UID in record. The object might have been deleted and then recreated Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.352418 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn podName:243135a1-f055-4b63-b640-6f751ce8bd08 nodeName:}" failed. No retries permitted until 2026-03-08 21:00:22.852402677 +0000 UTC m=+5324.248456700 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cxzhn" (UniqueName: "kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn") pod "openstackclient" (UID: "243135a1-f055-4b63-b640-6f751ce8bd08") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (243135a1-f055-4b63-b640-6f751ce8bd08) does not match the UID in record. 
The object might have been deleted and then recreated Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.359911 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config-secret\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.450484 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.450569 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2tgf\" (UniqueName: \"kubernetes.io/projected/f7e4501e-3805-4590-b759-f520d3f98787-kube-api-access-v2tgf\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.450817 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config-secret\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.452011 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.456341 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config-secret\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.479259 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2tgf\" (UniqueName: \"kubernetes.io/projected/f7e4501e-3805-4590-b759-f520d3f98787-kube-api-access-v2tgf\") pod \"openstackclient\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.585611 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.863488 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.890072 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxzhn\" (UniqueName: \"kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn\") pod \"openstackclient\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.891491 4885 projected.go:194] Error preparing data for projected volume kube-api-access-cxzhn for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (243135a1-f055-4b63-b640-6f751ce8bd08) does not match the UID in record. The object might have been deleted and then recreated Mar 08 21:00:22 crc kubenswrapper[4885]: E0308 21:00:22.891568 4885 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn podName:243135a1-f055-4b63-b640-6f751ce8bd08 nodeName:}" failed. No retries permitted until 2026-03-08 21:00:23.891547574 +0000 UTC m=+5325.287601617 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cxzhn" (UniqueName: "kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn") pod "openstackclient" (UID: "243135a1-f055-4b63-b640-6f751ce8bd08") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (243135a1-f055-4b63-b640-6f751ce8bd08) does not match the UID in record. The object might have been deleted and then recreated Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.900814 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.901402 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f7e4501e-3805-4590-b759-f520d3f98787","Type":"ContainerStarted","Data":"5df35decd7c07684767af516967f3d25c57bef185a8f0f1d37e89e339fed67be"} Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.906524 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="243135a1-f055-4b63-b640-6f751ce8bd08" podUID="f7e4501e-3805-4590-b759-f520d3f98787" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.909873 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.912082 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="243135a1-f055-4b63-b640-6f751ce8bd08" podUID="f7e4501e-3805-4590-b759-f520d3f98787" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.991578 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config\") pod \"243135a1-f055-4b63-b640-6f751ce8bd08\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.991681 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config-secret\") pod \"243135a1-f055-4b63-b640-6f751ce8bd08\" (UID: \"243135a1-f055-4b63-b640-6f751ce8bd08\") " Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.992093 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxzhn\" (UniqueName: \"kubernetes.io/projected/243135a1-f055-4b63-b640-6f751ce8bd08-kube-api-access-cxzhn\") on node \"crc\" DevicePath \"\"" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.992257 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "243135a1-f055-4b63-b640-6f751ce8bd08" (UID: "243135a1-f055-4b63-b640-6f751ce8bd08"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:00:22 crc kubenswrapper[4885]: I0308 21:00:22.996047 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "243135a1-f055-4b63-b640-6f751ce8bd08" (UID: "243135a1-f055-4b63-b640-6f751ce8bd08"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.093783 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.093811 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/243135a1-f055-4b63-b640-6f751ce8bd08-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.219704 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.383511 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="243135a1-f055-4b63-b640-6f751ce8bd08" path="/var/lib/kubelet/pods/243135a1-f055-4b63-b640-6f751ce8bd08/volumes" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.912666 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.912682 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f7e4501e-3805-4590-b759-f520d3f98787","Type":"ContainerStarted","Data":"974e17a17c8c2918732ff271aeb4290a267934c8e410394a48e09833b501694e"} Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.916512 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="243135a1-f055-4b63-b640-6f751ce8bd08" podUID="f7e4501e-3805-4590-b759-f520d3f98787" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.940045 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="243135a1-f055-4b63-b640-6f751ce8bd08" podUID="f7e4501e-3805-4590-b759-f520d3f98787" Mar 08 21:00:23 crc kubenswrapper[4885]: I0308 21:00:23.948899 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.948878967 podStartE2EDuration="1.948878967s" podCreationTimestamp="2026-03-08 21:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:00:23.935442468 +0000 UTC m=+5325.331496581" watchObservedRunningTime="2026-03-08 21:00:23.948878967 +0000 UTC m=+5325.344933000" Mar 08 21:00:32 crc kubenswrapper[4885]: I0308 21:00:32.818234 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:00:32 crc kubenswrapper[4885]: I0308 21:00:32.818961 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.151831 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29550061-rh8tm"] Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.153562 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.164538 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29550061-rh8tm"] Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.224764 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsdz2\" (UniqueName: \"kubernetes.io/projected/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-kube-api-access-xsdz2\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.224885 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-fernet-keys\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.224958 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-config-data\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.225163 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-combined-ca-bundle\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.326769 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsdz2\" (UniqueName: \"kubernetes.io/projected/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-kube-api-access-xsdz2\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.326817 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-fernet-keys\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.326853 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-config-data\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.326914 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-combined-ca-bundle\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.332441 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-combined-ca-bundle\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.335663 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-fernet-keys\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.347393 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-config-data\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.354563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsdz2\" (UniqueName: \"kubernetes.io/projected/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-kube-api-access-xsdz2\") pod \"keystone-cron-29550061-rh8tm\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:00 crc kubenswrapper[4885]: I0308 21:01:00.507465 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:01 crc kubenswrapper[4885]: I0308 21:01:01.008112 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29550061-rh8tm"] Mar 08 21:01:01 crc kubenswrapper[4885]: I0308 21:01:01.289129 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550061-rh8tm" event={"ID":"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5","Type":"ContainerStarted","Data":"a564a36dc2b66e16038c0066ec717a9b0d3f0037f989ced3d1b5cec263d91d8d"} Mar 08 21:01:01 crc kubenswrapper[4885]: I0308 21:01:01.289209 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550061-rh8tm" event={"ID":"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5","Type":"ContainerStarted","Data":"97c35cb9aa0b7a6e145d576e3de6363972b8381ff34cfbe770dd0514bd03ac8f"} Mar 08 21:01:01 crc kubenswrapper[4885]: I0308 21:01:01.322269 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29550061-rh8tm" podStartSLOduration=1.322242705 podStartE2EDuration="1.322242705s" podCreationTimestamp="2026-03-08 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:01:01.308650182 +0000 UTC m=+5362.704704245" watchObservedRunningTime="2026-03-08 21:01:01.322242705 +0000 UTC m=+5362.718296768" Mar 08 21:01:02 crc kubenswrapper[4885]: I0308 21:01:02.817873 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:01:02 crc kubenswrapper[4885]: I0308 21:01:02.818229 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:01:02 crc kubenswrapper[4885]: I0308 21:01:02.818277 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:01:02 crc kubenswrapper[4885]: I0308 21:01:02.818885 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:01:02 crc kubenswrapper[4885]: I0308 21:01:02.818992 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" gracePeriod=600 Mar 08 21:01:02 crc kubenswrapper[4885]: E0308 21:01:02.960237 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:01:02 crc kubenswrapper[4885]: E0308 21:01:02.975445 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e397c25_ae37_4c30_83ce_3bdb83f5b9c5.slice/crio-a564a36dc2b66e16038c0066ec717a9b0d3f0037f989ced3d1b5cec263d91d8d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c5dda3b_3e01_4bb4_af02_b0f4eeadda58.slice/crio-conmon-088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e397c25_ae37_4c30_83ce_3bdb83f5b9c5.slice/crio-conmon-a564a36dc2b66e16038c0066ec717a9b0d3f0037f989ced3d1b5cec263d91d8d.scope\": RecentStats: unable to find data in memory cache]" Mar 08 21:01:03 crc kubenswrapper[4885]: I0308 21:01:03.317102 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" exitCode=0 Mar 08 21:01:03 crc kubenswrapper[4885]: I0308 21:01:03.317168 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080"} Mar 08 21:01:03 crc kubenswrapper[4885]: I0308 21:01:03.317642 4885 scope.go:117] "RemoveContainer" containerID="b63b0ab95208c6fa0889efffa8bab4195db0658e5a1af71b9988f5d8b91fa038" Mar 08 21:01:03 crc kubenswrapper[4885]: I0308 21:01:03.318535 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 
21:01:03 crc kubenswrapper[4885]: E0308 21:01:03.318984 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:01:03 crc kubenswrapper[4885]: I0308 21:01:03.321174 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550061-rh8tm" event={"ID":"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5","Type":"ContainerDied","Data":"a564a36dc2b66e16038c0066ec717a9b0d3f0037f989ced3d1b5cec263d91d8d"} Mar 08 21:01:03 crc kubenswrapper[4885]: I0308 21:01:03.321217 4885 generic.go:334] "Generic (PLEG): container finished" podID="7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" containerID="a564a36dc2b66e16038c0066ec717a9b0d3f0037f989ced3d1b5cec263d91d8d" exitCode=0 Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.585306 4885 scope.go:117] "RemoveContainer" containerID="26e42e6d76089e28dc85056b673d6bddbefe770761a003ab85ecc351d49b7771" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.630335 4885 scope.go:117] "RemoveContainer" containerID="9b23f86db419001dec3042d5f280866857f260d2b86edbe13a17fd8cd9ba2fd4" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.723375 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.803754 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-combined-ca-bundle\") pod \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.803829 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-fernet-keys\") pod \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.803892 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsdz2\" (UniqueName: \"kubernetes.io/projected/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-kube-api-access-xsdz2\") pod \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.803944 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-config-data\") pod \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\" (UID: \"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5\") " Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.809837 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-kube-api-access-xsdz2" (OuterVolumeSpecName: "kube-api-access-xsdz2") pod "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" (UID: "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5"). InnerVolumeSpecName "kube-api-access-xsdz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.823100 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" (UID: "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.854492 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" (UID: "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.866516 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-config-data" (OuterVolumeSpecName: "config-data") pod "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" (UID: "7e397c25-ae37-4c30-83ce-3bdb83f5b9c5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.907047 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.907099 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.907115 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 08 21:01:04 crc kubenswrapper[4885]: I0308 21:01:04.907127 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsdz2\" (UniqueName: \"kubernetes.io/projected/7e397c25-ae37-4c30-83ce-3bdb83f5b9c5-kube-api-access-xsdz2\") on node \"crc\" DevicePath \"\"" Mar 08 21:01:05 crc kubenswrapper[4885]: I0308 21:01:05.348139 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550061-rh8tm" event={"ID":"7e397c25-ae37-4c30-83ce-3bdb83f5b9c5","Type":"ContainerDied","Data":"97c35cb9aa0b7a6e145d576e3de6363972b8381ff34cfbe770dd0514bd03ac8f"} Mar 08 21:01:05 crc kubenswrapper[4885]: I0308 21:01:05.348227 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29550061-rh8tm" Mar 08 21:01:05 crc kubenswrapper[4885]: I0308 21:01:05.348639 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97c35cb9aa0b7a6e145d576e3de6363972b8381ff34cfbe770dd0514bd03ac8f" Mar 08 21:01:14 crc kubenswrapper[4885]: I0308 21:01:14.368740 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:01:14 crc kubenswrapper[4885]: E0308 21:01:14.369830 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:01:26 crc kubenswrapper[4885]: I0308 21:01:26.369168 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:01:26 crc kubenswrapper[4885]: E0308 21:01:26.369951 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:01:31 crc kubenswrapper[4885]: I0308 21:01:31.096596 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-k2nvt"] Mar 08 21:01:31 crc kubenswrapper[4885]: I0308 21:01:31.106145 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-k2nvt"] Mar 08 21:01:31 crc kubenswrapper[4885]: I0308 21:01:31.408913 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dd375ae-21ae-4fb9-87dc-f6a1a205736f" path="/var/lib/kubelet/pods/1dd375ae-21ae-4fb9-87dc-f6a1a205736f/volumes" Mar 08 21:01:40 crc kubenswrapper[4885]: I0308 21:01:40.382035 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:01:40 crc kubenswrapper[4885]: E0308 21:01:40.383728 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:01:52 crc kubenswrapper[4885]: I0308 21:01:52.369479 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:01:52 crc kubenswrapper[4885]: E0308 21:01:52.370436 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.150062 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550062-kdw5m"] Mar 08 21:02:00 crc kubenswrapper[4885]: E0308 21:02:00.151544 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" containerName="keystone-cron" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.151579 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" containerName="keystone-cron" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.151950 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e397c25-ae37-4c30-83ce-3bdb83f5b9c5" containerName="keystone-cron" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.152768 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550062-kdw5m" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.162145 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.162231 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.162454 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550062-kdw5m"] Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.162569 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.280375 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzlth\" (UniqueName: \"kubernetes.io/projected/86555f65-5ef4-4c45-9ac3-9b561d985b57-kube-api-access-tzlth\") pod \"auto-csr-approver-29550062-kdw5m\" (UID: \"86555f65-5ef4-4c45-9ac3-9b561d985b57\") " pod="openshift-infra/auto-csr-approver-29550062-kdw5m" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.382272 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzlth\" (UniqueName: \"kubernetes.io/projected/86555f65-5ef4-4c45-9ac3-9b561d985b57-kube-api-access-tzlth\") pod \"auto-csr-approver-29550062-kdw5m\" (UID: \"86555f65-5ef4-4c45-9ac3-9b561d985b57\") " pod="openshift-infra/auto-csr-approver-29550062-kdw5m" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.402155 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzlth\" (UniqueName: \"kubernetes.io/projected/86555f65-5ef4-4c45-9ac3-9b561d985b57-kube-api-access-tzlth\") pod \"auto-csr-approver-29550062-kdw5m\" (UID: \"86555f65-5ef4-4c45-9ac3-9b561d985b57\") " pod="openshift-infra/auto-csr-approver-29550062-kdw5m" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.479792 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550062-kdw5m" Mar 08 21:02:00 crc kubenswrapper[4885]: I0308 21:02:00.968960 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550062-kdw5m"] Mar 08 21:02:01 crc kubenswrapper[4885]: I0308 21:02:01.897426 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550062-kdw5m" event={"ID":"86555f65-5ef4-4c45-9ac3-9b561d985b57","Type":"ContainerStarted","Data":"93ccdabcc664dc32500aa049baa7622b407db997e21445ce32227e0dad61f2b7"} Mar 08 21:02:02 crc kubenswrapper[4885]: I0308 21:02:02.922638 4885 generic.go:334] "Generic (PLEG): container finished" podID="86555f65-5ef4-4c45-9ac3-9b561d985b57" containerID="f10312aca3cd4fa64b1d669949edd0e9d6f21408d583e376501255867513b217" exitCode=0 Mar 08 21:02:02 crc kubenswrapper[4885]: I0308 21:02:02.922849 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550062-kdw5m" event={"ID":"86555f65-5ef4-4c45-9ac3-9b561d985b57","Type":"ContainerDied","Data":"f10312aca3cd4fa64b1d669949edd0e9d6f21408d583e376501255867513b217"} Mar 08 21:02:03 crc kubenswrapper[4885]: I0308 21:02:03.368423 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:02:03 crc kubenswrapper[4885]: E0308 21:02:03.368868 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.425344 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550062-kdw5m" Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.566613 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzlth\" (UniqueName: \"kubernetes.io/projected/86555f65-5ef4-4c45-9ac3-9b561d985b57-kube-api-access-tzlth\") pod \"86555f65-5ef4-4c45-9ac3-9b561d985b57\" (UID: \"86555f65-5ef4-4c45-9ac3-9b561d985b57\") " Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.571322 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86555f65-5ef4-4c45-9ac3-9b561d985b57-kube-api-access-tzlth" (OuterVolumeSpecName: "kube-api-access-tzlth") pod "86555f65-5ef4-4c45-9ac3-9b561d985b57" (UID: "86555f65-5ef4-4c45-9ac3-9b561d985b57"). InnerVolumeSpecName "kube-api-access-tzlth". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.669415 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzlth\" (UniqueName: \"kubernetes.io/projected/86555f65-5ef4-4c45-9ac3-9b561d985b57-kube-api-access-tzlth\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.831759 4885 scope.go:117] "RemoveContainer" containerID="d3ae625b11e0cf7052090345483fbacc9c9a2ab6adc1e7e832c166efaabc3867" Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.952128 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550062-kdw5m" Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.952263 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550062-kdw5m" event={"ID":"86555f65-5ef4-4c45-9ac3-9b561d985b57","Type":"ContainerDied","Data":"93ccdabcc664dc32500aa049baa7622b407db997e21445ce32227e0dad61f2b7"} Mar 08 21:02:04 crc kubenswrapper[4885]: I0308 21:02:04.952339 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93ccdabcc664dc32500aa049baa7622b407db997e21445ce32227e0dad61f2b7" Mar 08 21:02:05 crc kubenswrapper[4885]: I0308 21:02:05.512680 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550056-kpvwx"] Mar 08 21:02:05 crc kubenswrapper[4885]: I0308 21:02:05.518721 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550056-kpvwx"] Mar 08 21:02:07 crc kubenswrapper[4885]: I0308 21:02:07.386471 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2" path="/var/lib/kubelet/pods/97098aa1-1dc7-4efc-b2a2-0c0a97ae36f2/volumes" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.626644 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-2750-account-create-update-m8vl9"] Mar 08 21:02:08 crc kubenswrapper[4885]: E0308 21:02:08.627413 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86555f65-5ef4-4c45-9ac3-9b561d985b57" containerName="oc" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.627428 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86555f65-5ef4-4c45-9ac3-9b561d985b57" containerName="oc" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.627635 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86555f65-5ef4-4c45-9ac3-9b561d985b57" containerName="oc" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.628338 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.633296 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.674357 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/379d344d-9828-4fae-a4f4-5712113f506d-operator-scripts\") pod \"barbican-2750-account-create-update-m8vl9\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.674429 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt9gr\" (UniqueName: \"kubernetes.io/projected/379d344d-9828-4fae-a4f4-5712113f506d-kube-api-access-kt9gr\") pod \"barbican-2750-account-create-update-m8vl9\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.674562 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-bd9sr"] Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.675503 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.697818 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2750-account-create-update-m8vl9"] Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.727162 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bd9sr"] Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.775872 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/379d344d-9828-4fae-a4f4-5712113f506d-operator-scripts\") pod \"barbican-2750-account-create-update-m8vl9\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.775962 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt9gr\" (UniqueName: \"kubernetes.io/projected/379d344d-9828-4fae-a4f4-5712113f506d-kube-api-access-kt9gr\") pod \"barbican-2750-account-create-update-m8vl9\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.776025 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9zq7\" (UniqueName: \"kubernetes.io/projected/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-kube-api-access-g9zq7\") pod \"barbican-db-create-bd9sr\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.776046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-operator-scripts\") pod \"barbican-db-create-bd9sr\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.776711 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/379d344d-9828-4fae-a4f4-5712113f506d-operator-scripts\") pod \"barbican-2750-account-create-update-m8vl9\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.807268 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt9gr\" (UniqueName: \"kubernetes.io/projected/379d344d-9828-4fae-a4f4-5712113f506d-kube-api-access-kt9gr\") pod \"barbican-2750-account-create-update-m8vl9\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.877542 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9zq7\" (UniqueName: \"kubernetes.io/projected/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-kube-api-access-g9zq7\") pod \"barbican-db-create-bd9sr\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.877589 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-operator-scripts\") pod \"barbican-db-create-bd9sr\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.878244 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-operator-scripts\") pod \"barbican-db-create-bd9sr\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:08 crc kubenswrapper[4885]: I0308 21:02:08.898191 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9zq7\" (UniqueName: \"kubernetes.io/projected/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-kube-api-access-g9zq7\") pod \"barbican-db-create-bd9sr\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:09 crc kubenswrapper[4885]: I0308 21:02:09.027983 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:09 crc kubenswrapper[4885]: I0308 21:02:09.054703 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:09 crc kubenswrapper[4885]: I0308 21:02:09.490216 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-2750-account-create-update-m8vl9"] Mar 08 21:02:09 crc kubenswrapper[4885]: I0308 21:02:09.553870 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bd9sr"] Mar 08 21:02:09 crc kubenswrapper[4885]: W0308 21:02:09.561091 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf6b9c40_d823_4cb8_aadd_4f2aee7bd899.slice/crio-f8f2616a7dc653311d49558a7f6f48e68440f6f532c18471f16d13aceed02989 WatchSource:0}: Error finding container f8f2616a7dc653311d49558a7f6f48e68440f6f532c18471f16d13aceed02989: Status 404 returned error can't find the container with id f8f2616a7dc653311d49558a7f6f48e68440f6f532c18471f16d13aceed02989 Mar 08 21:02:09 crc kubenswrapper[4885]: I0308 21:02:09.997829 4885 generic.go:334] "Generic (PLEG): container finished" podID="379d344d-9828-4fae-a4f4-5712113f506d" containerID="005e15265fa043b9b659e044fe35d74117bca8b49d4e6e5ad4ce0be3aeda6fee" exitCode=0 Mar 08 21:02:09 crc kubenswrapper[4885]: I0308 21:02:09.998008 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2750-account-create-update-m8vl9" event={"ID":"379d344d-9828-4fae-a4f4-5712113f506d","Type":"ContainerDied","Data":"005e15265fa043b9b659e044fe35d74117bca8b49d4e6e5ad4ce0be3aeda6fee"} Mar 08 21:02:09 crc kubenswrapper[4885]: I0308 21:02:09.998209 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2750-account-create-update-m8vl9" event={"ID":"379d344d-9828-4fae-a4f4-5712113f506d","Type":"ContainerStarted","Data":"a5a2fb2544bb95844b76ec2e721c2ed06a33669f04a2220663ef3871f94f70ed"} Mar 08 21:02:10 crc kubenswrapper[4885]: I0308 21:02:10.001427 4885 generic.go:334] "Generic (PLEG): container finished" podID="df6b9c40-d823-4cb8-aadd-4f2aee7bd899" containerID="5238776febd95a86282109260190e0f71b38f87e63e8af8a383a946420238586" exitCode=0 Mar 08 21:02:10 crc kubenswrapper[4885]: I0308 21:02:10.001478 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bd9sr" 
event={"ID":"df6b9c40-d823-4cb8-aadd-4f2aee7bd899","Type":"ContainerDied","Data":"5238776febd95a86282109260190e0f71b38f87e63e8af8a383a946420238586"} Mar 08 21:02:10 crc kubenswrapper[4885]: I0308 21:02:10.001515 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bd9sr" event={"ID":"df6b9c40-d823-4cb8-aadd-4f2aee7bd899","Type":"ContainerStarted","Data":"f8f2616a7dc653311d49558a7f6f48e68440f6f532c18471f16d13aceed02989"} Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.467028 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.474014 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.643027 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9zq7\" (UniqueName: \"kubernetes.io/projected/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-kube-api-access-g9zq7\") pod \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.643108 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/379d344d-9828-4fae-a4f4-5712113f506d-operator-scripts\") pod \"379d344d-9828-4fae-a4f4-5712113f506d\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.643159 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt9gr\" (UniqueName: \"kubernetes.io/projected/379d344d-9828-4fae-a4f4-5712113f506d-kube-api-access-kt9gr\") pod \"379d344d-9828-4fae-a4f4-5712113f506d\" (UID: \"379d344d-9828-4fae-a4f4-5712113f506d\") " Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.643189 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-operator-scripts\") pod \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\" (UID: \"df6b9c40-d823-4cb8-aadd-4f2aee7bd899\") " Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.644228 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df6b9c40-d823-4cb8-aadd-4f2aee7bd899" (UID: "df6b9c40-d823-4cb8-aadd-4f2aee7bd899"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.644378 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/379d344d-9828-4fae-a4f4-5712113f506d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "379d344d-9828-4fae-a4f4-5712113f506d" (UID: "379d344d-9828-4fae-a4f4-5712113f506d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.653168 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-kube-api-access-g9zq7" (OuterVolumeSpecName: "kube-api-access-g9zq7") pod "df6b9c40-d823-4cb8-aadd-4f2aee7bd899" (UID: "df6b9c40-d823-4cb8-aadd-4f2aee7bd899"). InnerVolumeSpecName "kube-api-access-g9zq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.656330 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379d344d-9828-4fae-a4f4-5712113f506d-kube-api-access-kt9gr" (OuterVolumeSpecName: "kube-api-access-kt9gr") pod "379d344d-9828-4fae-a4f4-5712113f506d" (UID: "379d344d-9828-4fae-a4f4-5712113f506d"). InnerVolumeSpecName "kube-api-access-kt9gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.745397 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt9gr\" (UniqueName: \"kubernetes.io/projected/379d344d-9828-4fae-a4f4-5712113f506d-kube-api-access-kt9gr\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.745460 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.745476 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9zq7\" (UniqueName: \"kubernetes.io/projected/df6b9c40-d823-4cb8-aadd-4f2aee7bd899-kube-api-access-g9zq7\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:11 crc kubenswrapper[4885]: I0308 21:02:11.745491 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/379d344d-9828-4fae-a4f4-5712113f506d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:12 crc kubenswrapper[4885]: I0308 21:02:12.020462 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-2750-account-create-update-m8vl9" event={"ID":"379d344d-9828-4fae-a4f4-5712113f506d","Type":"ContainerDied","Data":"a5a2fb2544bb95844b76ec2e721c2ed06a33669f04a2220663ef3871f94f70ed"} Mar 08 21:02:12 crc kubenswrapper[4885]: I0308 21:02:12.020529 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5a2fb2544bb95844b76ec2e721c2ed06a33669f04a2220663ef3871f94f70ed" Mar 08 21:02:12 crc kubenswrapper[4885]: I0308 21:02:12.020654 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-2750-account-create-update-m8vl9" Mar 08 21:02:12 crc kubenswrapper[4885]: I0308 21:02:12.023797 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bd9sr" event={"ID":"df6b9c40-d823-4cb8-aadd-4f2aee7bd899","Type":"ContainerDied","Data":"f8f2616a7dc653311d49558a7f6f48e68440f6f532c18471f16d13aceed02989"} Mar 08 21:02:12 crc kubenswrapper[4885]: I0308 21:02:12.024030 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8f2616a7dc653311d49558a7f6f48e68440f6f532c18471f16d13aceed02989" Mar 08 21:02:12 crc kubenswrapper[4885]: I0308 21:02:12.023892 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bd9sr" Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.993346 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-lkccb"] Mar 08 21:02:13 crc kubenswrapper[4885]: E0308 21:02:13.993972 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df6b9c40-d823-4cb8-aadd-4f2aee7bd899" containerName="mariadb-database-create" Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.993987 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="df6b9c40-d823-4cb8-aadd-4f2aee7bd899" containerName="mariadb-database-create" Mar 08 21:02:13 crc kubenswrapper[4885]: E0308 21:02:13.994026 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379d344d-9828-4fae-a4f4-5712113f506d" containerName="mariadb-account-create-update" Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.994034 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="379d344d-9828-4fae-a4f4-5712113f506d" containerName="mariadb-account-create-update" Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.994214 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="df6b9c40-d823-4cb8-aadd-4f2aee7bd899" containerName="mariadb-database-create" Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.994238 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="379d344d-9828-4fae-a4f4-5712113f506d" containerName="mariadb-account-create-update" Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.994836 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.997586 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 08 21:02:13 crc kubenswrapper[4885]: I0308 21:02:13.999537 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bnc8w" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.021083 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lkccb"] Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.192718 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-combined-ca-bundle\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.192785 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-db-sync-config-data\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.192853 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mzn5\" (UniqueName: \"kubernetes.io/projected/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-kube-api-access-6mzn5\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.295103 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mzn5\" 
(UniqueName: \"kubernetes.io/projected/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-kube-api-access-6mzn5\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.295275 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-combined-ca-bundle\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.295353 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-db-sync-config-data\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.300901 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-db-sync-config-data\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.301411 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-combined-ca-bundle\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.315340 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mzn5\" (UniqueName: \"kubernetes.io/projected/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-kube-api-access-6mzn5\") pod \"barbican-db-sync-lkccb\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.360872 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:14 crc kubenswrapper[4885]: I0308 21:02:14.807337 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lkccb"] Mar 08 21:02:15 crc kubenswrapper[4885]: I0308 21:02:15.079781 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lkccb" event={"ID":"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46","Type":"ContainerStarted","Data":"0976abd329fd9ab97b11b664eb364407380933db3aade963696b316b7306d1fa"} Mar 08 21:02:15 crc kubenswrapper[4885]: I0308 21:02:15.079852 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lkccb" event={"ID":"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46","Type":"ContainerStarted","Data":"c6a06d00faa7055232099d503297429068f1cc547212244ab7619732dc98b59d"} Mar 08 21:02:15 crc kubenswrapper[4885]: I0308 21:02:15.103256 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-lkccb" podStartSLOduration=2.103235845 podStartE2EDuration="2.103235845s" podCreationTimestamp="2026-03-08 21:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:15.098095267 +0000 UTC m=+5436.494149320" watchObservedRunningTime="2026-03-08 21:02:15.103235845 +0000 UTC m=+5436.499289868" Mar 08 21:02:15 crc kubenswrapper[4885]: I0308 21:02:15.369187 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:02:15 crc kubenswrapper[4885]: E0308 21:02:15.370693 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:02:17 crc kubenswrapper[4885]: I0308 21:02:17.105768 4885 generic.go:334] "Generic (PLEG): container finished" podID="ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" containerID="0976abd329fd9ab97b11b664eb364407380933db3aade963696b316b7306d1fa" exitCode=0 Mar 08 21:02:17 crc kubenswrapper[4885]: I0308 21:02:17.105981 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lkccb" event={"ID":"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46","Type":"ContainerDied","Data":"0976abd329fd9ab97b11b664eb364407380933db3aade963696b316b7306d1fa"} Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.545695 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.697671 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mzn5\" (UniqueName: \"kubernetes.io/projected/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-kube-api-access-6mzn5\") pod \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.698122 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-combined-ca-bundle\") pod \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.698200 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-db-sync-config-data\") pod \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\" (UID: \"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46\") " Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.705113 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-kube-api-access-6mzn5" (OuterVolumeSpecName: "kube-api-access-6mzn5") pod "ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" (UID: "ca4bc30a-842b-45d6-8eb4-c964cbdd2c46"). InnerVolumeSpecName "kube-api-access-6mzn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.705863 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" (UID: "ca4bc30a-842b-45d6-8eb4-c964cbdd2c46"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.747120 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" (UID: "ca4bc30a-842b-45d6-8eb4-c964cbdd2c46"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.800383 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mzn5\" (UniqueName: \"kubernetes.io/projected/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-kube-api-access-6mzn5\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.800433 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:18 crc kubenswrapper[4885]: I0308 21:02:18.800505 4885 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.133497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lkccb" event={"ID":"ca4bc30a-842b-45d6-8eb4-c964cbdd2c46","Type":"ContainerDied","Data":"c6a06d00faa7055232099d503297429068f1cc547212244ab7619732dc98b59d"} Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.133566 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6a06d00faa7055232099d503297429068f1cc547212244ab7619732dc98b59d" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.133613 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lkccb" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.448993 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-755df7d9d5-kl4vq"] Mar 08 21:02:19 crc kubenswrapper[4885]: E0308 21:02:19.449435 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" containerName="barbican-db-sync" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.449456 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" containerName="barbican-db-sync" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.449664 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" containerName="barbican-db-sync" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.450752 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.452595 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-bnc8w" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.455031 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.462537 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-c754cdbdb-h7rpz"] Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.465743 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.471216 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.476156 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.479049 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-755df7d9d5-kl4vq"] Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.507239 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c754cdbdb-h7rpz"] Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.550341 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-df5964f4c-t7mvc"] Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.551505 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.578226 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-df5964f4c-t7mvc"] Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.621782 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8bdb095-595a-458e-870f-41fea2999d18-logs\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.621833 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-config-data\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.621889 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-combined-ca-bundle\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.621950 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-config-data-custom\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.621977 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zqjc\" (UniqueName: \"kubernetes.io/projected/a8bdb095-595a-458e-870f-41fea2999d18-kube-api-access-5zqjc\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.622000 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-combined-ca-bundle\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.622019 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc4cm\" (UniqueName: \"kubernetes.io/projected/b7b24c26-4c9a-4442-a124-a66987404ec8-kube-api-access-wc4cm\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.622046 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-config-data-custom\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.622071 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-config-data\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.622097 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7b24c26-4c9a-4442-a124-a66987404ec8-logs\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.643614 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-8469b78fd4-9xh8z"] Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.648819 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.651283 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.659357 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8469b78fd4-9xh8z"] Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.723827 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-combined-ca-bundle\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724240 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-sb\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724273 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-nb\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724307 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-config-data-custom\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724336 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zqjc\" (UniqueName: \"kubernetes.io/projected/a8bdb095-595a-458e-870f-41fea2999d18-kube-api-access-5zqjc\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724357 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-kube-api-access-xhppn\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724375 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-combined-ca-bundle\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724396 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc4cm\" (UniqueName: 
\"kubernetes.io/projected/b7b24c26-4c9a-4442-a124-a66987404ec8-kube-api-access-wc4cm\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724426 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-config-data-custom\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724476 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-config-data\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724506 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7b24c26-4c9a-4442-a124-a66987404ec8-logs\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724545 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-config\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724569 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-dns-svc\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724596 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8bdb095-595a-458e-870f-41fea2999d18-logs\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.724620 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-config-data\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.725745 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8bdb095-595a-458e-870f-41fea2999d18-logs\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.725832 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b7b24c26-4c9a-4442-a124-a66987404ec8-logs\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.730694 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-config-data-custom\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.734917 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-config-data-custom\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.740696 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-combined-ca-bundle\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.741363 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-combined-ca-bundle\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.742971 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7b24c26-4c9a-4442-a124-a66987404ec8-config-data\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.743153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8bdb095-595a-458e-870f-41fea2999d18-config-data\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.743642 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc4cm\" (UniqueName: \"kubernetes.io/projected/b7b24c26-4c9a-4442-a124-a66987404ec8-kube-api-access-wc4cm\") pod \"barbican-worker-755df7d9d5-kl4vq\" (UID: \"b7b24c26-4c9a-4442-a124-a66987404ec8\") " pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.747568 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zqjc\" (UniqueName: \"kubernetes.io/projected/a8bdb095-595a-458e-870f-41fea2999d18-kube-api-access-5zqjc\") pod \"barbican-keystone-listener-c754cdbdb-h7rpz\" (UID: \"a8bdb095-595a-458e-870f-41fea2999d18\") " pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.777195 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-755df7d9d5-kl4vq" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.801565 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827044 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59vk9\" (UniqueName: \"kubernetes.io/projected/ebfd95bc-213c-417c-8dd5-b66637bd98e9-kube-api-access-59vk9\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827129 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebfd95bc-213c-417c-8dd5-b66637bd98e9-logs\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827191 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-config\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827233 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-dns-svc\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827330 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-config-data-custom\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827370 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-sb\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827414 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-nb\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827454 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-config-data\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827503 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-kube-api-access-xhppn\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.827574 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-combined-ca-bundle\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.830098 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-nb\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.830268 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-dns-svc\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.836030 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-sb\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.836346 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-config\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.845573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-kube-api-access-xhppn\") pod \"dnsmasq-dns-df5964f4c-t7mvc\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.874008 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.928652 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59vk9\" (UniqueName: \"kubernetes.io/projected/ebfd95bc-213c-417c-8dd5-b66637bd98e9-kube-api-access-59vk9\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.929100 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebfd95bc-213c-417c-8dd5-b66637bd98e9-logs\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.929172 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-config-data-custom\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.929202 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-config-data\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.929241 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-combined-ca-bundle\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.930696 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebfd95bc-213c-417c-8dd5-b66637bd98e9-logs\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.937011 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-combined-ca-bundle\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.938016 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-config-data\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.941681 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ebfd95bc-213c-417c-8dd5-b66637bd98e9-config-data-custom\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc 
kubenswrapper[4885]: I0308 21:02:19.957467 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59vk9\" (UniqueName: \"kubernetes.io/projected/ebfd95bc-213c-417c-8dd5-b66637bd98e9-kube-api-access-59vk9\") pod \"barbican-api-8469b78fd4-9xh8z\" (UID: \"ebfd95bc-213c-417c-8dd5-b66637bd98e9\") " pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:19 crc kubenswrapper[4885]: I0308 21:02:19.978502 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:20 crc kubenswrapper[4885]: I0308 21:02:20.289321 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-755df7d9d5-kl4vq"] Mar 08 21:02:20 crc kubenswrapper[4885]: I0308 21:02:20.302516 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-c754cdbdb-h7rpz"] Mar 08 21:02:20 crc kubenswrapper[4885]: W0308 21:02:20.303669 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebfd95bc_213c_417c_8dd5_b66637bd98e9.slice/crio-47d0123acd29cc322bfd28e25306def81e683b35e8949a484cfb307c0c385e88 WatchSource:0}: Error finding container 47d0123acd29cc322bfd28e25306def81e683b35e8949a484cfb307c0c385e88: Status 404 returned error can't find the container with id 47d0123acd29cc322bfd28e25306def81e683b35e8949a484cfb307c0c385e88 Mar 08 21:02:20 crc kubenswrapper[4885]: I0308 21:02:20.311357 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-8469b78fd4-9xh8z"] Mar 08 21:02:20 crc kubenswrapper[4885]: W0308 21:02:20.435210 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b9b5286_1ced_4445_964d_2ec8fc6a17a4.slice/crio-926b79f9df7ed6725ef54961ffc104745dc49799aef4900236f92faa1797b387 WatchSource:0}: Error finding container 926b79f9df7ed6725ef54961ffc104745dc49799aef4900236f92faa1797b387: Status 404 returned error can't find the container with id 926b79f9df7ed6725ef54961ffc104745dc49799aef4900236f92faa1797b387 Mar 08 21:02:20 crc kubenswrapper[4885]: I0308 21:02:20.442423 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-df5964f4c-t7mvc"] Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.627328 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8469b78fd4-9xh8z" event={"ID":"ebfd95bc-213c-417c-8dd5-b66637bd98e9","Type":"ContainerStarted","Data":"4f39bd047d3fe77ffdac047ba95989e94eead27192344f44db80bb4a53e9d268"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.627851 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8469b78fd4-9xh8z" event={"ID":"ebfd95bc-213c-417c-8dd5-b66637bd98e9","Type":"ContainerStarted","Data":"72c7e26c6898ff8d593f798cd90453e0d384a2a77d538a79ee7908646c927680"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.627862 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-8469b78fd4-9xh8z" event={"ID":"ebfd95bc-213c-417c-8dd5-b66637bd98e9","Type":"ContainerStarted","Data":"47d0123acd29cc322bfd28e25306def81e683b35e8949a484cfb307c0c385e88"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.629027 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.629051 4885 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.650631 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" event={"ID":"a8bdb095-595a-458e-870f-41fea2999d18","Type":"ContainerStarted","Data":"2ad3124b0caf02798cc751c8df70dda6dec35a8c8734b02d64f94638250471d6"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.650689 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" event={"ID":"a8bdb095-595a-458e-870f-41fea2999d18","Type":"ContainerStarted","Data":"7a62561ad7c35cf5a96cff3d24814864ea4a271a0749070d1dd9157fcb795d18"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.650703 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" event={"ID":"a8bdb095-595a-458e-870f-41fea2999d18","Type":"ContainerStarted","Data":"52e0a15f8359f92d82098188b2c46cadd052b25468209f9fb6556966e8f26965"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.671898 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-755df7d9d5-kl4vq" event={"ID":"b7b24c26-4c9a-4442-a124-a66987404ec8","Type":"ContainerStarted","Data":"47f2a26668a9b3bc5bfd6a84075276d605a5652cb0215d6064a55085e547dbdf"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.672035 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-755df7d9d5-kl4vq" event={"ID":"b7b24c26-4c9a-4442-a124-a66987404ec8","Type":"ContainerStarted","Data":"0b350cd53edac40273754109de42b3373bc1cf10069ee5e6f8919617e766522b"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.672053 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-755df7d9d5-kl4vq" event={"ID":"b7b24c26-4c9a-4442-a124-a66987404ec8","Type":"ContainerStarted","Data":"a0f3d48b601ce1645c913dfe43e92db55ad2b11eed1e29579690130f9b487a52"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.675528 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-8469b78fd4-9xh8z" podStartSLOduration=2.6754955049999998 podStartE2EDuration="2.675495505s" podCreationTimestamp="2026-03-08 21:02:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:21.663495664 +0000 UTC m=+5443.059549727" watchObservedRunningTime="2026-03-08 21:02:21.675495505 +0000 UTC m=+5443.071549558" Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.681396 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" event={"ID":"3b9b5286-1ced-4445-964d-2ec8fc6a17a4","Type":"ContainerStarted","Data":"f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.681439 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" event={"ID":"3b9b5286-1ced-4445-964d-2ec8fc6a17a4","Type":"ContainerStarted","Data":"926b79f9df7ed6725ef54961ffc104745dc49799aef4900236f92faa1797b387"} Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.713773 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-c754cdbdb-h7rpz" podStartSLOduration=2.713743054 podStartE2EDuration="2.713743054s" podCreationTimestamp="2026-03-08 21:02:19 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:21.704429315 +0000 UTC m=+5443.100483348" watchObservedRunningTime="2026-03-08 21:02:21.713743054 +0000 UTC m=+5443.109797087" Mar 08 21:02:21 crc kubenswrapper[4885]: I0308 21:02:21.736837 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-755df7d9d5-kl4vq" podStartSLOduration=2.73681872 podStartE2EDuration="2.73681872s" podCreationTimestamp="2026-03-08 21:02:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:21.734696223 +0000 UTC m=+5443.130750266" watchObservedRunningTime="2026-03-08 21:02:21.73681872 +0000 UTC m=+5443.132872743" Mar 08 21:02:22 crc kubenswrapper[4885]: I0308 21:02:22.692386 4885 generic.go:334] "Generic (PLEG): container finished" podID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerID="f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729" exitCode=0 Mar 08 21:02:22 crc kubenswrapper[4885]: I0308 21:02:22.694578 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" event={"ID":"3b9b5286-1ced-4445-964d-2ec8fc6a17a4","Type":"ContainerDied","Data":"f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729"} Mar 08 21:02:22 crc kubenswrapper[4885]: I0308 21:02:22.695046 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" event={"ID":"3b9b5286-1ced-4445-964d-2ec8fc6a17a4","Type":"ContainerStarted","Data":"ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a"} Mar 08 21:02:22 crc kubenswrapper[4885]: I0308 21:02:22.695770 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:22 crc kubenswrapper[4885]: I0308 21:02:22.719654 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" podStartSLOduration=3.719635277 podStartE2EDuration="3.719635277s" podCreationTimestamp="2026-03-08 21:02:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:22.718157357 +0000 UTC m=+5444.114211390" watchObservedRunningTime="2026-03-08 21:02:22.719635277 +0000 UTC m=+5444.115689320" Mar 08 21:02:29 crc kubenswrapper[4885]: I0308 21:02:29.876417 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:02:29 crc kubenswrapper[4885]: I0308 21:02:29.992165 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c5b744dc-fp6n9"] Mar 08 21:02:29 crc kubenswrapper[4885]: I0308 21:02:29.992465 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" podUID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerName="dnsmasq-dns" containerID="cri-o://c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0" gracePeriod=10 Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.368027 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:02:30 crc kubenswrapper[4885]: E0308 21:02:30.368594 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.420719 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.560264 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-config\") pod \"c4efef13-d123-4300-b581-2a9a52de6d1b\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.560335 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-nb\") pod \"c4efef13-d123-4300-b581-2a9a52de6d1b\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.560371 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-sb\") pod \"c4efef13-d123-4300-b581-2a9a52de6d1b\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.560413 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-dns-svc\") pod \"c4efef13-d123-4300-b581-2a9a52de6d1b\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.560519 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2d7q\" (UniqueName: \"kubernetes.io/projected/c4efef13-d123-4300-b581-2a9a52de6d1b-kube-api-access-g2d7q\") pod \"c4efef13-d123-4300-b581-2a9a52de6d1b\" (UID: \"c4efef13-d123-4300-b581-2a9a52de6d1b\") " Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.568290 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4efef13-d123-4300-b581-2a9a52de6d1b-kube-api-access-g2d7q" (OuterVolumeSpecName: "kube-api-access-g2d7q") pod "c4efef13-d123-4300-b581-2a9a52de6d1b" (UID: "c4efef13-d123-4300-b581-2a9a52de6d1b"). InnerVolumeSpecName "kube-api-access-g2d7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.632817 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-config" (OuterVolumeSpecName: "config") pod "c4efef13-d123-4300-b581-2a9a52de6d1b" (UID: "c4efef13-d123-4300-b581-2a9a52de6d1b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.633480 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4efef13-d123-4300-b581-2a9a52de6d1b" (UID: "c4efef13-d123-4300-b581-2a9a52de6d1b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.635150 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c4efef13-d123-4300-b581-2a9a52de6d1b" (UID: "c4efef13-d123-4300-b581-2a9a52de6d1b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.639385 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c4efef13-d123-4300-b581-2a9a52de6d1b" (UID: "c4efef13-d123-4300-b581-2a9a52de6d1b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.662031 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2d7q\" (UniqueName: \"kubernetes.io/projected/c4efef13-d123-4300-b581-2a9a52de6d1b-kube-api-access-g2d7q\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.662068 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.662078 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.662087 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.662096 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4efef13-d123-4300-b581-2a9a52de6d1b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.770034 4885 generic.go:334] "Generic (PLEG): container finished" podID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerID="c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0" exitCode=0 Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.770082 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" event={"ID":"c4efef13-d123-4300-b581-2a9a52de6d1b","Type":"ContainerDied","Data":"c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0"} Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.770109 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" event={"ID":"c4efef13-d123-4300-b581-2a9a52de6d1b","Type":"ContainerDied","Data":"951c5b88e4e526df079bda44e005142fe20c5e1f10cb2f30511bef46f3a2b1e7"} Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.770129 4885 scope.go:117] "RemoveContainer" containerID="c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.770264 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65c5b744dc-fp6n9" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.810128 4885 scope.go:117] "RemoveContainer" containerID="d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.831829 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65c5b744dc-fp6n9"] Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.846008 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65c5b744dc-fp6n9"] Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.848883 4885 scope.go:117] "RemoveContainer" containerID="c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0" Mar 08 21:02:30 crc kubenswrapper[4885]: E0308 21:02:30.849473 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0\": container with ID starting with c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0 not found: ID does not exist" containerID="c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.849509 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0"} err="failed to get container status \"c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0\": rpc error: code = NotFound desc = could not find container \"c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0\": container with ID starting with c10480b18f36900b9054498aa4f0911a75ada073c5a9934f27418480f9e6a6b0 not found: ID does not exist" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.849534 4885 scope.go:117] "RemoveContainer" containerID="d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9" Mar 08 21:02:30 crc kubenswrapper[4885]: E0308 21:02:30.850339 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9\": container with ID starting with d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9 not found: ID does not exist" containerID="d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9" Mar 08 21:02:30 crc kubenswrapper[4885]: I0308 21:02:30.850400 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9"} err="failed to get container status \"d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9\": rpc error: code = NotFound desc = could not find container \"d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9\": container with ID starting with d9c923a136e465ee1cd8f89ad30708884a31864c31b9040ae7e4c0842e7d82b9 not found: ID does not exist" Mar 08 21:02:31 crc kubenswrapper[4885]: I0308 21:02:31.333697 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:31 crc kubenswrapper[4885]: I0308 21:02:31.342673 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-8469b78fd4-9xh8z" Mar 08 21:02:31 crc kubenswrapper[4885]: I0308 21:02:31.383211 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c4efef13-d123-4300-b581-2a9a52de6d1b" path="/var/lib/kubelet/pods/c4efef13-d123-4300-b581-2a9a52de6d1b/volumes" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.368081 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:02:43 crc kubenswrapper[4885]: E0308 21:02:43.370239 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.693537 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-6ln6s"] Mar 08 21:02:43 crc kubenswrapper[4885]: E0308 21:02:43.693863 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerName="init" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.693880 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerName="init" Mar 08 21:02:43 crc kubenswrapper[4885]: E0308 21:02:43.693903 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerName="dnsmasq-dns" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.693911 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerName="dnsmasq-dns" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.694097 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4efef13-d123-4300-b581-2a9a52de6d1b" containerName="dnsmasq-dns" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.694595 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.711766 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6ln6s"] Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.736856 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b046-account-create-update-7xtx9"] Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.737852 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.741910 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.760344 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b046-account-create-update-7xtx9"] Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.830889 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ts69b\" (UniqueName: \"kubernetes.io/projected/d6ea4544-00f0-4646-a598-1efa92af4e49-kube-api-access-ts69b\") pod \"neutron-b046-account-create-update-7xtx9\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.830982 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ea4544-00f0-4646-a598-1efa92af4e49-operator-scripts\") pod \"neutron-b046-account-create-update-7xtx9\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.831010 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfw2z\" (UniqueName: \"kubernetes.io/projected/0e0914c1-cac8-4c2d-bbe4-615218170f10-kube-api-access-xfw2z\") pod \"neutron-db-create-6ln6s\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.831269 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e0914c1-cac8-4c2d-bbe4-615218170f10-operator-scripts\") pod \"neutron-db-create-6ln6s\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.933090 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e0914c1-cac8-4c2d-bbe4-615218170f10-operator-scripts\") pod \"neutron-db-create-6ln6s\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.933214 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ts69b\" (UniqueName: \"kubernetes.io/projected/d6ea4544-00f0-4646-a598-1efa92af4e49-kube-api-access-ts69b\") pod \"neutron-b046-account-create-update-7xtx9\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.933285 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ea4544-00f0-4646-a598-1efa92af4e49-operator-scripts\") pod \"neutron-b046-account-create-update-7xtx9\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.933312 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfw2z\" (UniqueName: 
\"kubernetes.io/projected/0e0914c1-cac8-4c2d-bbe4-615218170f10-kube-api-access-xfw2z\") pod \"neutron-db-create-6ln6s\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.934007 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e0914c1-cac8-4c2d-bbe4-615218170f10-operator-scripts\") pod \"neutron-db-create-6ln6s\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.934483 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ea4544-00f0-4646-a598-1efa92af4e49-operator-scripts\") pod \"neutron-b046-account-create-update-7xtx9\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.964089 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ts69b\" (UniqueName: \"kubernetes.io/projected/d6ea4544-00f0-4646-a598-1efa92af4e49-kube-api-access-ts69b\") pod \"neutron-b046-account-create-update-7xtx9\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:43 crc kubenswrapper[4885]: I0308 21:02:43.971742 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfw2z\" (UniqueName: \"kubernetes.io/projected/0e0914c1-cac8-4c2d-bbe4-615218170f10-kube-api-access-xfw2z\") pod \"neutron-db-create-6ln6s\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.027297 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.060066 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.528986 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-6ln6s"] Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.607026 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b046-account-create-update-7xtx9"] Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.915789 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b046-account-create-update-7xtx9" event={"ID":"d6ea4544-00f0-4646-a598-1efa92af4e49","Type":"ContainerStarted","Data":"e6faf1d3d8e8aff220f85b274a14bd4ce7db4420e55b8b8af43285742e4b286e"} Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.915854 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b046-account-create-update-7xtx9" event={"ID":"d6ea4544-00f0-4646-a598-1efa92af4e49","Type":"ContainerStarted","Data":"d6e4ebace88ecbfd56312b9d0ebe995dd277ce5ebc8f4ec907b7675479aff650"} Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.918188 4885 generic.go:334] "Generic (PLEG): container finished" podID="0e0914c1-cac8-4c2d-bbe4-615218170f10" containerID="71bb049c2d9773b9c9e48cbd2812e843fb5d6ca86b0d975e407dfe49238257fc" exitCode=0 Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.918265 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6ln6s" event={"ID":"0e0914c1-cac8-4c2d-bbe4-615218170f10","Type":"ContainerDied","Data":"71bb049c2d9773b9c9e48cbd2812e843fb5d6ca86b0d975e407dfe49238257fc"} Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.918312 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6ln6s" event={"ID":"0e0914c1-cac8-4c2d-bbe4-615218170f10","Type":"ContainerStarted","Data":"184ba71716d6bf4cb475dd9dfd1e7900c1abcf7ef45ea292a51b5046dcd43f26"} Mar 08 21:02:44 crc kubenswrapper[4885]: I0308 21:02:44.939575 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-b046-account-create-update-7xtx9" podStartSLOduration=1.939541539 podStartE2EDuration="1.939541539s" podCreationTimestamp="2026-03-08 21:02:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:44.929512532 +0000 UTC m=+5466.325566585" watchObservedRunningTime="2026-03-08 21:02:44.939541539 +0000 UTC m=+5466.335595602" Mar 08 21:02:45 crc kubenswrapper[4885]: I0308 21:02:45.932991 4885 generic.go:334] "Generic (PLEG): container finished" podID="d6ea4544-00f0-4646-a598-1efa92af4e49" containerID="e6faf1d3d8e8aff220f85b274a14bd4ce7db4420e55b8b8af43285742e4b286e" exitCode=0 Mar 08 21:02:45 crc kubenswrapper[4885]: I0308 21:02:45.934100 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b046-account-create-update-7xtx9" event={"ID":"d6ea4544-00f0-4646-a598-1efa92af4e49","Type":"ContainerDied","Data":"e6faf1d3d8e8aff220f85b274a14bd4ce7db4420e55b8b8af43285742e4b286e"} Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.312395 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.388031 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e0914c1-cac8-4c2d-bbe4-615218170f10-operator-scripts\") pod \"0e0914c1-cac8-4c2d-bbe4-615218170f10\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.388105 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfw2z\" (UniqueName: \"kubernetes.io/projected/0e0914c1-cac8-4c2d-bbe4-615218170f10-kube-api-access-xfw2z\") pod \"0e0914c1-cac8-4c2d-bbe4-615218170f10\" (UID: \"0e0914c1-cac8-4c2d-bbe4-615218170f10\") " Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.389432 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e0914c1-cac8-4c2d-bbe4-615218170f10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e0914c1-cac8-4c2d-bbe4-615218170f10" (UID: "0e0914c1-cac8-4c2d-bbe4-615218170f10"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.394629 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e0914c1-cac8-4c2d-bbe4-615218170f10-kube-api-access-xfw2z" (OuterVolumeSpecName: "kube-api-access-xfw2z") pod "0e0914c1-cac8-4c2d-bbe4-615218170f10" (UID: "0e0914c1-cac8-4c2d-bbe4-615218170f10"). InnerVolumeSpecName "kube-api-access-xfw2z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.490208 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e0914c1-cac8-4c2d-bbe4-615218170f10-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.490998 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfw2z\" (UniqueName: \"kubernetes.io/projected/0e0914c1-cac8-4c2d-bbe4-615218170f10-kube-api-access-xfw2z\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.945068 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-6ln6s" Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.945080 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-6ln6s" event={"ID":"0e0914c1-cac8-4c2d-bbe4-615218170f10","Type":"ContainerDied","Data":"184ba71716d6bf4cb475dd9dfd1e7900c1abcf7ef45ea292a51b5046dcd43f26"} Mar 08 21:02:46 crc kubenswrapper[4885]: I0308 21:02:46.945205 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="184ba71716d6bf4cb475dd9dfd1e7900c1abcf7ef45ea292a51b5046dcd43f26" Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.387640 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.507738 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ea4544-00f0-4646-a598-1efa92af4e49-operator-scripts\") pod \"d6ea4544-00f0-4646-a598-1efa92af4e49\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.507900 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ts69b\" (UniqueName: \"kubernetes.io/projected/d6ea4544-00f0-4646-a598-1efa92af4e49-kube-api-access-ts69b\") pod \"d6ea4544-00f0-4646-a598-1efa92af4e49\" (UID: \"d6ea4544-00f0-4646-a598-1efa92af4e49\") " Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.509876 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6ea4544-00f0-4646-a598-1efa92af4e49-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6ea4544-00f0-4646-a598-1efa92af4e49" (UID: "d6ea4544-00f0-4646-a598-1efa92af4e49"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.513292 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6ea4544-00f0-4646-a598-1efa92af4e49-kube-api-access-ts69b" (OuterVolumeSpecName: "kube-api-access-ts69b") pod "d6ea4544-00f0-4646-a598-1efa92af4e49" (UID: "d6ea4544-00f0-4646-a598-1efa92af4e49"). InnerVolumeSpecName "kube-api-access-ts69b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.610665 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6ea4544-00f0-4646-a598-1efa92af4e49-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.610703 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ts69b\" (UniqueName: \"kubernetes.io/projected/d6ea4544-00f0-4646-a598-1efa92af4e49-kube-api-access-ts69b\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.981832 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b046-account-create-update-7xtx9" event={"ID":"d6ea4544-00f0-4646-a598-1efa92af4e49","Type":"ContainerDied","Data":"d6e4ebace88ecbfd56312b9d0ebe995dd277ce5ebc8f4ec907b7675479aff650"} Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.981897 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6e4ebace88ecbfd56312b9d0ebe995dd277ce5ebc8f4ec907b7675479aff650" Mar 08 21:02:47 crc kubenswrapper[4885]: I0308 21:02:47.982044 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b046-account-create-update-7xtx9" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.066944 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-th6xc"] Mar 08 21:02:49 crc kubenswrapper[4885]: E0308 21:02:49.067456 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6ea4544-00f0-4646-a598-1efa92af4e49" containerName="mariadb-account-create-update" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.067478 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6ea4544-00f0-4646-a598-1efa92af4e49" containerName="mariadb-account-create-update" Mar 08 21:02:49 crc kubenswrapper[4885]: E0308 21:02:49.067500 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e0914c1-cac8-4c2d-bbe4-615218170f10" containerName="mariadb-database-create" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.067510 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e0914c1-cac8-4c2d-bbe4-615218170f10" containerName="mariadb-database-create" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.067820 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e0914c1-cac8-4c2d-bbe4-615218170f10" containerName="mariadb-database-create" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.067865 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6ea4544-00f0-4646-a598-1efa92af4e49" containerName="mariadb-account-create-update" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.068676 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.070571 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.071557 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.071880 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-894h5" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.089447 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-th6xc"] Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.142077 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-config\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.142141 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-combined-ca-bundle\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.142313 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r925\" (UniqueName: \"kubernetes.io/projected/faeab210-5195-4d9a-a17e-5aed2f14dc68-kube-api-access-4r925\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc 
kubenswrapper[4885]: I0308 21:02:49.244569 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-combined-ca-bundle\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.244660 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r925\" (UniqueName: \"kubernetes.io/projected/faeab210-5195-4d9a-a17e-5aed2f14dc68-kube-api-access-4r925\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.244790 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-config\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.250950 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-config\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.256925 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-combined-ca-bundle\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.264998 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r925\" (UniqueName: \"kubernetes.io/projected/faeab210-5195-4d9a-a17e-5aed2f14dc68-kube-api-access-4r925\") pod \"neutron-db-sync-th6xc\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.395958 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:49 crc kubenswrapper[4885]: I0308 21:02:49.869181 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-th6xc"] Mar 08 21:02:50 crc kubenswrapper[4885]: I0308 21:02:50.007098 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-th6xc" event={"ID":"faeab210-5195-4d9a-a17e-5aed2f14dc68","Type":"ContainerStarted","Data":"105d64f67e44ac823e455dbc7ab3d14e07ef163def457f807561420bfad8e641"} Mar 08 21:02:51 crc kubenswrapper[4885]: I0308 21:02:51.024452 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-th6xc" event={"ID":"faeab210-5195-4d9a-a17e-5aed2f14dc68","Type":"ContainerStarted","Data":"bbb04e31c22cb5e6b28c152a2e21dcf4858fe3b83e03435366a3a4cecdd397ef"} Mar 08 21:02:51 crc kubenswrapper[4885]: I0308 21:02:51.056995 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-th6xc" podStartSLOduration=2.056976089 podStartE2EDuration="2.056976089s" podCreationTimestamp="2026-03-08 21:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:51.05359913 +0000 UTC m=+5472.449653213" watchObservedRunningTime="2026-03-08 21:02:51.056976089 +0000 UTC m=+5472.453030122" Mar 08 21:02:54 crc kubenswrapper[4885]: I0308 21:02:54.058332 4885 generic.go:334] "Generic (PLEG): container finished" podID="faeab210-5195-4d9a-a17e-5aed2f14dc68" containerID="bbb04e31c22cb5e6b28c152a2e21dcf4858fe3b83e03435366a3a4cecdd397ef" exitCode=0 Mar 08 21:02:54 crc kubenswrapper[4885]: I0308 21:02:54.058402 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-th6xc" event={"ID":"faeab210-5195-4d9a-a17e-5aed2f14dc68","Type":"ContainerDied","Data":"bbb04e31c22cb5e6b28c152a2e21dcf4858fe3b83e03435366a3a4cecdd397ef"} Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.511046 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.690640 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-combined-ca-bundle\") pod \"faeab210-5195-4d9a-a17e-5aed2f14dc68\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.690849 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r925\" (UniqueName: \"kubernetes.io/projected/faeab210-5195-4d9a-a17e-5aed2f14dc68-kube-api-access-4r925\") pod \"faeab210-5195-4d9a-a17e-5aed2f14dc68\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.691023 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-config\") pod \"faeab210-5195-4d9a-a17e-5aed2f14dc68\" (UID: \"faeab210-5195-4d9a-a17e-5aed2f14dc68\") " Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.704272 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faeab210-5195-4d9a-a17e-5aed2f14dc68-kube-api-access-4r925" (OuterVolumeSpecName: "kube-api-access-4r925") pod "faeab210-5195-4d9a-a17e-5aed2f14dc68" (UID: "faeab210-5195-4d9a-a17e-5aed2f14dc68"). InnerVolumeSpecName "kube-api-access-4r925". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.730248 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faeab210-5195-4d9a-a17e-5aed2f14dc68" (UID: "faeab210-5195-4d9a-a17e-5aed2f14dc68"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.731064 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-config" (OuterVolumeSpecName: "config") pod "faeab210-5195-4d9a-a17e-5aed2f14dc68" (UID: "faeab210-5195-4d9a-a17e-5aed2f14dc68"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.793098 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.793131 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r925\" (UniqueName: \"kubernetes.io/projected/faeab210-5195-4d9a-a17e-5aed2f14dc68-kube-api-access-4r925\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:55 crc kubenswrapper[4885]: I0308 21:02:55.793143 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/faeab210-5195-4d9a-a17e-5aed2f14dc68-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.099143 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-th6xc" event={"ID":"faeab210-5195-4d9a-a17e-5aed2f14dc68","Type":"ContainerDied","Data":"105d64f67e44ac823e455dbc7ab3d14e07ef163def457f807561420bfad8e641"} Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.099224 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="105d64f67e44ac823e455dbc7ab3d14e07ef163def457f807561420bfad8e641" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.099306 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-th6xc" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.248762 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5757f7d959-bpxjw"] Mar 08 21:02:56 crc kubenswrapper[4885]: E0308 21:02:56.249197 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faeab210-5195-4d9a-a17e-5aed2f14dc68" containerName="neutron-db-sync" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.249222 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="faeab210-5195-4d9a-a17e-5aed2f14dc68" containerName="neutron-db-sync" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.249450 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="faeab210-5195-4d9a-a17e-5aed2f14dc68" containerName="neutron-db-sync" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.250468 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.258645 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5757f7d959-bpxjw"] Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.368146 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:02:56 crc kubenswrapper[4885]: E0308 21:02:56.368412 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.412165 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-nb\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.412257 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvkns\" (UniqueName: \"kubernetes.io/projected/25b65d53-5174-48c6-a687-f32c1e685bd4-kube-api-access-vvkns\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.412330 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-dns-svc\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.412512 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-config\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.412598 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-sb\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.470344 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67c4b97569-rrjw7"] Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.471596 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67c4b97569-rrjw7" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.473346 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.473475 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.473762 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-894h5" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.488876 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67c4b97569-rrjw7"] Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.514479 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-nb\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.514588 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvkns\" (UniqueName: \"kubernetes.io/projected/25b65d53-5174-48c6-a687-f32c1e685bd4-kube-api-access-vvkns\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.514616 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-dns-svc\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.514709 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-config\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.514734 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-sb\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.515335 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-nb\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.515477 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-sb\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.515848 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-dns-svc\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.516388 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-config\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.544313 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvkns\" (UniqueName: \"kubernetes.io/projected/25b65d53-5174-48c6-a687-f32c1e685bd4-kube-api-access-vvkns\") pod \"dnsmasq-dns-5757f7d959-bpxjw\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.571270 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.616538 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtbck\" (UniqueName: \"kubernetes.io/projected/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-kube-api-access-rtbck\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.616859 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-combined-ca-bundle\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.616923 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-httpd-config\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.617006 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-config\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.718557 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtbck\" (UniqueName: \"kubernetes.io/projected/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-kube-api-access-rtbck\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.718893 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-combined-ca-bundle\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7" Mar 
08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.718972 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-httpd-config\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.719034 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-config\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.728928 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-httpd-config\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.730329 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-config\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.751588 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-combined-ca-bundle\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.758591 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtbck\" (UniqueName: \"kubernetes.io/projected/68cdeb73-eb92-4d18-8a9f-a5e3a0a53900-kube-api-access-rtbck\") pod \"neutron-67c4b97569-rrjw7\" (UID: \"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900\") " pod="openstack/neutron-67c4b97569-rrjw7" Mar 08 21:02:56 crc kubenswrapper[4885]: I0308 21:02:56.792654 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67c4b97569-rrjw7" Mar 08 21:02:57 crc kubenswrapper[4885]: I0308 21:02:57.055289 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5757f7d959-bpxjw"] Mar 08 21:02:57 crc kubenswrapper[4885]: I0308 21:02:57.107913 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" event={"ID":"25b65d53-5174-48c6-a687-f32c1e685bd4","Type":"ContainerStarted","Data":"68e32acd4bade8c213ec06817b3f3e8059d848ef922c170c62e2bbc8d3910483"} Mar 08 21:02:57 crc kubenswrapper[4885]: I0308 21:02:57.447288 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67c4b97569-rrjw7"] Mar 08 21:02:57 crc kubenswrapper[4885]: W0308 21:02:57.451405 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68cdeb73_eb92_4d18_8a9f_a5e3a0a53900.slice/crio-447ca89b2f74ea73c5fcd7037926d0556764f0ee9336edf62c3e5af0b9229cdf WatchSource:0}: Error finding container 447ca89b2f74ea73c5fcd7037926d0556764f0ee9336edf62c3e5af0b9229cdf: Status 404 returned error can't find the container with id 447ca89b2f74ea73c5fcd7037926d0556764f0ee9336edf62c3e5af0b9229cdf Mar 08 21:02:58 crc kubenswrapper[4885]: I0308 21:02:58.139683 4885 generic.go:334] "Generic (PLEG): container finished" podID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerID="22e2cca09cd12eb836eb1e30ec93ff3eca0cfdcebc523a4eeae36a7ba702ee56" exitCode=0 Mar 08 21:02:58 crc kubenswrapper[4885]: I0308 21:02:58.139769 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" event={"ID":"25b65d53-5174-48c6-a687-f32c1e685bd4","Type":"ContainerDied","Data":"22e2cca09cd12eb836eb1e30ec93ff3eca0cfdcebc523a4eeae36a7ba702ee56"} Mar 08 21:02:58 crc kubenswrapper[4885]: I0308 21:02:58.143384 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c4b97569-rrjw7" event={"ID":"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900","Type":"ContainerStarted","Data":"8a20111a453a7ae4487d2f4c906c192d92d9594ce01edaf2227dc794150134f6"} Mar 08 21:02:58 crc kubenswrapper[4885]: I0308 21:02:58.143419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c4b97569-rrjw7" event={"ID":"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900","Type":"ContainerStarted","Data":"d965a4c3078e1bbdfe2f1bf34f64c1819b3f6cc7a0fb71576d2de428ca874ef1"} Mar 08 21:02:58 crc kubenswrapper[4885]: I0308 21:02:58.143429 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c4b97569-rrjw7" event={"ID":"68cdeb73-eb92-4d18-8a9f-a5e3a0a53900","Type":"ContainerStarted","Data":"447ca89b2f74ea73c5fcd7037926d0556764f0ee9336edf62c3e5af0b9229cdf"} Mar 08 21:02:58 crc kubenswrapper[4885]: I0308 21:02:58.143824 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67c4b97569-rrjw7" Mar 08 21:02:58 crc kubenswrapper[4885]: I0308 21:02:58.192212 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67c4b97569-rrjw7" podStartSLOduration=2.192188071 podStartE2EDuration="2.192188071s" podCreationTimestamp="2026-03-08 21:02:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:58.185974104 +0000 UTC m=+5479.582028127" watchObservedRunningTime="2026-03-08 21:02:58.192188071 +0000 UTC m=+5479.588242094" Mar 08 21:02:59 crc kubenswrapper[4885]: I0308 21:02:59.157021 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" event={"ID":"25b65d53-5174-48c6-a687-f32c1e685bd4","Type":"ContainerStarted","Data":"a8688238cdfd1dcfb7637a786ad101466774be95a20bd05d6000a2fb72881a5c"} Mar 08 21:02:59 crc kubenswrapper[4885]: I0308 21:02:59.157255 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:02:59 crc kubenswrapper[4885]: I0308 21:02:59.197751 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" podStartSLOduration=3.197723683 podStartE2EDuration="3.197723683s" podCreationTimestamp="2026-03-08 21:02:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:02:59.188259891 +0000 UTC m=+5480.584313964" watchObservedRunningTime="2026-03-08 21:02:59.197723683 +0000 UTC m=+5480.593777746" Mar 08 21:03:04 crc kubenswrapper[4885]: I0308 21:03:04.905502 4885 scope.go:117] "RemoveContainer" containerID="06ba2614b2073ae88c1afd46a3629242eb9b0dfca6cc39c42c6f2b45e68e1af1" Mar 08 21:03:06 crc kubenswrapper[4885]: I0308 21:03:06.573203 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:03:06 crc kubenswrapper[4885]: I0308 21:03:06.662831 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-df5964f4c-t7mvc"] Mar 08 21:03:06 crc kubenswrapper[4885]: I0308 21:03:06.663336 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" podUID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerName="dnsmasq-dns" containerID="cri-o://ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a" gracePeriod=10 Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.162308 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.238591 4885 generic.go:334] "Generic (PLEG): container finished" podID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerID="ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a" exitCode=0 Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.238648 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.238878 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" event={"ID":"3b9b5286-1ced-4445-964d-2ec8fc6a17a4","Type":"ContainerDied","Data":"ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a"} Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.239065 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-df5964f4c-t7mvc" event={"ID":"3b9b5286-1ced-4445-964d-2ec8fc6a17a4","Type":"ContainerDied","Data":"926b79f9df7ed6725ef54961ffc104745dc49799aef4900236f92faa1797b387"} Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.239142 4885 scope.go:117] "RemoveContainer" containerID="ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.265584 4885 scope.go:117] "RemoveContainer" containerID="f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.284766 4885 scope.go:117] "RemoveContainer" containerID="ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a" Mar 08 21:03:07 crc kubenswrapper[4885]: E0308 21:03:07.285388 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a\": container with ID starting with ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a not found: ID does not exist" containerID="ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.285460 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a"} err="failed to get container status \"ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a\": rpc error: code = NotFound desc = could not find container \"ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a\": container with ID starting with ab3d0b0d125b721dd0fdee12e87b946949109aaad28ff29060ad15c3bbf25f1a not found: ID does not exist" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.285504 4885 scope.go:117] "RemoveContainer" containerID="f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729" Mar 08 21:03:07 crc kubenswrapper[4885]: E0308 21:03:07.285899 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729\": container with ID starting with f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729 not found: ID does not exist" containerID="f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.285967 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729"} err="failed to get container status \"f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729\": rpc error: code = NotFound desc = could not find container \"f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729\": container with ID starting with f7a01966f222b5e2e79f68dbc0652966c84151c427a2809f4d727e7e4755e729 not found: ID does not exist" Mar 08 21:03:07 crc 
kubenswrapper[4885]: I0308 21:03:07.318834 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-config\") pod \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.318967 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-dns-svc\") pod \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.319037 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-kube-api-access-xhppn\") pod \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.319072 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-nb\") pod \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.319122 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-sb\") pod \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\" (UID: \"3b9b5286-1ced-4445-964d-2ec8fc6a17a4\") " Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.331147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-kube-api-access-xhppn" (OuterVolumeSpecName: "kube-api-access-xhppn") pod "3b9b5286-1ced-4445-964d-2ec8fc6a17a4" (UID: "3b9b5286-1ced-4445-964d-2ec8fc6a17a4"). InnerVolumeSpecName "kube-api-access-xhppn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.359255 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-config" (OuterVolumeSpecName: "config") pod "3b9b5286-1ced-4445-964d-2ec8fc6a17a4" (UID: "3b9b5286-1ced-4445-964d-2ec8fc6a17a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.366365 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3b9b5286-1ced-4445-964d-2ec8fc6a17a4" (UID: "3b9b5286-1ced-4445-964d-2ec8fc6a17a4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.372823 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3b9b5286-1ced-4445-964d-2ec8fc6a17a4" (UID: "3b9b5286-1ced-4445-964d-2ec8fc6a17a4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.373576 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b9b5286-1ced-4445-964d-2ec8fc6a17a4" (UID: "3b9b5286-1ced-4445-964d-2ec8fc6a17a4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.421351 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-kube-api-access-xhppn\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.421387 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.421396 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.421404 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.421413 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b9b5286-1ced-4445-964d-2ec8fc6a17a4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.574910 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-df5964f4c-t7mvc"] Mar 08 21:03:07 crc kubenswrapper[4885]: I0308 21:03:07.592563 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-df5964f4c-t7mvc"] Mar 08 21:03:08 crc kubenswrapper[4885]: I0308 21:03:08.368748 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:03:08 crc kubenswrapper[4885]: E0308 21:03:08.369355 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:03:09 crc kubenswrapper[4885]: I0308 21:03:09.376987 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" path="/var/lib/kubelet/pods/3b9b5286-1ced-4445-964d-2ec8fc6a17a4/volumes" Mar 08 21:03:20 crc kubenswrapper[4885]: I0308 21:03:20.368052 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:03:20 crc kubenswrapper[4885]: E0308 21:03:20.369223 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:03:26 crc kubenswrapper[4885]: I0308 21:03:26.803673 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67c4b97569-rrjw7" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.115582 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-plxkf"] Mar 08 21:03:34 crc kubenswrapper[4885]: E0308 21:03:34.116779 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerName="dnsmasq-dns" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.116796 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerName="dnsmasq-dns" Mar 08 21:03:34 crc kubenswrapper[4885]: E0308 21:03:34.116816 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerName="init" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.116821 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerName="init" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.117103 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b9b5286-1ced-4445-964d-2ec8fc6a17a4" containerName="dnsmasq-dns" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.117658 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-plxkf" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.178449 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-plxkf"] Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.210969 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-75f7-account-create-update-7q9gr"] Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.212399 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-75f7-account-create-update-7q9gr" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.215954 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.218306 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-75f7-account-create-update-7q9gr"] Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.311909 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j9dd\" (UniqueName: \"kubernetes.io/projected/e7ace920-d540-4598-82db-315caa467acb-kube-api-access-4j9dd\") pod \"glance-75f7-account-create-update-7q9gr\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " pod="openstack/glance-75f7-account-create-update-7q9gr" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.312055 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7ace920-d540-4598-82db-315caa467acb-operator-scripts\") pod \"glance-75f7-account-create-update-7q9gr\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " pod="openstack/glance-75f7-account-create-update-7q9gr" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.312114 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-operator-scripts\") pod \"glance-db-create-plxkf\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " pod="openstack/glance-db-create-plxkf" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.312148 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxsm4\" (UniqueName: \"kubernetes.io/projected/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-kube-api-access-vxsm4\") pod \"glance-db-create-plxkf\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " pod="openstack/glance-db-create-plxkf" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.413738 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7ace920-d540-4598-82db-315caa467acb-operator-scripts\") pod \"glance-75f7-account-create-update-7q9gr\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " pod="openstack/glance-75f7-account-create-update-7q9gr" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.413864 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-operator-scripts\") pod \"glance-db-create-plxkf\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " pod="openstack/glance-db-create-plxkf" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.413944 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxsm4\" (UniqueName: \"kubernetes.io/projected/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-kube-api-access-vxsm4\") pod \"glance-db-create-plxkf\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " pod="openstack/glance-db-create-plxkf" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.414047 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j9dd\" (UniqueName: 
\"kubernetes.io/projected/e7ace920-d540-4598-82db-315caa467acb-kube-api-access-4j9dd\") pod \"glance-75f7-account-create-update-7q9gr\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " pod="openstack/glance-75f7-account-create-update-7q9gr" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.414472 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7ace920-d540-4598-82db-315caa467acb-operator-scripts\") pod \"glance-75f7-account-create-update-7q9gr\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " pod="openstack/glance-75f7-account-create-update-7q9gr" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.414998 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-operator-scripts\") pod \"glance-db-create-plxkf\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " pod="openstack/glance-db-create-plxkf" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.431854 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j9dd\" (UniqueName: \"kubernetes.io/projected/e7ace920-d540-4598-82db-315caa467acb-kube-api-access-4j9dd\") pod \"glance-75f7-account-create-update-7q9gr\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " pod="openstack/glance-75f7-account-create-update-7q9gr" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.433727 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxsm4\" (UniqueName: \"kubernetes.io/projected/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-kube-api-access-vxsm4\") pod \"glance-db-create-plxkf\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " pod="openstack/glance-db-create-plxkf" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.434145 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-plxkf" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.535141 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-75f7-account-create-update-7q9gr" Mar 08 21:03:34 crc kubenswrapper[4885]: I0308 21:03:34.905430 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-plxkf"] Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.009017 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-75f7-account-create-update-7q9gr"] Mar 08 21:03:35 crc kubenswrapper[4885]: W0308 21:03:35.019320 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7ace920_d540_4598_82db_315caa467acb.slice/crio-09d3288c8e0f004e21e20573b8249cca34cc47b18806f4203426007ec43e5ff5 WatchSource:0}: Error finding container 09d3288c8e0f004e21e20573b8249cca34cc47b18806f4203426007ec43e5ff5: Status 404 returned error can't find the container with id 09d3288c8e0f004e21e20573b8249cca34cc47b18806f4203426007ec43e5ff5 Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.368800 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:03:35 crc kubenswrapper[4885]: E0308 21:03:35.369345 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.560030 4885 generic.go:334] "Generic (PLEG): container finished" podID="e7ace920-d540-4598-82db-315caa467acb" containerID="dff338c2a2d2f522bdec5e9f4d11ce93afde12127a49bc5918d50f6e48f1aa67" exitCode=0 Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.560119 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-75f7-account-create-update-7q9gr" event={"ID":"e7ace920-d540-4598-82db-315caa467acb","Type":"ContainerDied","Data":"dff338c2a2d2f522bdec5e9f4d11ce93afde12127a49bc5918d50f6e48f1aa67"} Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.560178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-75f7-account-create-update-7q9gr" event={"ID":"e7ace920-d540-4598-82db-315caa467acb","Type":"ContainerStarted","Data":"09d3288c8e0f004e21e20573b8249cca34cc47b18806f4203426007ec43e5ff5"} Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.562848 4885 generic.go:334] "Generic (PLEG): container finished" podID="af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d" containerID="a8dfd7b1b0ea895893398ba92cb9f076303594f9c38eca7bff04c272e28927af" exitCode=0 Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.562960 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-plxkf" event={"ID":"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d","Type":"ContainerDied","Data":"a8dfd7b1b0ea895893398ba92cb9f076303594f9c38eca7bff04c272e28927af"} Mar 08 21:03:35 crc kubenswrapper[4885]: I0308 21:03:35.563008 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-plxkf" event={"ID":"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d","Type":"ContainerStarted","Data":"99afc00164b4887db4df66ab7dbb63ab25d278af78e8cf13c894a4a9565021f8"} Mar 08 21:03:36 crc kubenswrapper[4885]: I0308 21:03:36.992344 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-75f7-account-create-update-7q9gr" Mar 08 21:03:36 crc kubenswrapper[4885]: I0308 21:03:36.997547 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-plxkf" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.171098 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-operator-scripts\") pod \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.171155 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxsm4\" (UniqueName: \"kubernetes.io/projected/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-kube-api-access-vxsm4\") pod \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\" (UID: \"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d\") " Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.171286 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j9dd\" (UniqueName: \"kubernetes.io/projected/e7ace920-d540-4598-82db-315caa467acb-kube-api-access-4j9dd\") pod \"e7ace920-d540-4598-82db-315caa467acb\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.171351 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7ace920-d540-4598-82db-315caa467acb-operator-scripts\") pod \"e7ace920-d540-4598-82db-315caa467acb\" (UID: \"e7ace920-d540-4598-82db-315caa467acb\") " Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.175344 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7ace920-d540-4598-82db-315caa467acb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7ace920-d540-4598-82db-315caa467acb" (UID: "e7ace920-d540-4598-82db-315caa467acb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.177514 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d" (UID: "af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.185843 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7ace920-d540-4598-82db-315caa467acb-kube-api-access-4j9dd" (OuterVolumeSpecName: "kube-api-access-4j9dd") pod "e7ace920-d540-4598-82db-315caa467acb" (UID: "e7ace920-d540-4598-82db-315caa467acb"). InnerVolumeSpecName "kube-api-access-4j9dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.190604 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-kube-api-access-vxsm4" (OuterVolumeSpecName: "kube-api-access-vxsm4") pod "af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d" (UID: "af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d"). InnerVolumeSpecName "kube-api-access-vxsm4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.273863 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7ace920-d540-4598-82db-315caa467acb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.274099 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.274165 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxsm4\" (UniqueName: \"kubernetes.io/projected/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d-kube-api-access-vxsm4\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.274258 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j9dd\" (UniqueName: \"kubernetes.io/projected/e7ace920-d540-4598-82db-315caa467acb-kube-api-access-4j9dd\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.581015 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-75f7-account-create-update-7q9gr" event={"ID":"e7ace920-d540-4598-82db-315caa467acb","Type":"ContainerDied","Data":"09d3288c8e0f004e21e20573b8249cca34cc47b18806f4203426007ec43e5ff5"} Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.581616 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09d3288c8e0f004e21e20573b8249cca34cc47b18806f4203426007ec43e5ff5" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.581031 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-75f7-account-create-update-7q9gr" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.583354 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-plxkf" event={"ID":"af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d","Type":"ContainerDied","Data":"99afc00164b4887db4df66ab7dbb63ab25d278af78e8cf13c894a4a9565021f8"} Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.583415 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99afc00164b4887db4df66ab7dbb63ab25d278af78e8cf13c894a4a9565021f8" Mar 08 21:03:37 crc kubenswrapper[4885]: I0308 21:03:37.583498 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-plxkf" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.440690 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-m792h"] Mar 08 21:03:39 crc kubenswrapper[4885]: E0308 21:03:39.441263 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7ace920-d540-4598-82db-315caa467acb" containerName="mariadb-account-create-update" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.441287 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7ace920-d540-4598-82db-315caa467acb" containerName="mariadb-account-create-update" Mar 08 21:03:39 crc kubenswrapper[4885]: E0308 21:03:39.441321 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d" containerName="mariadb-database-create" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.441336 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d" containerName="mariadb-database-create" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.441656 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7ace920-d540-4598-82db-315caa467acb" containerName="mariadb-account-create-update" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.441703 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d" containerName="mariadb-database-create" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.442657 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.445222 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.453870 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zrqqr" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.455265 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-m792h"] Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.524176 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-config-data\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.524493 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-db-sync-config-data\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.524547 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mngp9\" (UniqueName: \"kubernetes.io/projected/07fa5fd1-f6b9-4206-809d-c1f04533cab4-kube-api-access-mngp9\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.524611 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-combined-ca-bundle\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.627188 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-config-data\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.627275 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-db-sync-config-data\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.627351 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mngp9\" (UniqueName: \"kubernetes.io/projected/07fa5fd1-f6b9-4206-809d-c1f04533cab4-kube-api-access-mngp9\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.627452 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-combined-ca-bundle\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.633026 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-config-data\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.649657 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-combined-ca-bundle\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.650849 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-db-sync-config-data\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.656492 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mngp9\" (UniqueName: \"kubernetes.io/projected/07fa5fd1-f6b9-4206-809d-c1f04533cab4-kube-api-access-mngp9\") pod \"glance-db-sync-m792h\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " pod="openstack/glance-db-sync-m792h" Mar 08 21:03:39 crc kubenswrapper[4885]: I0308 21:03:39.769284 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-m792h" Mar 08 21:03:40 crc kubenswrapper[4885]: I0308 21:03:40.126243 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-m792h"] Mar 08 21:03:40 crc kubenswrapper[4885]: I0308 21:03:40.613530 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m792h" event={"ID":"07fa5fd1-f6b9-4206-809d-c1f04533cab4","Type":"ContainerStarted","Data":"fe5f1b7fdd56f4ad8279dbe7f7cd91d8f6dc677fcbcff32122312a7eeb244697"} Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.438129 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ljp6s"] Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.442715 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.453023 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ljp6s"] Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.561088 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-utilities\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.561167 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-catalog-content\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.561273 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwpp2\" (UniqueName: \"kubernetes.io/projected/0e84cb88-21d6-41d6-9352-b29d1953fa9f-kube-api-access-nwpp2\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.627401 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m792h" event={"ID":"07fa5fd1-f6b9-4206-809d-c1f04533cab4","Type":"ContainerStarted","Data":"238463b8258e15f4cb33c673abe4bc3d05f4cb5a4961563b7dbb833ea2602b95"} Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.663021 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-utilities\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.663094 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-catalog-content\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.663266 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-nwpp2\" (UniqueName: \"kubernetes.io/projected/0e84cb88-21d6-41d6-9352-b29d1953fa9f-kube-api-access-nwpp2\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.663955 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-utilities\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.664017 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-catalog-content\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.686083 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwpp2\" (UniqueName: \"kubernetes.io/projected/0e84cb88-21d6-41d6-9352-b29d1953fa9f-kube-api-access-nwpp2\") pod \"community-operators-ljp6s\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:41 crc kubenswrapper[4885]: I0308 21:03:41.775318 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:42 crc kubenswrapper[4885]: I0308 21:03:42.368006 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-m792h" podStartSLOduration=3.367979201 podStartE2EDuration="3.367979201s" podCreationTimestamp="2026-03-08 21:03:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:03:41.648332945 +0000 UTC m=+5523.044386978" watchObservedRunningTime="2026-03-08 21:03:42.367979201 +0000 UTC m=+5523.764033254" Mar 08 21:03:42 crc kubenswrapper[4885]: I0308 21:03:42.377036 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ljp6s"] Mar 08 21:03:42 crc kubenswrapper[4885]: W0308 21:03:42.380048 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e84cb88_21d6_41d6_9352_b29d1953fa9f.slice/crio-9cbee94b6981b1190afd3a1f4953df7c9ada70d60fa8e7e0f457106f9ec4cc05 WatchSource:0}: Error finding container 9cbee94b6981b1190afd3a1f4953df7c9ada70d60fa8e7e0f457106f9ec4cc05: Status 404 returned error can't find the container with id 9cbee94b6981b1190afd3a1f4953df7c9ada70d60fa8e7e0f457106f9ec4cc05 Mar 08 21:03:42 crc kubenswrapper[4885]: I0308 21:03:42.645298 4885 generic.go:334] "Generic (PLEG): container finished" podID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerID="18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140" exitCode=0 Mar 08 21:03:42 crc kubenswrapper[4885]: I0308 21:03:42.645393 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljp6s" event={"ID":"0e84cb88-21d6-41d6-9352-b29d1953fa9f","Type":"ContainerDied","Data":"18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140"} Mar 08 21:03:42 crc kubenswrapper[4885]: I0308 21:03:42.645468 4885 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljp6s" event={"ID":"0e84cb88-21d6-41d6-9352-b29d1953fa9f","Type":"ContainerStarted","Data":"9cbee94b6981b1190afd3a1f4953df7c9ada70d60fa8e7e0f457106f9ec4cc05"} Mar 08 21:03:42 crc kubenswrapper[4885]: I0308 21:03:42.656653 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:03:43 crc kubenswrapper[4885]: I0308 21:03:43.658107 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljp6s" event={"ID":"0e84cb88-21d6-41d6-9352-b29d1953fa9f","Type":"ContainerStarted","Data":"1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c"} Mar 08 21:03:44 crc kubenswrapper[4885]: I0308 21:03:44.677195 4885 generic.go:334] "Generic (PLEG): container finished" podID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerID="1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c" exitCode=0 Mar 08 21:03:44 crc kubenswrapper[4885]: I0308 21:03:44.677570 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljp6s" event={"ID":"0e84cb88-21d6-41d6-9352-b29d1953fa9f","Type":"ContainerDied","Data":"1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c"} Mar 08 21:03:44 crc kubenswrapper[4885]: I0308 21:03:44.684642 4885 generic.go:334] "Generic (PLEG): container finished" podID="07fa5fd1-f6b9-4206-809d-c1f04533cab4" containerID="238463b8258e15f4cb33c673abe4bc3d05f4cb5a4961563b7dbb833ea2602b95" exitCode=0 Mar 08 21:03:44 crc kubenswrapper[4885]: I0308 21:03:44.684696 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m792h" event={"ID":"07fa5fd1-f6b9-4206-809d-c1f04533cab4","Type":"ContainerDied","Data":"238463b8258e15f4cb33c673abe4bc3d05f4cb5a4961563b7dbb833ea2602b95"} Mar 08 21:03:45 crc kubenswrapper[4885]: I0308 21:03:45.699053 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljp6s" event={"ID":"0e84cb88-21d6-41d6-9352-b29d1953fa9f","Type":"ContainerStarted","Data":"bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e"} Mar 08 21:03:45 crc kubenswrapper[4885]: I0308 21:03:45.735331 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ljp6s" podStartSLOduration=2.194778705 podStartE2EDuration="4.735305668s" podCreationTimestamp="2026-03-08 21:03:41 +0000 UTC" firstStartedPulling="2026-03-08 21:03:42.652623104 +0000 UTC m=+5524.048677167" lastFinishedPulling="2026-03-08 21:03:45.193150097 +0000 UTC m=+5526.589204130" observedRunningTime="2026-03-08 21:03:45.729572865 +0000 UTC m=+5527.125626948" watchObservedRunningTime="2026-03-08 21:03:45.735305668 +0000 UTC m=+5527.131359701" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.112649 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-m792h" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.255097 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-db-sync-config-data\") pod \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.255178 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mngp9\" (UniqueName: \"kubernetes.io/projected/07fa5fd1-f6b9-4206-809d-c1f04533cab4-kube-api-access-mngp9\") pod \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.255280 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-config-data\") pod \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.255460 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-combined-ca-bundle\") pod \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\" (UID: \"07fa5fd1-f6b9-4206-809d-c1f04533cab4\") " Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.261388 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07fa5fd1-f6b9-4206-809d-c1f04533cab4-kube-api-access-mngp9" (OuterVolumeSpecName: "kube-api-access-mngp9") pod "07fa5fd1-f6b9-4206-809d-c1f04533cab4" (UID: "07fa5fd1-f6b9-4206-809d-c1f04533cab4"). InnerVolumeSpecName "kube-api-access-mngp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.261425 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "07fa5fd1-f6b9-4206-809d-c1f04533cab4" (UID: "07fa5fd1-f6b9-4206-809d-c1f04533cab4"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.304408 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07fa5fd1-f6b9-4206-809d-c1f04533cab4" (UID: "07fa5fd1-f6b9-4206-809d-c1f04533cab4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.339642 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-config-data" (OuterVolumeSpecName: "config-data") pod "07fa5fd1-f6b9-4206-809d-c1f04533cab4" (UID: "07fa5fd1-f6b9-4206-809d-c1f04533cab4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.358375 4885 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.358402 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mngp9\" (UniqueName: \"kubernetes.io/projected/07fa5fd1-f6b9-4206-809d-c1f04533cab4-kube-api-access-mngp9\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.358416 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.358427 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07fa5fd1-f6b9-4206-809d-c1f04533cab4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.706678 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-m792h" event={"ID":"07fa5fd1-f6b9-4206-809d-c1f04533cab4","Type":"ContainerDied","Data":"fe5f1b7fdd56f4ad8279dbe7f7cd91d8f6dc677fcbcff32122312a7eeb244697"} Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.707029 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe5f1b7fdd56f4ad8279dbe7f7cd91d8f6dc677fcbcff32122312a7eeb244697" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.706739 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-m792h" Mar 08 21:03:46 crc kubenswrapper[4885]: E0308 21:03:46.918490 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07fa5fd1_f6b9_4206_809d_c1f04533cab4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07fa5fd1_f6b9_4206_809d_c1f04533cab4.slice/crio-fe5f1b7fdd56f4ad8279dbe7f7cd91d8f6dc677fcbcff32122312a7eeb244697\": RecentStats: unable to find data in memory cache]" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.993138 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:46 crc kubenswrapper[4885]: E0308 21:03:46.993487 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07fa5fd1-f6b9-4206-809d-c1f04533cab4" containerName="glance-db-sync" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.993504 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="07fa5fd1-f6b9-4206-809d-c1f04533cab4" containerName="glance-db-sync" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.993667 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="07fa5fd1-f6b9-4206-809d-c1f04533cab4" containerName="glance-db-sync" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.994469 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.999314 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 08 21:03:46 crc kubenswrapper[4885]: I0308 21:03:46.999840 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.000085 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zrqqr" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.001078 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.010328 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.075871 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-config-data\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.075952 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.075989 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-scripts\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.076019 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbdl9\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-kube-api-access-rbdl9\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.076042 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.076064 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-ceph\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.076142 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-logs\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.126709 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54fd489df-k9th6"] Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.127978 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.149348 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54fd489df-k9th6"] Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.177778 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-logs\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.177835 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-config-data\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.177862 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.177892 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-scripts\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.177978 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbdl9\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-kube-api-access-rbdl9\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.178003 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.178023 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-ceph\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.178991 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.179199 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-logs\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.186529 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.187208 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-ceph\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.190290 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-scripts\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.192907 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-config-data\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.205561 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbdl9\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-kube-api-access-rbdl9\") pod \"glance-default-external-api-0\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.220443 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.221740 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.231084 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.242751 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.281945 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-dns-svc\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.282007 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-sb\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.282025 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-config\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.282128 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-nb\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.282726 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-276lt\" (UniqueName: \"kubernetes.io/projected/a16f7c8b-b930-4591-a967-9db46c52391c-kube-api-access-276lt\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.321290 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385206 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385250 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385273 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385303 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-nb\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385327 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-276lt\" (UniqueName: \"kubernetes.io/projected/a16f7c8b-b930-4591-a967-9db46c52391c-kube-api-access-276lt\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385359 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-dns-svc\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385383 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385403 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-sb\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385421 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-logs\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " 
pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385435 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385451 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-config\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.385495 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptbgp\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-kube-api-access-ptbgp\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.386334 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-nb\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.387100 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-dns-svc\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.387590 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-sb\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.395427 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-config\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.452800 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-276lt\" (UniqueName: \"kubernetes.io/projected/a16f7c8b-b930-4591-a967-9db46c52391c-kube-api-access-276lt\") pod \"dnsmasq-dns-54fd489df-k9th6\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.454581 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.488809 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptbgp\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-kube-api-access-ptbgp\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.488902 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.488956 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.488983 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.489042 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.489065 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-logs\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.489081 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.494758 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.503529 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 
21:03:47.504351 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.504588 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-logs\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.507901 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.518934 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-ceph\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.527243 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptbgp\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-kube-api-access-ptbgp\") pod \"glance-default-internal-api-0\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.593474 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:03:47 crc kubenswrapper[4885]: I0308 21:03:47.936381 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:47 crc kubenswrapper[4885]: W0308 21:03:47.942034 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdfd0414_9452_430e_a4ea_bb57121b0424.slice/crio-5908c98ec19f6c2b530558ced726b2bfe547ec2e7f58b25b7f725fb479f01bb1 WatchSource:0}: Error finding container 5908c98ec19f6c2b530558ced726b2bfe547ec2e7f58b25b7f725fb479f01bb1: Status 404 returned error can't find the container with id 5908c98ec19f6c2b530558ced726b2bfe547ec2e7f58b25b7f725fb479f01bb1 Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.027671 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54fd489df-k9th6"] Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.223818 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.237414 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:48 crc kubenswrapper[4885]: W0308 21:03:48.238562 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a6f23b6_b553_46eb_9a92_f33f68a80294.slice/crio-96f6b15c1d1937a90bc46575bd54e1aa028b4e4abfbd6494dc64c71238146a31 WatchSource:0}: Error finding container 96f6b15c1d1937a90bc46575bd54e1aa028b4e4abfbd6494dc64c71238146a31: Status 404 returned error can't find the container with id 96f6b15c1d1937a90bc46575bd54e1aa028b4e4abfbd6494dc64c71238146a31 Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.736336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a6f23b6-b553-46eb-9a92-f33f68a80294","Type":"ContainerStarted","Data":"96f6b15c1d1937a90bc46575bd54e1aa028b4e4abfbd6494dc64c71238146a31"} Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.738580 4885 generic.go:334] "Generic (PLEG): container finished" podID="a16f7c8b-b930-4591-a967-9db46c52391c" containerID="1ebba46867f58104b9cbcc29fb91d1649cf147537f67ba19ec589a45bcb62ce8" exitCode=0 Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.738652 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54fd489df-k9th6" event={"ID":"a16f7c8b-b930-4591-a967-9db46c52391c","Type":"ContainerDied","Data":"1ebba46867f58104b9cbcc29fb91d1649cf147537f67ba19ec589a45bcb62ce8"} Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.738668 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54fd489df-k9th6" event={"ID":"a16f7c8b-b930-4591-a967-9db46c52391c","Type":"ContainerStarted","Data":"1a913a93c16bbf12b9927ddd796c54de956ba88ff05aa50f518989cc7d07d3d0"} Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.744862 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfd0414-9452-430e-a4ea-bb57121b0424","Type":"ContainerStarted","Data":"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815"} Mar 08 21:03:48 crc kubenswrapper[4885]: I0308 21:03:48.744887 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"cdfd0414-9452-430e-a4ea-bb57121b0424","Type":"ContainerStarted","Data":"5908c98ec19f6c2b530558ced726b2bfe547ec2e7f58b25b7f725fb479f01bb1"} Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.374550 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:03:49 crc kubenswrapper[4885]: E0308 21:03:49.375047 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.754257 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a6f23b6-b553-46eb-9a92-f33f68a80294","Type":"ContainerStarted","Data":"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266"} Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.754312 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a6f23b6-b553-46eb-9a92-f33f68a80294","Type":"ContainerStarted","Data":"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a"} Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.756017 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54fd489df-k9th6" event={"ID":"a16f7c8b-b930-4591-a967-9db46c52391c","Type":"ContainerStarted","Data":"7dd39daad27eae124834c42bda6676c4988f5e52cc85f87170d8845fcdc1c6e4"} Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.756431 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.758432 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfd0414-9452-430e-a4ea-bb57121b0424","Type":"ContainerStarted","Data":"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88"} Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.758536 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-log" containerID="cri-o://440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815" gracePeriod=30 Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.758756 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-httpd" containerID="cri-o://d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88" gracePeriod=30 Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.787344 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.787292368 podStartE2EDuration="2.787292368s" podCreationTimestamp="2026-03-08 21:03:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:03:49.78546554 +0000 UTC m=+5531.181519573" watchObservedRunningTime="2026-03-08 21:03:49.787292368 +0000 UTC m=+5531.183346441" 
Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.808133 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.808115994 podStartE2EDuration="3.808115994s" podCreationTimestamp="2026-03-08 21:03:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:03:49.803128521 +0000 UTC m=+5531.199182554" watchObservedRunningTime="2026-03-08 21:03:49.808115994 +0000 UTC m=+5531.204170027" Mar 08 21:03:49 crc kubenswrapper[4885]: I0308 21:03:49.828494 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54fd489df-k9th6" podStartSLOduration=2.828474597 podStartE2EDuration="2.828474597s" podCreationTimestamp="2026-03-08 21:03:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:03:49.826252287 +0000 UTC m=+5531.222306310" watchObservedRunningTime="2026-03-08 21:03:49.828474597 +0000 UTC m=+5531.224528640" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.032894 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.296239 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.449781 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-scripts\") pod \"cdfd0414-9452-430e-a4ea-bb57121b0424\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.449969 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-ceph\") pod \"cdfd0414-9452-430e-a4ea-bb57121b0424\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.450023 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-config-data\") pod \"cdfd0414-9452-430e-a4ea-bb57121b0424\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.450049 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbdl9\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-kube-api-access-rbdl9\") pod \"cdfd0414-9452-430e-a4ea-bb57121b0424\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.450116 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-logs\") pod \"cdfd0414-9452-430e-a4ea-bb57121b0424\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.450157 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-httpd-run\") pod \"cdfd0414-9452-430e-a4ea-bb57121b0424\" (UID: 
\"cdfd0414-9452-430e-a4ea-bb57121b0424\") " Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.450195 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-combined-ca-bundle\") pod \"cdfd0414-9452-430e-a4ea-bb57121b0424\" (UID: \"cdfd0414-9452-430e-a4ea-bb57121b0424\") " Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.450799 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cdfd0414-9452-430e-a4ea-bb57121b0424" (UID: "cdfd0414-9452-430e-a4ea-bb57121b0424"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.450863 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-logs" (OuterVolumeSpecName: "logs") pod "cdfd0414-9452-430e-a4ea-bb57121b0424" (UID: "cdfd0414-9452-430e-a4ea-bb57121b0424"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.451060 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.456562 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-kube-api-access-rbdl9" (OuterVolumeSpecName: "kube-api-access-rbdl9") pod "cdfd0414-9452-430e-a4ea-bb57121b0424" (UID: "cdfd0414-9452-430e-a4ea-bb57121b0424"). InnerVolumeSpecName "kube-api-access-rbdl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.456621 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-scripts" (OuterVolumeSpecName: "scripts") pod "cdfd0414-9452-430e-a4ea-bb57121b0424" (UID: "cdfd0414-9452-430e-a4ea-bb57121b0424"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.458226 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-ceph" (OuterVolumeSpecName: "ceph") pod "cdfd0414-9452-430e-a4ea-bb57121b0424" (UID: "cdfd0414-9452-430e-a4ea-bb57121b0424"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.489555 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdfd0414-9452-430e-a4ea-bb57121b0424" (UID: "cdfd0414-9452-430e-a4ea-bb57121b0424"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.503183 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-config-data" (OuterVolumeSpecName: "config-data") pod "cdfd0414-9452-430e-a4ea-bb57121b0424" (UID: "cdfd0414-9452-430e-a4ea-bb57121b0424"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.552408 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.552450 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbdl9\" (UniqueName: \"kubernetes.io/projected/cdfd0414-9452-430e-a4ea-bb57121b0424-kube-api-access-rbdl9\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.552463 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.552473 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdfd0414-9452-430e-a4ea-bb57121b0424-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.552481 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.552489 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdfd0414-9452-430e-a4ea-bb57121b0424-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.767634 4885 generic.go:334] "Generic (PLEG): container finished" podID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerID="d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88" exitCode=0 Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.767666 4885 generic.go:334] "Generic (PLEG): container finished" podID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerID="440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815" exitCode=143 Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.767690 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfd0414-9452-430e-a4ea-bb57121b0424","Type":"ContainerDied","Data":"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88"} Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.767715 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.767744 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfd0414-9452-430e-a4ea-bb57121b0424","Type":"ContainerDied","Data":"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815"} Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.767766 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdfd0414-9452-430e-a4ea-bb57121b0424","Type":"ContainerDied","Data":"5908c98ec19f6c2b530558ced726b2bfe547ec2e7f58b25b7f725fb479f01bb1"} Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.767791 4885 scope.go:117] "RemoveContainer" containerID="d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.791411 4885 scope.go:117] "RemoveContainer" containerID="440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.806789 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.824299 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.838120 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:50 crc kubenswrapper[4885]: E0308 21:03:50.838687 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-log" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.838717 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-log" Mar 08 21:03:50 crc kubenswrapper[4885]: E0308 21:03:50.838746 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-httpd" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.838758 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-httpd" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.839041 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-httpd" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.839067 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" containerName="glance-log" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.840125 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.841216 4885 scope.go:117] "RemoveContainer" containerID="d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.842287 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 08 21:03:50 crc kubenswrapper[4885]: E0308 21:03:50.842896 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88\": container with ID starting with d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88 not found: ID does not exist" containerID="d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.842944 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88"} err="failed to get container status \"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88\": rpc error: code = NotFound desc = could not find container \"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88\": container with ID starting with d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88 not found: ID does not exist" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.842968 4885 scope.go:117] "RemoveContainer" containerID="440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815" Mar 08 21:03:50 crc kubenswrapper[4885]: E0308 21:03:50.843761 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815\": container with ID starting with 440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815 not found: ID does not exist" containerID="440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.843816 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815"} err="failed to get container status \"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815\": rpc error: code = NotFound desc = could not find container \"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815\": container with ID starting with 440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815 not found: ID does not exist" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.843842 4885 scope.go:117] "RemoveContainer" containerID="d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.847481 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88"} err="failed to get container status \"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88\": rpc error: code = NotFound desc = could not find container \"d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88\": container with ID starting with d21c4da8f4487b3b8cd11ae9c6d8ebe37863342895017984ae9b8cfdbc32ab88 not found: ID does not exist" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 
21:03:50.847531 4885 scope.go:117] "RemoveContainer" containerID="440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.861256 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815"} err="failed to get container status \"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815\": rpc error: code = NotFound desc = could not find container \"440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815\": container with ID starting with 440bee62eaf7ecf619401b85f7aa3412238c9ac07bd9660cea0683e3f9468815 not found: ID does not exist" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.870841 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.966425 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-config-data\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.966601 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-scripts\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.966701 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.966780 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qvkv\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-kube-api-access-6qvkv\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.966885 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-logs\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.967042 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-ceph\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:50 crc kubenswrapper[4885]: I0308 21:03:50.967120 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.069365 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-config-data\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.069479 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-scripts\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.069546 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.069621 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qvkv\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-kube-api-access-6qvkv\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.069667 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-logs\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.069734 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-ceph\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.069785 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.070624 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.070763 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-logs\") pod \"glance-default-external-api-0\" (UID: 
\"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.074967 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-scripts\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.075637 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-config-data\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.076113 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.076392 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-ceph\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.087811 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qvkv\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-kube-api-access-6qvkv\") pod \"glance-default-external-api-0\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.171265 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.381323 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdfd0414-9452-430e-a4ea-bb57121b0424" path="/var/lib/kubelet/pods/cdfd0414-9452-430e-a4ea-bb57121b0424/volumes" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.753963 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.776376 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.776501 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.785831 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e","Type":"ContainerStarted","Data":"8a73a30c989c3b2ba3367e2ccdac633bed1d9b688e4b2e65328b8a2f07a6fe3b"} Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.786193 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-log" containerID="cri-o://33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a" gracePeriod=30 Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.786281 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-httpd" containerID="cri-o://6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266" gracePeriod=30 Mar 08 21:03:51 crc kubenswrapper[4885]: I0308 21:03:51.841858 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.350332 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.497293 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-httpd-run\") pod \"0a6f23b6-b553-46eb-9a92-f33f68a80294\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.497587 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptbgp\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-kube-api-access-ptbgp\") pod \"0a6f23b6-b553-46eb-9a92-f33f68a80294\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.497640 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-scripts\") pod \"0a6f23b6-b553-46eb-9a92-f33f68a80294\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.497699 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-combined-ca-bundle\") pod \"0a6f23b6-b553-46eb-9a92-f33f68a80294\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.497718 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-ceph\") pod \"0a6f23b6-b553-46eb-9a92-f33f68a80294\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.497743 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-config-data\") pod \"0a6f23b6-b553-46eb-9a92-f33f68a80294\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.497804 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-logs\") pod \"0a6f23b6-b553-46eb-9a92-f33f68a80294\" (UID: \"0a6f23b6-b553-46eb-9a92-f33f68a80294\") " Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.498023 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0a6f23b6-b553-46eb-9a92-f33f68a80294" (UID: "0a6f23b6-b553-46eb-9a92-f33f68a80294"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.498773 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.501039 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-scripts" (OuterVolumeSpecName: "scripts") pod "0a6f23b6-b553-46eb-9a92-f33f68a80294" (UID: "0a6f23b6-b553-46eb-9a92-f33f68a80294"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.501269 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-kube-api-access-ptbgp" (OuterVolumeSpecName: "kube-api-access-ptbgp") pod "0a6f23b6-b553-46eb-9a92-f33f68a80294" (UID: "0a6f23b6-b553-46eb-9a92-f33f68a80294"). InnerVolumeSpecName "kube-api-access-ptbgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.501282 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-logs" (OuterVolumeSpecName: "logs") pod "0a6f23b6-b553-46eb-9a92-f33f68a80294" (UID: "0a6f23b6-b553-46eb-9a92-f33f68a80294"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.509244 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-ceph" (OuterVolumeSpecName: "ceph") pod "0a6f23b6-b553-46eb-9a92-f33f68a80294" (UID: "0a6f23b6-b553-46eb-9a92-f33f68a80294"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.530520 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a6f23b6-b553-46eb-9a92-f33f68a80294" (UID: "0a6f23b6-b553-46eb-9a92-f33f68a80294"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.548946 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-config-data" (OuterVolumeSpecName: "config-data") pod "0a6f23b6-b553-46eb-9a92-f33f68a80294" (UID: "0a6f23b6-b553-46eb-9a92-f33f68a80294"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.600428 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ptbgp\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-kube-api-access-ptbgp\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.600474 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.600491 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.600505 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0a6f23b6-b553-46eb-9a92-f33f68a80294-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.600518 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a6f23b6-b553-46eb-9a92-f33f68a80294-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.600531 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a6f23b6-b553-46eb-9a92-f33f68a80294-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.812304 4885 generic.go:334] "Generic (PLEG): container finished" podID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerID="6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266" exitCode=0 Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.812349 4885 generic.go:334] "Generic (PLEG): container finished" podID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerID="33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a" exitCode=143 Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.812497 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.813901 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a6f23b6-b553-46eb-9a92-f33f68a80294","Type":"ContainerDied","Data":"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266"} Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.813972 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a6f23b6-b553-46eb-9a92-f33f68a80294","Type":"ContainerDied","Data":"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a"} Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.813984 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0a6f23b6-b553-46eb-9a92-f33f68a80294","Type":"ContainerDied","Data":"96f6b15c1d1937a90bc46575bd54e1aa028b4e4abfbd6494dc64c71238146a31"} Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.814005 4885 scope.go:117] "RemoveContainer" containerID="6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.823586 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e","Type":"ContainerStarted","Data":"393e441f3ad9404ef23527c1976a928e080636334f6d4fef814d8251c19b8033"} Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.870753 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.874341 4885 scope.go:117] "RemoveContainer" containerID="33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.877137 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.893641 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:52 crc kubenswrapper[4885]: E0308 21:03:52.894303 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-httpd" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.894323 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-httpd" Mar 08 21:03:52 crc kubenswrapper[4885]: E0308 21:03:52.894337 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-log" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.894343 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-log" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.894508 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-httpd" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.894522 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" containerName="glance-log" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.895373 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.897499 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.907611 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.910820 4885 scope.go:117] "RemoveContainer" containerID="6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.910890 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:52 crc kubenswrapper[4885]: E0308 21:03:52.911476 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266\": container with ID starting with 6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266 not found: ID does not exist" containerID="6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.911509 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266"} err="failed to get container status \"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266\": rpc error: code = NotFound desc = could not find container \"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266\": container with ID starting with 6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266 not found: ID does not exist" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.911533 4885 scope.go:117] "RemoveContainer" containerID="33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a" Mar 08 21:03:52 crc kubenswrapper[4885]: E0308 21:03:52.911872 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a\": container with ID starting with 33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a not found: ID does not exist" containerID="33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.911897 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a"} err="failed to get container status \"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a\": rpc error: code = NotFound desc = could not find container \"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a\": container with ID starting with 33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a not found: ID does not exist" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.911911 4885 scope.go:117] "RemoveContainer" containerID="6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.912400 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266"} err="failed to get container status 
\"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266\": rpc error: code = NotFound desc = could not find container \"6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266\": container with ID starting with 6a966cc6d33a255ba9645c66fce48a8988b31fabdb9aff20a5e955f2c83d9266 not found: ID does not exist" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.912437 4885 scope.go:117] "RemoveContainer" containerID="33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.912737 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a"} err="failed to get container status \"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a\": rpc error: code = NotFound desc = could not find container \"33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a\": container with ID starting with 33634138c9d8eaf207b8b391c47b618693ea96878b12a35c665af82c3c81789a not found: ID does not exist" Mar 08 21:03:52 crc kubenswrapper[4885]: I0308 21:03:52.979573 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ljp6s"] Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.006552 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.006648 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.006671 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.006719 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhs2t\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-kube-api-access-xhs2t\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.006760 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-logs\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.006828 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.006859 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108154 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108234 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108255 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108294 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhs2t\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-kube-api-access-xhs2t\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108327 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-logs\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108392 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108415 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.108898 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " 
pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.110319 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-logs\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.112684 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-ceph\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.114136 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.116137 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.117358 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.125457 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhs2t\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-kube-api-access-xhs2t\") pod \"glance-default-internal-api-0\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.236811 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.384498 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a6f23b6-b553-46eb-9a92-f33f68a80294" path="/var/lib/kubelet/pods/0a6f23b6-b553-46eb-9a92-f33f68a80294/volumes" Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.805804 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:03:53 crc kubenswrapper[4885]: W0308 21:03:53.824525 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf82d5c8f_6a14_4b1b_9143_eb52cf7e67e8.slice/crio-c1d3ebe64d38a65fd438521c81fee0953dda62071a7f9b6b92b7e2810c5ba230 WatchSource:0}: Error finding container c1d3ebe64d38a65fd438521c81fee0953dda62071a7f9b6b92b7e2810c5ba230: Status 404 returned error can't find the container with id c1d3ebe64d38a65fd438521c81fee0953dda62071a7f9b6b92b7e2810c5ba230 Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.835802 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8","Type":"ContainerStarted","Data":"c1d3ebe64d38a65fd438521c81fee0953dda62071a7f9b6b92b7e2810c5ba230"} Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.838414 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e","Type":"ContainerStarted","Data":"45cd60c6ca50a5d19396518626fd9ead690756368037eda8deb040859bd438c4"} Mar 08 21:03:53 crc kubenswrapper[4885]: I0308 21:03:53.860804 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.860788382 podStartE2EDuration="3.860788382s" podCreationTimestamp="2026-03-08 21:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:03:53.858248874 +0000 UTC m=+5535.254302887" watchObservedRunningTime="2026-03-08 21:03:53.860788382 +0000 UTC m=+5535.256842405" Mar 08 21:03:54 crc kubenswrapper[4885]: I0308 21:03:54.863528 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ljp6s" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="registry-server" containerID="cri-o://bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e" gracePeriod=2 Mar 08 21:03:54 crc kubenswrapper[4885]: I0308 21:03:54.864016 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8","Type":"ContainerStarted","Data":"6b230c929340da8fad3a15c45277ba0e659ca5d1578d43a5536de68f31fcf158"} Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.355491 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.472116 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-utilities\") pod \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.472745 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-catalog-content\") pod \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.472771 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwpp2\" (UniqueName: \"kubernetes.io/projected/0e84cb88-21d6-41d6-9352-b29d1953fa9f-kube-api-access-nwpp2\") pod \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\" (UID: \"0e84cb88-21d6-41d6-9352-b29d1953fa9f\") " Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.472893 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-utilities" (OuterVolumeSpecName: "utilities") pod "0e84cb88-21d6-41d6-9352-b29d1953fa9f" (UID: "0e84cb88-21d6-41d6-9352-b29d1953fa9f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.473617 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.480423 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e84cb88-21d6-41d6-9352-b29d1953fa9f-kube-api-access-nwpp2" (OuterVolumeSpecName: "kube-api-access-nwpp2") pod "0e84cb88-21d6-41d6-9352-b29d1953fa9f" (UID: "0e84cb88-21d6-41d6-9352-b29d1953fa9f"). InnerVolumeSpecName "kube-api-access-nwpp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.520119 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e84cb88-21d6-41d6-9352-b29d1953fa9f" (UID: "0e84cb88-21d6-41d6-9352-b29d1953fa9f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.576076 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e84cb88-21d6-41d6-9352-b29d1953fa9f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.576139 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwpp2\" (UniqueName: \"kubernetes.io/projected/0e84cb88-21d6-41d6-9352-b29d1953fa9f-kube-api-access-nwpp2\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.878790 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8","Type":"ContainerStarted","Data":"77cc037af22a97f866052e3343ac5fbb7bf64bc7562100db2c700da1dbaae719"} Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.886706 4885 generic.go:334] "Generic (PLEG): container finished" podID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerID="bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e" exitCode=0 Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.886774 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljp6s" event={"ID":"0e84cb88-21d6-41d6-9352-b29d1953fa9f","Type":"ContainerDied","Data":"bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e"} Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.886823 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ljp6s" event={"ID":"0e84cb88-21d6-41d6-9352-b29d1953fa9f","Type":"ContainerDied","Data":"9cbee94b6981b1190afd3a1f4953df7c9ada70d60fa8e7e0f457106f9ec4cc05"} Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.886852 4885 scope.go:117] "RemoveContainer" containerID="bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.887137 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ljp6s" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.914627 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.9145978919999997 podStartE2EDuration="3.914597892s" podCreationTimestamp="2026-03-08 21:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:03:55.909387763 +0000 UTC m=+5537.305441866" watchObservedRunningTime="2026-03-08 21:03:55.914597892 +0000 UTC m=+5537.310651945" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.925728 4885 scope.go:117] "RemoveContainer" containerID="1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c" Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.956133 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ljp6s"] Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.965446 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ljp6s"] Mar 08 21:03:55 crc kubenswrapper[4885]: I0308 21:03:55.977820 4885 scope.go:117] "RemoveContainer" containerID="18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140" Mar 08 21:03:56 crc kubenswrapper[4885]: I0308 21:03:56.016707 4885 scope.go:117] "RemoveContainer" containerID="bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e" Mar 08 21:03:56 crc kubenswrapper[4885]: E0308 21:03:56.017377 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e\": container with ID starting with bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e not found: ID does not exist" containerID="bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e" Mar 08 21:03:56 crc kubenswrapper[4885]: I0308 21:03:56.017409 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e"} err="failed to get container status \"bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e\": rpc error: code = NotFound desc = could not find container \"bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e\": container with ID starting with bb073c57e50f041684348446510222cc61da648cbcd91e3a264334edff35815e not found: ID does not exist" Mar 08 21:03:56 crc kubenswrapper[4885]: I0308 21:03:56.017432 4885 scope.go:117] "RemoveContainer" containerID="1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c" Mar 08 21:03:56 crc kubenswrapper[4885]: E0308 21:03:56.017803 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c\": container with ID starting with 1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c not found: ID does not exist" containerID="1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c" Mar 08 21:03:56 crc kubenswrapper[4885]: I0308 21:03:56.017830 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c"} err="failed to get container status 
\"1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c\": rpc error: code = NotFound desc = could not find container \"1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c\": container with ID starting with 1e7554ef537c68942ea0c82399d708d418d1b6dbe1277db87c174f5fa393d66c not found: ID does not exist" Mar 08 21:03:56 crc kubenswrapper[4885]: I0308 21:03:56.017850 4885 scope.go:117] "RemoveContainer" containerID="18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140" Mar 08 21:03:56 crc kubenswrapper[4885]: E0308 21:03:56.018328 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140\": container with ID starting with 18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140 not found: ID does not exist" containerID="18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140" Mar 08 21:03:56 crc kubenswrapper[4885]: I0308 21:03:56.018354 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140"} err="failed to get container status \"18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140\": rpc error: code = NotFound desc = could not find container \"18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140\": container with ID starting with 18f721fb4e926028de53d87f87e746231c07913834ba4785e7b40fa487d3d140 not found: ID does not exist" Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.387846 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" path="/var/lib/kubelet/pods/0e84cb88-21d6-41d6-9352-b29d1953fa9f/volumes" Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.457114 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.516289 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5757f7d959-bpxjw"] Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.516710 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" podUID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerName="dnsmasq-dns" containerID="cri-o://a8688238cdfd1dcfb7637a786ad101466774be95a20bd05d6000a2fb72881a5c" gracePeriod=10 Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.915075 4885 generic.go:334] "Generic (PLEG): container finished" podID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerID="a8688238cdfd1dcfb7637a786ad101466774be95a20bd05d6000a2fb72881a5c" exitCode=0 Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.915178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" event={"ID":"25b65d53-5174-48c6-a687-f32c1e685bd4","Type":"ContainerDied","Data":"a8688238cdfd1dcfb7637a786ad101466774be95a20bd05d6000a2fb72881a5c"} Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.915353 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" event={"ID":"25b65d53-5174-48c6-a687-f32c1e685bd4","Type":"ContainerDied","Data":"68e32acd4bade8c213ec06817b3f3e8059d848ef922c170c62e2bbc8d3910483"} Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.915369 4885 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="68e32acd4bade8c213ec06817b3f3e8059d848ef922c170c62e2bbc8d3910483" Mar 08 21:03:57 crc kubenswrapper[4885]: I0308 21:03:57.989720 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.128875 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-nb\") pod \"25b65d53-5174-48c6-a687-f32c1e685bd4\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.129081 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-dns-svc\") pod \"25b65d53-5174-48c6-a687-f32c1e685bd4\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.129426 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvkns\" (UniqueName: \"kubernetes.io/projected/25b65d53-5174-48c6-a687-f32c1e685bd4-kube-api-access-vvkns\") pod \"25b65d53-5174-48c6-a687-f32c1e685bd4\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.129535 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-sb\") pod \"25b65d53-5174-48c6-a687-f32c1e685bd4\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.129826 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-config\") pod \"25b65d53-5174-48c6-a687-f32c1e685bd4\" (UID: \"25b65d53-5174-48c6-a687-f32c1e685bd4\") " Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.137830 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25b65d53-5174-48c6-a687-f32c1e685bd4-kube-api-access-vvkns" (OuterVolumeSpecName: "kube-api-access-vvkns") pod "25b65d53-5174-48c6-a687-f32c1e685bd4" (UID: "25b65d53-5174-48c6-a687-f32c1e685bd4"). InnerVolumeSpecName "kube-api-access-vvkns". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.193001 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25b65d53-5174-48c6-a687-f32c1e685bd4" (UID: "25b65d53-5174-48c6-a687-f32c1e685bd4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.193141 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-config" (OuterVolumeSpecName: "config") pod "25b65d53-5174-48c6-a687-f32c1e685bd4" (UID: "25b65d53-5174-48c6-a687-f32c1e685bd4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.209532 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25b65d53-5174-48c6-a687-f32c1e685bd4" (UID: "25b65d53-5174-48c6-a687-f32c1e685bd4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.223290 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25b65d53-5174-48c6-a687-f32c1e685bd4" (UID: "25b65d53-5174-48c6-a687-f32c1e685bd4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.232434 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.232482 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.232501 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.232518 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25b65d53-5174-48c6-a687-f32c1e685bd4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.232535 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvkns\" (UniqueName: \"kubernetes.io/projected/25b65d53-5174-48c6-a687-f32c1e685bd4-kube-api-access-vvkns\") on node \"crc\" DevicePath \"\"" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.924306 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5757f7d959-bpxjw" Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.967196 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5757f7d959-bpxjw"] Mar 08 21:03:58 crc kubenswrapper[4885]: I0308 21:03:58.975340 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5757f7d959-bpxjw"] Mar 08 21:03:59 crc kubenswrapper[4885]: I0308 21:03:59.387652 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25b65d53-5174-48c6-a687-f32c1e685bd4" path="/var/lib/kubelet/pods/25b65d53-5174-48c6-a687-f32c1e685bd4/volumes" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.157837 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550064-54cxw"] Mar 08 21:04:00 crc kubenswrapper[4885]: E0308 21:04:00.158389 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerName="init" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.158410 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerName="init" Mar 08 21:04:00 crc kubenswrapper[4885]: E0308 21:04:00.158434 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="registry-server" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.158448 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="registry-server" Mar 08 21:04:00 crc kubenswrapper[4885]: E0308 21:04:00.158481 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerName="dnsmasq-dns" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.158499 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerName="dnsmasq-dns" Mar 08 21:04:00 crc kubenswrapper[4885]: E0308 21:04:00.158527 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="extract-content" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.158539 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="extract-content" Mar 08 21:04:00 crc kubenswrapper[4885]: E0308 21:04:00.158577 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="extract-utilities" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.158590 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="extract-utilities" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.158915 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="25b65d53-5174-48c6-a687-f32c1e685bd4" containerName="dnsmasq-dns" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.159029 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e84cb88-21d6-41d6-9352-b29d1953fa9f" containerName="registry-server" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.161167 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550064-54cxw" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.163984 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.164302 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.164550 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.177367 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550064-54cxw"] Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.271625 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzngm\" (UniqueName: \"kubernetes.io/projected/94b5a791-d720-4a5c-9138-abe584a56755-kube-api-access-tzngm\") pod \"auto-csr-approver-29550064-54cxw\" (UID: \"94b5a791-d720-4a5c-9138-abe584a56755\") " pod="openshift-infra/auto-csr-approver-29550064-54cxw" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.373965 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzngm\" (UniqueName: \"kubernetes.io/projected/94b5a791-d720-4a5c-9138-abe584a56755-kube-api-access-tzngm\") pod \"auto-csr-approver-29550064-54cxw\" (UID: \"94b5a791-d720-4a5c-9138-abe584a56755\") " pod="openshift-infra/auto-csr-approver-29550064-54cxw" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.412874 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzngm\" (UniqueName: \"kubernetes.io/projected/94b5a791-d720-4a5c-9138-abe584a56755-kube-api-access-tzngm\") pod \"auto-csr-approver-29550064-54cxw\" (UID: \"94b5a791-d720-4a5c-9138-abe584a56755\") " pod="openshift-infra/auto-csr-approver-29550064-54cxw" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.493418 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550064-54cxw" Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.893162 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550064-54cxw"] Mar 08 21:04:00 crc kubenswrapper[4885]: I0308 21:04:00.957151 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550064-54cxw" event={"ID":"94b5a791-d720-4a5c-9138-abe584a56755","Type":"ContainerStarted","Data":"41311d57695392d2a865a329f680d241f0b1767d02d297dae39a86facba3ade4"} Mar 08 21:04:01 crc kubenswrapper[4885]: I0308 21:04:01.172467 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 21:04:01 crc kubenswrapper[4885]: I0308 21:04:01.172590 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 21:04:01 crc kubenswrapper[4885]: I0308 21:04:01.223518 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 21:04:01 crc kubenswrapper[4885]: I0308 21:04:01.239985 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 21:04:01 crc kubenswrapper[4885]: I0308 21:04:01.369489 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:04:01 crc kubenswrapper[4885]: E0308 21:04:01.370128 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:04:01 crc kubenswrapper[4885]: I0308 21:04:01.968871 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 21:04:01 crc kubenswrapper[4885]: I0308 21:04:01.968965 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 21:04:02 crc kubenswrapper[4885]: I0308 21:04:02.985872 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550064-54cxw" event={"ID":"94b5a791-d720-4a5c-9138-abe584a56755","Type":"ContainerStarted","Data":"d40c8b02d2c6b1b5fefbc9a10d09bda45776bb36be850d751409c013d3a63ca6"} Mar 08 21:04:03 crc kubenswrapper[4885]: I0308 21:04:03.015009 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550064-54cxw" podStartSLOduration=1.518519096 podStartE2EDuration="3.014984882s" podCreationTimestamp="2026-03-08 21:04:00 +0000 UTC" firstStartedPulling="2026-03-08 21:04:00.901304663 +0000 UTC m=+5542.297358726" lastFinishedPulling="2026-03-08 21:04:02.397770459 +0000 UTC m=+5543.793824512" observedRunningTime="2026-03-08 21:04:03.008726065 +0000 UTC m=+5544.404780118" watchObservedRunningTime="2026-03-08 21:04:03.014984882 +0000 UTC m=+5544.411038935" Mar 08 21:04:03 crc kubenswrapper[4885]: I0308 21:04:03.237570 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:03 crc kubenswrapper[4885]: I0308 21:04:03.237648 
4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:03 crc kubenswrapper[4885]: I0308 21:04:03.282001 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:03 crc kubenswrapper[4885]: I0308 21:04:03.311898 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:03 crc kubenswrapper[4885]: I0308 21:04:03.906304 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 21:04:03 crc kubenswrapper[4885]: I0308 21:04:03.907405 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 21:04:04 crc kubenswrapper[4885]: I0308 21:04:04.001372 4885 generic.go:334] "Generic (PLEG): container finished" podID="94b5a791-d720-4a5c-9138-abe584a56755" containerID="d40c8b02d2c6b1b5fefbc9a10d09bda45776bb36be850d751409c013d3a63ca6" exitCode=0 Mar 08 21:04:04 crc kubenswrapper[4885]: I0308 21:04:04.001516 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550064-54cxw" event={"ID":"94b5a791-d720-4a5c-9138-abe584a56755","Type":"ContainerDied","Data":"d40c8b02d2c6b1b5fefbc9a10d09bda45776bb36be850d751409c013d3a63ca6"} Mar 08 21:04:04 crc kubenswrapper[4885]: I0308 21:04:04.002098 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:04 crc kubenswrapper[4885]: I0308 21:04:04.002143 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:05 crc kubenswrapper[4885]: I0308 21:04:05.348054 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550064-54cxw" Mar 08 21:04:05 crc kubenswrapper[4885]: I0308 21:04:05.483218 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzngm\" (UniqueName: \"kubernetes.io/projected/94b5a791-d720-4a5c-9138-abe584a56755-kube-api-access-tzngm\") pod \"94b5a791-d720-4a5c-9138-abe584a56755\" (UID: \"94b5a791-d720-4a5c-9138-abe584a56755\") " Mar 08 21:04:05 crc kubenswrapper[4885]: I0308 21:04:05.521147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b5a791-d720-4a5c-9138-abe584a56755-kube-api-access-tzngm" (OuterVolumeSpecName: "kube-api-access-tzngm") pod "94b5a791-d720-4a5c-9138-abe584a56755" (UID: "94b5a791-d720-4a5c-9138-abe584a56755"). InnerVolumeSpecName "kube-api-access-tzngm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:04:05 crc kubenswrapper[4885]: I0308 21:04:05.587809 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzngm\" (UniqueName: \"kubernetes.io/projected/94b5a791-d720-4a5c-9138-abe584a56755-kube-api-access-tzngm\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:05 crc kubenswrapper[4885]: I0308 21:04:05.907520 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:05 crc kubenswrapper[4885]: I0308 21:04:05.942104 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 21:04:06 crc kubenswrapper[4885]: I0308 21:04:06.028334 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550064-54cxw" Mar 08 21:04:06 crc kubenswrapper[4885]: I0308 21:04:06.028345 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550064-54cxw" event={"ID":"94b5a791-d720-4a5c-9138-abe584a56755","Type":"ContainerDied","Data":"41311d57695392d2a865a329f680d241f0b1767d02d297dae39a86facba3ade4"} Mar 08 21:04:06 crc kubenswrapper[4885]: I0308 21:04:06.028426 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41311d57695392d2a865a329f680d241f0b1767d02d297dae39a86facba3ade4" Mar 08 21:04:06 crc kubenswrapper[4885]: I0308 21:04:06.085107 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550058-z6lnn"] Mar 08 21:04:06 crc kubenswrapper[4885]: I0308 21:04:06.091318 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550058-z6lnn"] Mar 08 21:04:07 crc kubenswrapper[4885]: I0308 21:04:07.377381 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95f14f7f-4dec-4d9d-a320-7a5c927d4983" path="/var/lib/kubelet/pods/95f14f7f-4dec-4d9d-a320-7a5c927d4983/volumes" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.048570 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-txw9w"] Mar 08 21:04:14 crc kubenswrapper[4885]: E0308 21:04:14.049431 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94b5a791-d720-4a5c-9138-abe584a56755" containerName="oc" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.049443 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b5a791-d720-4a5c-9138-abe584a56755" containerName="oc" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.049584 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="94b5a791-d720-4a5c-9138-abe584a56755" containerName="oc" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.050118 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.068947 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-txw9w"] Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.148007 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5511-account-create-update-fvjhm"] Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.170653 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.174321 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.175496 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzjsb\" (UniqueName: \"kubernetes.io/projected/14dd4829-951f-4e19-885f-f466dcbf9d1b-kube-api-access-vzjsb\") pod \"placement-db-create-txw9w\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.175562 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14dd4829-951f-4e19-885f-f466dcbf9d1b-operator-scripts\") pod \"placement-db-create-txw9w\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.186987 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5511-account-create-update-fvjhm"] Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.277814 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzjsb\" (UniqueName: \"kubernetes.io/projected/14dd4829-951f-4e19-885f-f466dcbf9d1b-kube-api-access-vzjsb\") pod \"placement-db-create-txw9w\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.277903 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14dd4829-951f-4e19-885f-f466dcbf9d1b-operator-scripts\") pod \"placement-db-create-txw9w\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.278059 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g775d\" (UniqueName: \"kubernetes.io/projected/8664af8f-0cf2-4ef8-a701-adbaba058240-kube-api-access-g775d\") pod \"placement-5511-account-create-update-fvjhm\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.278165 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8664af8f-0cf2-4ef8-a701-adbaba058240-operator-scripts\") pod \"placement-5511-account-create-update-fvjhm\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.279204 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14dd4829-951f-4e19-885f-f466dcbf9d1b-operator-scripts\") pod \"placement-db-create-txw9w\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.312211 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzjsb\" (UniqueName: 
\"kubernetes.io/projected/14dd4829-951f-4e19-885f-f466dcbf9d1b-kube-api-access-vzjsb\") pod \"placement-db-create-txw9w\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.373535 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-txw9w" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.380221 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8664af8f-0cf2-4ef8-a701-adbaba058240-operator-scripts\") pod \"placement-5511-account-create-update-fvjhm\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.380485 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g775d\" (UniqueName: \"kubernetes.io/projected/8664af8f-0cf2-4ef8-a701-adbaba058240-kube-api-access-g775d\") pod \"placement-5511-account-create-update-fvjhm\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.381763 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8664af8f-0cf2-4ef8-a701-adbaba058240-operator-scripts\") pod \"placement-5511-account-create-update-fvjhm\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.408577 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g775d\" (UniqueName: \"kubernetes.io/projected/8664af8f-0cf2-4ef8-a701-adbaba058240-kube-api-access-g775d\") pod \"placement-5511-account-create-update-fvjhm\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.510273 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:14 crc kubenswrapper[4885]: I0308 21:04:14.928178 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-txw9w"] Mar 08 21:04:14 crc kubenswrapper[4885]: W0308 21:04:14.935187 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14dd4829_951f_4e19_885f_f466dcbf9d1b.slice/crio-bb2db1da079995208ecccaa8eccf7097099e715606eb94fc33b2ea05201ef687 WatchSource:0}: Error finding container bb2db1da079995208ecccaa8eccf7097099e715606eb94fc33b2ea05201ef687: Status 404 returned error can't find the container with id bb2db1da079995208ecccaa8eccf7097099e715606eb94fc33b2ea05201ef687 Mar 08 21:04:15 crc kubenswrapper[4885]: I0308 21:04:15.003210 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5511-account-create-update-fvjhm"] Mar 08 21:04:15 crc kubenswrapper[4885]: I0308 21:04:15.148020 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-txw9w" event={"ID":"14dd4829-951f-4e19-885f-f466dcbf9d1b","Type":"ContainerStarted","Data":"38c31fe4cedc7f6d10ab5073880d121f8be2e24d735953c27d9d2bfdad42cb59"} Mar 08 21:04:15 crc kubenswrapper[4885]: I0308 21:04:15.148068 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-txw9w" event={"ID":"14dd4829-951f-4e19-885f-f466dcbf9d1b","Type":"ContainerStarted","Data":"bb2db1da079995208ecccaa8eccf7097099e715606eb94fc33b2ea05201ef687"} Mar 08 21:04:15 crc kubenswrapper[4885]: I0308 21:04:15.150515 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5511-account-create-update-fvjhm" event={"ID":"8664af8f-0cf2-4ef8-a701-adbaba058240","Type":"ContainerStarted","Data":"3fa287746faf0fe91da972d4c41143470d41b6c0871ff490ed2e7825740674d4"} Mar 08 21:04:15 crc kubenswrapper[4885]: I0308 21:04:15.162996 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-txw9w" podStartSLOduration=1.162977288 podStartE2EDuration="1.162977288s" podCreationTimestamp="2026-03-08 21:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:04:15.158851918 +0000 UTC m=+5556.554905961" watchObservedRunningTime="2026-03-08 21:04:15.162977288 +0000 UTC m=+5556.559031321" Mar 08 21:04:15 crc kubenswrapper[4885]: I0308 21:04:15.177442 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5511-account-create-update-fvjhm" podStartSLOduration=1.177420873 podStartE2EDuration="1.177420873s" podCreationTimestamp="2026-03-08 21:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:04:15.173749586 +0000 UTC m=+5556.569803609" watchObservedRunningTime="2026-03-08 21:04:15.177420873 +0000 UTC m=+5556.573474916" Mar 08 21:04:15 crc kubenswrapper[4885]: I0308 21:04:15.369127 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:04:15 crc kubenswrapper[4885]: E0308 21:04:15.369652 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:04:16 crc kubenswrapper[4885]: I0308 21:04:16.175264 4885 generic.go:334] "Generic (PLEG): container finished" podID="8664af8f-0cf2-4ef8-a701-adbaba058240" containerID="5a8b5b45c081a377860a6fc52da869749d2af03a3b4e62e944ef9b2a484b5105" exitCode=0 Mar 08 21:04:16 crc kubenswrapper[4885]: I0308 21:04:16.175329 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5511-account-create-update-fvjhm" event={"ID":"8664af8f-0cf2-4ef8-a701-adbaba058240","Type":"ContainerDied","Data":"5a8b5b45c081a377860a6fc52da869749d2af03a3b4e62e944ef9b2a484b5105"} Mar 08 21:04:16 crc kubenswrapper[4885]: I0308 21:04:16.179393 4885 generic.go:334] "Generic (PLEG): container finished" podID="14dd4829-951f-4e19-885f-f466dcbf9d1b" containerID="38c31fe4cedc7f6d10ab5073880d121f8be2e24d735953c27d9d2bfdad42cb59" exitCode=0 Mar 08 21:04:16 crc kubenswrapper[4885]: I0308 21:04:16.179477 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-txw9w" event={"ID":"14dd4829-951f-4e19-885f-f466dcbf9d1b","Type":"ContainerDied","Data":"38c31fe4cedc7f6d10ab5073880d121f8be2e24d735953c27d9d2bfdad42cb59"} Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.582782 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-txw9w" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.636630 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14dd4829-951f-4e19-885f-f466dcbf9d1b-operator-scripts\") pod \"14dd4829-951f-4e19-885f-f466dcbf9d1b\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.636681 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzjsb\" (UniqueName: \"kubernetes.io/projected/14dd4829-951f-4e19-885f-f466dcbf9d1b-kube-api-access-vzjsb\") pod \"14dd4829-951f-4e19-885f-f466dcbf9d1b\" (UID: \"14dd4829-951f-4e19-885f-f466dcbf9d1b\") " Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.640497 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14dd4829-951f-4e19-885f-f466dcbf9d1b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "14dd4829-951f-4e19-885f-f466dcbf9d1b" (UID: "14dd4829-951f-4e19-885f-f466dcbf9d1b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.664942 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14dd4829-951f-4e19-885f-f466dcbf9d1b-kube-api-access-vzjsb" (OuterVolumeSpecName: "kube-api-access-vzjsb") pod "14dd4829-951f-4e19-885f-f466dcbf9d1b" (UID: "14dd4829-951f-4e19-885f-f466dcbf9d1b"). InnerVolumeSpecName "kube-api-access-vzjsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.695637 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.740607 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g775d\" (UniqueName: \"kubernetes.io/projected/8664af8f-0cf2-4ef8-a701-adbaba058240-kube-api-access-g775d\") pod \"8664af8f-0cf2-4ef8-a701-adbaba058240\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.740936 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8664af8f-0cf2-4ef8-a701-adbaba058240-operator-scripts\") pod \"8664af8f-0cf2-4ef8-a701-adbaba058240\" (UID: \"8664af8f-0cf2-4ef8-a701-adbaba058240\") " Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.741355 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/14dd4829-951f-4e19-885f-f466dcbf9d1b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.741372 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzjsb\" (UniqueName: \"kubernetes.io/projected/14dd4829-951f-4e19-885f-f466dcbf9d1b-kube-api-access-vzjsb\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.741868 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8664af8f-0cf2-4ef8-a701-adbaba058240-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8664af8f-0cf2-4ef8-a701-adbaba058240" (UID: "8664af8f-0cf2-4ef8-a701-adbaba058240"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.743770 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8664af8f-0cf2-4ef8-a701-adbaba058240-kube-api-access-g775d" (OuterVolumeSpecName: "kube-api-access-g775d") pod "8664af8f-0cf2-4ef8-a701-adbaba058240" (UID: "8664af8f-0cf2-4ef8-a701-adbaba058240"). InnerVolumeSpecName "kube-api-access-g775d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.843255 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g775d\" (UniqueName: \"kubernetes.io/projected/8664af8f-0cf2-4ef8-a701-adbaba058240-kube-api-access-g775d\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:17 crc kubenswrapper[4885]: I0308 21:04:17.843285 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8664af8f-0cf2-4ef8-a701-adbaba058240-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:18 crc kubenswrapper[4885]: I0308 21:04:18.214455 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-txw9w" event={"ID":"14dd4829-951f-4e19-885f-f466dcbf9d1b","Type":"ContainerDied","Data":"bb2db1da079995208ecccaa8eccf7097099e715606eb94fc33b2ea05201ef687"} Mar 08 21:04:18 crc kubenswrapper[4885]: I0308 21:04:18.214530 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb2db1da079995208ecccaa8eccf7097099e715606eb94fc33b2ea05201ef687" Mar 08 21:04:18 crc kubenswrapper[4885]: I0308 21:04:18.215112 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-txw9w" Mar 08 21:04:18 crc kubenswrapper[4885]: I0308 21:04:18.217861 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5511-account-create-update-fvjhm" event={"ID":"8664af8f-0cf2-4ef8-a701-adbaba058240","Type":"ContainerDied","Data":"3fa287746faf0fe91da972d4c41143470d41b6c0871ff490ed2e7825740674d4"} Mar 08 21:04:18 crc kubenswrapper[4885]: I0308 21:04:18.217951 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fa287746faf0fe91da972d4c41143470d41b6c0871ff490ed2e7825740674d4" Mar 08 21:04:18 crc kubenswrapper[4885]: I0308 21:04:18.217990 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5511-account-create-update-fvjhm" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.498816 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67dd79dd4c-22rq2"] Mar 08 21:04:19 crc kubenswrapper[4885]: E0308 21:04:19.499455 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8664af8f-0cf2-4ef8-a701-adbaba058240" containerName="mariadb-account-create-update" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.499468 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8664af8f-0cf2-4ef8-a701-adbaba058240" containerName="mariadb-account-create-update" Mar 08 21:04:19 crc kubenswrapper[4885]: E0308 21:04:19.499491 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14dd4829-951f-4e19-885f-f466dcbf9d1b" containerName="mariadb-database-create" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.499497 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="14dd4829-951f-4e19-885f-f466dcbf9d1b" containerName="mariadb-database-create" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.499699 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="14dd4829-951f-4e19-885f-f466dcbf9d1b" containerName="mariadb-database-create" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.499721 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8664af8f-0cf2-4ef8-a701-adbaba058240" containerName="mariadb-account-create-update" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.500644 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.505126 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-q8g6n"] Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.506294 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.509696 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rbkt5" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.511283 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.512556 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.526831 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-q8g6n"] Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.534426 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dd79dd4c-22rq2"] Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685059 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-nb\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685317 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-config\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685385 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qffxd\" (UniqueName: \"kubernetes.io/projected/1aad146d-597d-436f-ba72-59a57f223ad0-kube-api-access-qffxd\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685470 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-combined-ca-bundle\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685511 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-dns-svc\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685541 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-sb\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685570 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1aad146d-597d-436f-ba72-59a57f223ad0-logs\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685608 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dkcc\" (UniqueName: \"kubernetes.io/projected/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-kube-api-access-7dkcc\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685668 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-config-data\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.685721 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-scripts\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.786731 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-config\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.786804 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qffxd\" (UniqueName: \"kubernetes.io/projected/1aad146d-597d-436f-ba72-59a57f223ad0-kube-api-access-qffxd\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.786832 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-combined-ca-bundle\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.786853 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-dns-svc\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.786877 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-sb\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.786895 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aad146d-597d-436f-ba72-59a57f223ad0-logs\") pod 
\"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.787059 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dkcc\" (UniqueName: \"kubernetes.io/projected/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-kube-api-access-7dkcc\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.787102 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-config-data\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.787121 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-scripts\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.787149 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-nb\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.787866 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-config\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.787882 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-dns-svc\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.787939 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-sb\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.788121 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-nb\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.788449 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aad146d-597d-436f-ba72-59a57f223ad0-logs\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.792383 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-scripts\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.797459 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-combined-ca-bundle\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.797626 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-config-data\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.810519 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qffxd\" (UniqueName: \"kubernetes.io/projected/1aad146d-597d-436f-ba72-59a57f223ad0-kube-api-access-qffxd\") pod \"placement-db-sync-q8g6n\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.818912 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dkcc\" (UniqueName: \"kubernetes.io/projected/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-kube-api-access-7dkcc\") pod \"dnsmasq-dns-67dd79dd4c-22rq2\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.824704 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:19 crc kubenswrapper[4885]: I0308 21:04:19.834804 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:20 crc kubenswrapper[4885]: I0308 21:04:20.308937 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-q8g6n"] Mar 08 21:04:20 crc kubenswrapper[4885]: W0308 21:04:20.392390 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda18bb9fd_f7a3_4935_9d89_26654d7e08c5.slice/crio-b86733ef92f0d474db415f2ffb90c760fd1a463fdc711ed1f41f539df4b11b99 WatchSource:0}: Error finding container b86733ef92f0d474db415f2ffb90c760fd1a463fdc711ed1f41f539df4b11b99: Status 404 returned error can't find the container with id b86733ef92f0d474db415f2ffb90c760fd1a463fdc711ed1f41f539df4b11b99 Mar 08 21:04:20 crc kubenswrapper[4885]: I0308 21:04:20.398977 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67dd79dd4c-22rq2"] Mar 08 21:04:21 crc kubenswrapper[4885]: I0308 21:04:21.248994 4885 generic.go:334] "Generic (PLEG): container finished" podID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerID="59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3" exitCode=0 Mar 08 21:04:21 crc kubenswrapper[4885]: I0308 21:04:21.249257 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" event={"ID":"a18bb9fd-f7a3-4935-9d89-26654d7e08c5","Type":"ContainerDied","Data":"59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3"} Mar 08 21:04:21 crc kubenswrapper[4885]: I0308 21:04:21.249469 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" event={"ID":"a18bb9fd-f7a3-4935-9d89-26654d7e08c5","Type":"ContainerStarted","Data":"b86733ef92f0d474db415f2ffb90c760fd1a463fdc711ed1f41f539df4b11b99"} Mar 08 21:04:21 crc kubenswrapper[4885]: I0308 21:04:21.255301 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q8g6n" event={"ID":"1aad146d-597d-436f-ba72-59a57f223ad0","Type":"ContainerStarted","Data":"ce1aec0cb989ce899ff18178129c96b0d95ae41f48f578bbb62ca6f679d83d8f"} Mar 08 21:04:21 crc kubenswrapper[4885]: I0308 21:04:21.255363 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q8g6n" event={"ID":"1aad146d-597d-436f-ba72-59a57f223ad0","Type":"ContainerStarted","Data":"e032ad73bf79a84291b02435a87a64f950ed21417a90d6c779f230809c445017"} Mar 08 21:04:21 crc kubenswrapper[4885]: I0308 21:04:21.301504 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-q8g6n" podStartSLOduration=2.301483842 podStartE2EDuration="2.301483842s" podCreationTimestamp="2026-03-08 21:04:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:04:21.294078184 +0000 UTC m=+5562.690132217" watchObservedRunningTime="2026-03-08 21:04:21.301483842 +0000 UTC m=+5562.697537875" Mar 08 21:04:22 crc kubenswrapper[4885]: I0308 21:04:22.266661 4885 generic.go:334] "Generic (PLEG): container finished" podID="1aad146d-597d-436f-ba72-59a57f223ad0" containerID="ce1aec0cb989ce899ff18178129c96b0d95ae41f48f578bbb62ca6f679d83d8f" exitCode=0 Mar 08 21:04:22 crc kubenswrapper[4885]: I0308 21:04:22.266773 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q8g6n" 
event={"ID":"1aad146d-597d-436f-ba72-59a57f223ad0","Type":"ContainerDied","Data":"ce1aec0cb989ce899ff18178129c96b0d95ae41f48f578bbb62ca6f679d83d8f"} Mar 08 21:04:22 crc kubenswrapper[4885]: I0308 21:04:22.270025 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" event={"ID":"a18bb9fd-f7a3-4935-9d89-26654d7e08c5","Type":"ContainerStarted","Data":"4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8"} Mar 08 21:04:22 crc kubenswrapper[4885]: I0308 21:04:22.270431 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:22 crc kubenswrapper[4885]: I0308 21:04:22.315774 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" podStartSLOduration=3.315747106 podStartE2EDuration="3.315747106s" podCreationTimestamp="2026-03-08 21:04:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:04:22.30843788 +0000 UTC m=+5563.704491943" watchObservedRunningTime="2026-03-08 21:04:22.315747106 +0000 UTC m=+5563.711801159" Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.697784 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.867798 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aad146d-597d-436f-ba72-59a57f223ad0-logs\") pod \"1aad146d-597d-436f-ba72-59a57f223ad0\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.867863 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-scripts\") pod \"1aad146d-597d-436f-ba72-59a57f223ad0\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.867933 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-combined-ca-bundle\") pod \"1aad146d-597d-436f-ba72-59a57f223ad0\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.868004 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-config-data\") pod \"1aad146d-597d-436f-ba72-59a57f223ad0\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.868035 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qffxd\" (UniqueName: \"kubernetes.io/projected/1aad146d-597d-436f-ba72-59a57f223ad0-kube-api-access-qffxd\") pod \"1aad146d-597d-436f-ba72-59a57f223ad0\" (UID: \"1aad146d-597d-436f-ba72-59a57f223ad0\") " Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.870015 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aad146d-597d-436f-ba72-59a57f223ad0-logs" (OuterVolumeSpecName: "logs") pod "1aad146d-597d-436f-ba72-59a57f223ad0" (UID: "1aad146d-597d-436f-ba72-59a57f223ad0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.876680 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aad146d-597d-436f-ba72-59a57f223ad0-kube-api-access-qffxd" (OuterVolumeSpecName: "kube-api-access-qffxd") pod "1aad146d-597d-436f-ba72-59a57f223ad0" (UID: "1aad146d-597d-436f-ba72-59a57f223ad0"). InnerVolumeSpecName "kube-api-access-qffxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.893049 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-scripts" (OuterVolumeSpecName: "scripts") pod "1aad146d-597d-436f-ba72-59a57f223ad0" (UID: "1aad146d-597d-436f-ba72-59a57f223ad0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.900795 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-config-data" (OuterVolumeSpecName: "config-data") pod "1aad146d-597d-436f-ba72-59a57f223ad0" (UID: "1aad146d-597d-436f-ba72-59a57f223ad0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.912068 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1aad146d-597d-436f-ba72-59a57f223ad0" (UID: "1aad146d-597d-436f-ba72-59a57f223ad0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.971391 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aad146d-597d-436f-ba72-59a57f223ad0-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.971609 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.971734 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.971853 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aad146d-597d-436f-ba72-59a57f223ad0-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:23 crc kubenswrapper[4885]: I0308 21:04:23.972059 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qffxd\" (UniqueName: \"kubernetes.io/projected/1aad146d-597d-436f-ba72-59a57f223ad0-kube-api-access-qffxd\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.295121 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q8g6n" event={"ID":"1aad146d-597d-436f-ba72-59a57f223ad0","Type":"ContainerDied","Data":"e032ad73bf79a84291b02435a87a64f950ed21417a90d6c779f230809c445017"} Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.295185 4885 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="e032ad73bf79a84291b02435a87a64f950ed21417a90d6c779f230809c445017" Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.295265 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-q8g6n" Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.903747 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5bc985567d-hcdbz"] Mar 08 21:04:24 crc kubenswrapper[4885]: E0308 21:04:24.904332 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aad146d-597d-436f-ba72-59a57f223ad0" containerName="placement-db-sync" Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.904355 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aad146d-597d-436f-ba72-59a57f223ad0" containerName="placement-db-sync" Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.904648 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aad146d-597d-436f-ba72-59a57f223ad0" containerName="placement-db-sync" Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.906186 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.909094 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.910418 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.912279 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-rbkt5" Mar 08 21:04:24 crc kubenswrapper[4885]: I0308 21:04:24.919598 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bc985567d-hcdbz"] Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.093887 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rptk6\" (UniqueName: \"kubernetes.io/projected/0741bee5-7932-4af4-a8c1-1e56b754e359-kube-api-access-rptk6\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.094075 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-scripts\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.094253 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-combined-ca-bundle\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.094324 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0741bee5-7932-4af4-a8c1-1e56b754e359-logs\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 
21:04:25.094519 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-config-data\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.196343 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-scripts\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.196444 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-combined-ca-bundle\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.196522 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0741bee5-7932-4af4-a8c1-1e56b754e359-logs\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.196574 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-config-data\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.196631 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rptk6\" (UniqueName: \"kubernetes.io/projected/0741bee5-7932-4af4-a8c1-1e56b754e359-kube-api-access-rptk6\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.197031 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0741bee5-7932-4af4-a8c1-1e56b754e359-logs\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.200842 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-scripts\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.202593 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-config-data\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.203153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0741bee5-7932-4af4-a8c1-1e56b754e359-combined-ca-bundle\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.219176 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rptk6\" (UniqueName: \"kubernetes.io/projected/0741bee5-7932-4af4-a8c1-1e56b754e359-kube-api-access-rptk6\") pod \"placement-5bc985567d-hcdbz\" (UID: \"0741bee5-7932-4af4-a8c1-1e56b754e359\") " pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.251470 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:25 crc kubenswrapper[4885]: I0308 21:04:25.716744 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5bc985567d-hcdbz"] Mar 08 21:04:25 crc kubenswrapper[4885]: W0308 21:04:25.718295 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0741bee5_7932_4af4_a8c1_1e56b754e359.slice/crio-eb57921c96eb72e57a87008b629ed4c389f542d5e6b70bb62d48c50a5b6a885d WatchSource:0}: Error finding container eb57921c96eb72e57a87008b629ed4c389f542d5e6b70bb62d48c50a5b6a885d: Status 404 returned error can't find the container with id eb57921c96eb72e57a87008b629ed4c389f542d5e6b70bb62d48c50a5b6a885d Mar 08 21:04:26 crc kubenswrapper[4885]: I0308 21:04:26.316081 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bc985567d-hcdbz" event={"ID":"0741bee5-7932-4af4-a8c1-1e56b754e359","Type":"ContainerStarted","Data":"42dac47964c4a24ba20aabbedfa225db530aeffae1a720e47c3e106ce647762f"} Mar 08 21:04:26 crc kubenswrapper[4885]: I0308 21:04:26.316158 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bc985567d-hcdbz" event={"ID":"0741bee5-7932-4af4-a8c1-1e56b754e359","Type":"ContainerStarted","Data":"f4f9f28f407cb131f6b8101d3af1ec3565049a3a9de862df15a7783034d3e20a"} Mar 08 21:04:26 crc kubenswrapper[4885]: I0308 21:04:26.316185 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5bc985567d-hcdbz" event={"ID":"0741bee5-7932-4af4-a8c1-1e56b754e359","Type":"ContainerStarted","Data":"eb57921c96eb72e57a87008b629ed4c389f542d5e6b70bb62d48c50a5b6a885d"} Mar 08 21:04:26 crc kubenswrapper[4885]: I0308 21:04:26.316712 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:26 crc kubenswrapper[4885]: I0308 21:04:26.316796 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:26 crc kubenswrapper[4885]: I0308 21:04:26.361801 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5bc985567d-hcdbz" podStartSLOduration=2.361772726 podStartE2EDuration="2.361772726s" podCreationTimestamp="2026-03-08 21:04:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:04:26.341979248 +0000 UTC m=+5567.738033311" watchObservedRunningTime="2026-03-08 21:04:26.361772726 +0000 UTC m=+5567.757826779" Mar 08 21:04:28 crc kubenswrapper[4885]: I0308 21:04:28.368192 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:04:28 crc 
kubenswrapper[4885]: E0308 21:04:28.370734 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:04:29 crc kubenswrapper[4885]: I0308 21:04:29.826386 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:04:29 crc kubenswrapper[4885]: I0308 21:04:29.929518 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54fd489df-k9th6"] Mar 08 21:04:29 crc kubenswrapper[4885]: I0308 21:04:29.929800 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54fd489df-k9th6" podUID="a16f7c8b-b930-4591-a967-9db46c52391c" containerName="dnsmasq-dns" containerID="cri-o://7dd39daad27eae124834c42bda6676c4988f5e52cc85f87170d8845fcdc1c6e4" gracePeriod=10 Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.359713 4885 generic.go:334] "Generic (PLEG): container finished" podID="a16f7c8b-b930-4591-a967-9db46c52391c" containerID="7dd39daad27eae124834c42bda6676c4988f5e52cc85f87170d8845fcdc1c6e4" exitCode=0 Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.360023 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54fd489df-k9th6" event={"ID":"a16f7c8b-b930-4591-a967-9db46c52391c","Type":"ContainerDied","Data":"7dd39daad27eae124834c42bda6676c4988f5e52cc85f87170d8845fcdc1c6e4"} Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.360047 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54fd489df-k9th6" event={"ID":"a16f7c8b-b930-4591-a967-9db46c52391c","Type":"ContainerDied","Data":"1a913a93c16bbf12b9927ddd796c54de956ba88ff05aa50f518989cc7d07d3d0"} Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.360058 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a913a93c16bbf12b9927ddd796c54de956ba88ff05aa50f518989cc7d07d3d0" Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.395406 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.517556 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-sb\") pod \"a16f7c8b-b930-4591-a967-9db46c52391c\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.517635 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-nb\") pod \"a16f7c8b-b930-4591-a967-9db46c52391c\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.517742 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-config\") pod \"a16f7c8b-b930-4591-a967-9db46c52391c\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.517825 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-276lt\" (UniqueName: \"kubernetes.io/projected/a16f7c8b-b930-4591-a967-9db46c52391c-kube-api-access-276lt\") pod \"a16f7c8b-b930-4591-a967-9db46c52391c\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.517898 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-dns-svc\") pod \"a16f7c8b-b930-4591-a967-9db46c52391c\" (UID: \"a16f7c8b-b930-4591-a967-9db46c52391c\") " Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.565287 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a16f7c8b-b930-4591-a967-9db46c52391c-kube-api-access-276lt" (OuterVolumeSpecName: "kube-api-access-276lt") pod "a16f7c8b-b930-4591-a967-9db46c52391c" (UID: "a16f7c8b-b930-4591-a967-9db46c52391c"). InnerVolumeSpecName "kube-api-access-276lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.592166 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a16f7c8b-b930-4591-a967-9db46c52391c" (UID: "a16f7c8b-b930-4591-a967-9db46c52391c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.621005 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.621038 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-276lt\" (UniqueName: \"kubernetes.io/projected/a16f7c8b-b930-4591-a967-9db46c52391c-kube-api-access-276lt\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.634517 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a16f7c8b-b930-4591-a967-9db46c52391c" (UID: "a16f7c8b-b930-4591-a967-9db46c52391c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.644423 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a16f7c8b-b930-4591-a967-9db46c52391c" (UID: "a16f7c8b-b930-4591-a967-9db46c52391c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.656430 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-config" (OuterVolumeSpecName: "config") pod "a16f7c8b-b930-4591-a967-9db46c52391c" (UID: "a16f7c8b-b930-4591-a967-9db46c52391c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.722124 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.722155 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:30 crc kubenswrapper[4885]: I0308 21:04:30.722166 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a16f7c8b-b930-4591-a967-9db46c52391c-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:04:31 crc kubenswrapper[4885]: I0308 21:04:31.369637 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54fd489df-k9th6" Mar 08 21:04:31 crc kubenswrapper[4885]: I0308 21:04:31.432687 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54fd489df-k9th6"] Mar 08 21:04:31 crc kubenswrapper[4885]: I0308 21:04:31.444090 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54fd489df-k9th6"] Mar 08 21:04:33 crc kubenswrapper[4885]: I0308 21:04:33.387370 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a16f7c8b-b930-4591-a967-9db46c52391c" path="/var/lib/kubelet/pods/a16f7c8b-b930-4591-a967-9db46c52391c/volumes" Mar 08 21:04:43 crc kubenswrapper[4885]: I0308 21:04:43.369183 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:04:43 crc kubenswrapper[4885]: E0308 21:04:43.372197 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:04:56 crc kubenswrapper[4885]: I0308 21:04:56.284504 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:56 crc kubenswrapper[4885]: I0308 21:04:56.298796 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5bc985567d-hcdbz" Mar 08 21:04:56 crc kubenswrapper[4885]: I0308 21:04:56.369026 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:04:56 crc kubenswrapper[4885]: E0308 21:04:56.369513 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:05:05 crc kubenswrapper[4885]: I0308 21:05:05.142494 4885 scope.go:117] "RemoveContainer" containerID="7d907d9b611e54cf97509bd0480731a1868d207218e0f55feb360b3b591d95c2" Mar 08 21:05:05 crc kubenswrapper[4885]: I0308 21:05:05.175657 4885 scope.go:117] "RemoveContainer" containerID="21b8174ae95621e7d89055b3e4716d5a83a8f7fb3dd103300c6b0dc26e415bb4" Mar 08 21:05:11 crc kubenswrapper[4885]: I0308 21:05:11.369047 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:05:11 crc kubenswrapper[4885]: E0308 21:05:11.370087 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.890766 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-qgblt"] Mar 08 21:05:20 
crc kubenswrapper[4885]: E0308 21:05:20.891611 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16f7c8b-b930-4591-a967-9db46c52391c" containerName="dnsmasq-dns" Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.891626 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16f7c8b-b930-4591-a967-9db46c52391c" containerName="dnsmasq-dns" Mar 08 21:05:20 crc kubenswrapper[4885]: E0308 21:05:20.891657 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16f7c8b-b930-4591-a967-9db46c52391c" containerName="init" Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.891665 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16f7c8b-b930-4591-a967-9db46c52391c" containerName="init" Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.891874 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a16f7c8b-b930-4591-a967-9db46c52391c" containerName="dnsmasq-dns" Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.892503 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qgblt" Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.899733 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qgblt"] Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.964150 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a6e4793-0be0-4d9f-b96a-c8877648415e-operator-scripts\") pod \"nova-api-db-create-qgblt\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " pod="openstack/nova-api-db-create-qgblt" Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.964426 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqk66\" (UniqueName: \"kubernetes.io/projected/9a6e4793-0be0-4d9f-b96a-c8877648415e-kube-api-access-kqk66\") pod \"nova-api-db-create-qgblt\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " pod="openstack/nova-api-db-create-qgblt" Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.972625 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-58ntc"] Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.973641 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-58ntc" Mar 08 21:05:20 crc kubenswrapper[4885]: I0308 21:05:20.996537 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-58ntc"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.073770 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-cd23-account-create-update-5qvdh"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.074818 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.075952 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a6e4793-0be0-4d9f-b96a-c8877648415e-operator-scripts\") pod \"nova-api-db-create-qgblt\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " pod="openstack/nova-api-db-create-qgblt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.076010 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869febc8-e7d9-4723-bc87-567e08849a27-operator-scripts\") pod \"nova-cell0-db-create-58ntc\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " pod="openstack/nova-cell0-db-create-58ntc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.076044 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqk66\" (UniqueName: \"kubernetes.io/projected/9a6e4793-0be0-4d9f-b96a-c8877648415e-kube-api-access-kqk66\") pod \"nova-api-db-create-qgblt\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " pod="openstack/nova-api-db-create-qgblt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.076151 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56hkd\" (UniqueName: \"kubernetes.io/projected/869febc8-e7d9-4723-bc87-567e08849a27-kube-api-access-56hkd\") pod \"nova-cell0-db-create-58ntc\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " pod="openstack/nova-cell0-db-create-58ntc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.076658 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.076804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a6e4793-0be0-4d9f-b96a-c8877648415e-operator-scripts\") pod \"nova-api-db-create-qgblt\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " pod="openstack/nova-api-db-create-qgblt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.084138 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nk6qt"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.085400 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nk6qt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.093028 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cd23-account-create-update-5qvdh"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.103518 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nk6qt"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.105434 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqk66\" (UniqueName: \"kubernetes.io/projected/9a6e4793-0be0-4d9f-b96a-c8877648415e-kube-api-access-kqk66\") pod \"nova-api-db-create-qgblt\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " pod="openstack/nova-api-db-create-qgblt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.178082 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-operator-scripts\") pod \"nova-api-cd23-account-create-update-5qvdh\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.178161 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hxp2\" (UniqueName: \"kubernetes.io/projected/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-kube-api-access-6hxp2\") pod \"nova-api-cd23-account-create-update-5qvdh\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.178246 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-operator-scripts\") pod \"nova-cell1-db-create-nk6qt\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " pod="openstack/nova-cell1-db-create-nk6qt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.178276 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56hkd\" (UniqueName: \"kubernetes.io/projected/869febc8-e7d9-4723-bc87-567e08849a27-kube-api-access-56hkd\") pod \"nova-cell0-db-create-58ntc\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " pod="openstack/nova-cell0-db-create-58ntc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.178335 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869febc8-e7d9-4723-bc87-567e08849a27-operator-scripts\") pod \"nova-cell0-db-create-58ntc\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " pod="openstack/nova-cell0-db-create-58ntc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.178422 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwxt2\" (UniqueName: \"kubernetes.io/projected/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-kube-api-access-bwxt2\") pod \"nova-cell1-db-create-nk6qt\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " pod="openstack/nova-cell1-db-create-nk6qt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.178971 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869febc8-e7d9-4723-bc87-567e08849a27-operator-scripts\") pod 
\"nova-cell0-db-create-58ntc\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " pod="openstack/nova-cell0-db-create-58ntc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.193985 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56hkd\" (UniqueName: \"kubernetes.io/projected/869febc8-e7d9-4723-bc87-567e08849a27-kube-api-access-56hkd\") pod \"nova-cell0-db-create-58ntc\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " pod="openstack/nova-cell0-db-create-58ntc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.245026 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qgblt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.278474 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-eab3-account-create-update-8r4nd"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.279856 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.281400 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-operator-scripts\") pod \"nova-cell1-db-create-nk6qt\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " pod="openstack/nova-cell1-db-create-nk6qt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.281490 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwxt2\" (UniqueName: \"kubernetes.io/projected/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-kube-api-access-bwxt2\") pod \"nova-cell1-db-create-nk6qt\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " pod="openstack/nova-cell1-db-create-nk6qt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.281547 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-operator-scripts\") pod \"nova-api-cd23-account-create-update-5qvdh\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.281606 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hxp2\" (UniqueName: \"kubernetes.io/projected/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-kube-api-access-6hxp2\") pod \"nova-api-cd23-account-create-update-5qvdh\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.282722 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-operator-scripts\") pod \"nova-cell1-db-create-nk6qt\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " pod="openstack/nova-cell1-db-create-nk6qt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.283502 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-operator-scripts\") pod \"nova-api-cd23-account-create-update-5qvdh\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 
21:05:21.283891 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.289684 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-eab3-account-create-update-8r4nd"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.293616 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-58ntc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.299202 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwxt2\" (UniqueName: \"kubernetes.io/projected/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-kube-api-access-bwxt2\") pod \"nova-cell1-db-create-nk6qt\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " pod="openstack/nova-cell1-db-create-nk6qt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.319689 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hxp2\" (UniqueName: \"kubernetes.io/projected/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-kube-api-access-6hxp2\") pod \"nova-api-cd23-account-create-update-5qvdh\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.384769 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdsk7\" (UniqueName: \"kubernetes.io/projected/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-kube-api-access-wdsk7\") pod \"nova-cell0-eab3-account-create-update-8r4nd\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.384993 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-operator-scripts\") pod \"nova-cell0-eab3-account-create-update-8r4nd\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.396245 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-9f6d-account-create-update-jn2lc"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.401883 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.417040 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9f6d-account-create-update-jn2lc"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.417289 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.421471 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.444754 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nk6qt" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.486815 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-operator-scripts\") pod \"nova-cell0-eab3-account-create-update-8r4nd\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.486914 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v89t\" (UniqueName: \"kubernetes.io/projected/636cf333-497f-4fcf-9d2d-ebfe48c81d75-kube-api-access-7v89t\") pod \"nova-cell1-9f6d-account-create-update-jn2lc\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.486988 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/636cf333-497f-4fcf-9d2d-ebfe48c81d75-operator-scripts\") pod \"nova-cell1-9f6d-account-create-update-jn2lc\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.487034 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdsk7\" (UniqueName: \"kubernetes.io/projected/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-kube-api-access-wdsk7\") pod \"nova-cell0-eab3-account-create-update-8r4nd\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.487504 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-operator-scripts\") pod \"nova-cell0-eab3-account-create-update-8r4nd\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.508026 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdsk7\" (UniqueName: \"kubernetes.io/projected/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-kube-api-access-wdsk7\") pod \"nova-cell0-eab3-account-create-update-8r4nd\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.589142 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v89t\" (UniqueName: \"kubernetes.io/projected/636cf333-497f-4fcf-9d2d-ebfe48c81d75-kube-api-access-7v89t\") pod \"nova-cell1-9f6d-account-create-update-jn2lc\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.589194 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/636cf333-497f-4fcf-9d2d-ebfe48c81d75-operator-scripts\") pod \"nova-cell1-9f6d-account-create-update-jn2lc\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: 
I0308 21:05:21.589896 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/636cf333-497f-4fcf-9d2d-ebfe48c81d75-operator-scripts\") pod \"nova-cell1-9f6d-account-create-update-jn2lc\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.607022 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v89t\" (UniqueName: \"kubernetes.io/projected/636cf333-497f-4fcf-9d2d-ebfe48c81d75-kube-api-access-7v89t\") pod \"nova-cell1-9f6d-account-create-update-jn2lc\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.713687 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qgblt"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.722953 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.748272 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.863597 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-58ntc"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.940766 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cd23-account-create-update-5qvdh"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.984346 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nk6qt"] Mar 08 21:05:21 crc kubenswrapper[4885]: I0308 21:05:21.996510 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-eab3-account-create-update-8r4nd"] Mar 08 21:05:21 crc kubenswrapper[4885]: W0308 21:05:21.997403 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbafc9a8b_2cbe_465d_8055_e6c2675b80a4.slice/crio-bce021303610d9138b7d5fffc427ae7f62f1891b34d48aef2d114c78af353642 WatchSource:0}: Error finding container bce021303610d9138b7d5fffc427ae7f62f1891b34d48aef2d114c78af353642: Status 404 returned error can't find the container with id bce021303610d9138b7d5fffc427ae7f62f1891b34d48aef2d114c78af353642 Mar 08 21:05:22 crc kubenswrapper[4885]: I0308 21:05:22.047549 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nk6qt" event={"ID":"bafc9a8b-2cbe-465d-8055-e6c2675b80a4","Type":"ContainerStarted","Data":"bce021303610d9138b7d5fffc427ae7f62f1891b34d48aef2d114c78af353642"} Mar 08 21:05:22 crc kubenswrapper[4885]: I0308 21:05:22.060204 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cd23-account-create-update-5qvdh" event={"ID":"64ea00b6-97bd-459b-ad43-bbfc5862cc4c","Type":"ContainerStarted","Data":"66d955d9a60f35aac16847ce69e422cd51ab8223508959adf034f2e8417adab2"} Mar 08 21:05:22 crc kubenswrapper[4885]: I0308 21:05:22.061670 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" event={"ID":"b7b81f14-560e-4a64-88c7-164fbb0b4f8b","Type":"ContainerStarted","Data":"c6714401753337bb32708fcc881bfc6ecb5b029fe410a588e80ff9ca0fd71fc2"} Mar 
08 21:05:22 crc kubenswrapper[4885]: I0308 21:05:22.063052 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-58ntc" event={"ID":"869febc8-e7d9-4723-bc87-567e08849a27","Type":"ContainerStarted","Data":"a24514f8689fd17b2e912c8026a4da02c2996c25c6e1ecc2bda139d4ad6a64d4"} Mar 08 21:05:22 crc kubenswrapper[4885]: I0308 21:05:22.064057 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qgblt" event={"ID":"9a6e4793-0be0-4d9f-b96a-c8877648415e","Type":"ContainerStarted","Data":"293fcdf5f1f3770069df599650a0ea581f09c4e28effba9f99eb6879ddb6a2a4"} Mar 08 21:05:22 crc kubenswrapper[4885]: I0308 21:05:22.064075 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qgblt" event={"ID":"9a6e4793-0be0-4d9f-b96a-c8877648415e","Type":"ContainerStarted","Data":"7c6b8a131d6cbfb22fe18be558dad0ece897fdb49c10f467670f61287fccfca6"} Mar 08 21:05:22 crc kubenswrapper[4885]: I0308 21:05:22.273995 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-qgblt" podStartSLOduration=2.2739789 podStartE2EDuration="2.2739789s" podCreationTimestamp="2026-03-08 21:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:22.080729246 +0000 UTC m=+5623.476783469" watchObservedRunningTime="2026-03-08 21:05:22.2739789 +0000 UTC m=+5623.670032923" Mar 08 21:05:22 crc kubenswrapper[4885]: I0308 21:05:22.280085 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-9f6d-account-create-update-jn2lc"] Mar 08 21:05:22 crc kubenswrapper[4885]: W0308 21:05:22.302812 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod636cf333_497f_4fcf_9d2d_ebfe48c81d75.slice/crio-10db64beeab162b765a2f5ea0faae7675a89e8d662c56dd5f2bab74ca4a60dc3 WatchSource:0}: Error finding container 10db64beeab162b765a2f5ea0faae7675a89e8d662c56dd5f2bab74ca4a60dc3: Status 404 returned error can't find the container with id 10db64beeab162b765a2f5ea0faae7675a89e8d662c56dd5f2bab74ca4a60dc3 Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.071577 4885 generic.go:334] "Generic (PLEG): container finished" podID="bafc9a8b-2cbe-465d-8055-e6c2675b80a4" containerID="08cb7392dd836d2cf5e583b01bad8a88a737b02245c6ec9a4a8e07b52e00a8cf" exitCode=0 Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.071624 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nk6qt" event={"ID":"bafc9a8b-2cbe-465d-8055-e6c2675b80a4","Type":"ContainerDied","Data":"08cb7392dd836d2cf5e583b01bad8a88a737b02245c6ec9a4a8e07b52e00a8cf"} Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.073676 4885 generic.go:334] "Generic (PLEG): container finished" podID="64ea00b6-97bd-459b-ad43-bbfc5862cc4c" containerID="3f3b93600a59d7fdfedddb2e79ea7fb7eee2ed381b6d60e917ab50e93241509a" exitCode=0 Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.073749 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cd23-account-create-update-5qvdh" event={"ID":"64ea00b6-97bd-459b-ad43-bbfc5862cc4c","Type":"ContainerDied","Data":"3f3b93600a59d7fdfedddb2e79ea7fb7eee2ed381b6d60e917ab50e93241509a"} Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.075351 4885 generic.go:334] "Generic (PLEG): container finished" podID="b7b81f14-560e-4a64-88c7-164fbb0b4f8b" 
containerID="2a5e8c0d61eedd0069d39190cdfa7686da395e0e45e1d4b7133ef0d8e637e513" exitCode=0 Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.075397 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" event={"ID":"b7b81f14-560e-4a64-88c7-164fbb0b4f8b","Type":"ContainerDied","Data":"2a5e8c0d61eedd0069d39190cdfa7686da395e0e45e1d4b7133ef0d8e637e513"} Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.076664 4885 generic.go:334] "Generic (PLEG): container finished" podID="636cf333-497f-4fcf-9d2d-ebfe48c81d75" containerID="085db1d51848063091ed8cc366e74589bc9b1a67399db7aae932f752c5c7bcca" exitCode=0 Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.076696 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" event={"ID":"636cf333-497f-4fcf-9d2d-ebfe48c81d75","Type":"ContainerDied","Data":"085db1d51848063091ed8cc366e74589bc9b1a67399db7aae932f752c5c7bcca"} Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.076725 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" event={"ID":"636cf333-497f-4fcf-9d2d-ebfe48c81d75","Type":"ContainerStarted","Data":"10db64beeab162b765a2f5ea0faae7675a89e8d662c56dd5f2bab74ca4a60dc3"} Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.077885 4885 generic.go:334] "Generic (PLEG): container finished" podID="869febc8-e7d9-4723-bc87-567e08849a27" containerID="bdd2c701bb858773f060623b06a914478bf58cb8470912a63df694c3493b2a12" exitCode=0 Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.077927 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-58ntc" event={"ID":"869febc8-e7d9-4723-bc87-567e08849a27","Type":"ContainerDied","Data":"bdd2c701bb858773f060623b06a914478bf58cb8470912a63df694c3493b2a12"} Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.079187 4885 generic.go:334] "Generic (PLEG): container finished" podID="9a6e4793-0be0-4d9f-b96a-c8877648415e" containerID="293fcdf5f1f3770069df599650a0ea581f09c4e28effba9f99eb6879ddb6a2a4" exitCode=0 Mar 08 21:05:23 crc kubenswrapper[4885]: I0308 21:05:23.079224 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qgblt" event={"ID":"9a6e4793-0be0-4d9f-b96a-c8877648415e","Type":"ContainerDied","Data":"293fcdf5f1f3770069df599650a0ea581f09c4e28effba9f99eb6879ddb6a2a4"} Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.517064 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qgblt" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.643103 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-58ntc" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.643529 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a6e4793-0be0-4d9f-b96a-c8877648415e-operator-scripts\") pod \"9a6e4793-0be0-4d9f-b96a-c8877648415e\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.643681 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqk66\" (UniqueName: \"kubernetes.io/projected/9a6e4793-0be0-4d9f-b96a-c8877648415e-kube-api-access-kqk66\") pod \"9a6e4793-0be0-4d9f-b96a-c8877648415e\" (UID: \"9a6e4793-0be0-4d9f-b96a-c8877648415e\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.644412 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a6e4793-0be0-4d9f-b96a-c8877648415e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9a6e4793-0be0-4d9f-b96a-c8877648415e" (UID: "9a6e4793-0be0-4d9f-b96a-c8877648415e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.649389 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a6e4793-0be0-4d9f-b96a-c8877648415e-kube-api-access-kqk66" (OuterVolumeSpecName: "kube-api-access-kqk66") pod "9a6e4793-0be0-4d9f-b96a-c8877648415e" (UID: "9a6e4793-0be0-4d9f-b96a-c8877648415e"). InnerVolumeSpecName "kube-api-access-kqk66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.649949 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.690854 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.698532 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.703881 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nk6qt" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.745370 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869febc8-e7d9-4723-bc87-567e08849a27-operator-scripts\") pod \"869febc8-e7d9-4723-bc87-567e08849a27\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.745463 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56hkd\" (UniqueName: \"kubernetes.io/projected/869febc8-e7d9-4723-bc87-567e08849a27-kube-api-access-56hkd\") pod \"869febc8-e7d9-4723-bc87-567e08849a27\" (UID: \"869febc8-e7d9-4723-bc87-567e08849a27\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.747182 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869febc8-e7d9-4723-bc87-567e08849a27-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "869febc8-e7d9-4723-bc87-567e08849a27" (UID: "869febc8-e7d9-4723-bc87-567e08849a27"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.748164 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/869febc8-e7d9-4723-bc87-567e08849a27-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.748196 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9a6e4793-0be0-4d9f-b96a-c8877648415e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.748211 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqk66\" (UniqueName: \"kubernetes.io/projected/9a6e4793-0be0-4d9f-b96a-c8877648415e-kube-api-access-kqk66\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.754082 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869febc8-e7d9-4723-bc87-567e08849a27-kube-api-access-56hkd" (OuterVolumeSpecName: "kube-api-access-56hkd") pod "869febc8-e7d9-4723-bc87-567e08849a27" (UID: "869febc8-e7d9-4723-bc87-567e08849a27"). InnerVolumeSpecName "kube-api-access-56hkd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.848977 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwxt2\" (UniqueName: \"kubernetes.io/projected/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-kube-api-access-bwxt2\") pod \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849095 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7v89t\" (UniqueName: \"kubernetes.io/projected/636cf333-497f-4fcf-9d2d-ebfe48c81d75-kube-api-access-7v89t\") pod \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849124 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-operator-scripts\") pod \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849163 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-operator-scripts\") pod \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\" (UID: \"bafc9a8b-2cbe-465d-8055-e6c2675b80a4\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849205 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/636cf333-497f-4fcf-9d2d-ebfe48c81d75-operator-scripts\") pod \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\" (UID: \"636cf333-497f-4fcf-9d2d-ebfe48c81d75\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849238 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdsk7\" (UniqueName: \"kubernetes.io/projected/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-kube-api-access-wdsk7\") pod \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\" (UID: \"b7b81f14-560e-4a64-88c7-164fbb0b4f8b\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849292 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-operator-scripts\") pod \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849368 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hxp2\" (UniqueName: \"kubernetes.io/projected/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-kube-api-access-6hxp2\") pod \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\" (UID: \"64ea00b6-97bd-459b-ad43-bbfc5862cc4c\") " Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849748 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56hkd\" (UniqueName: \"kubernetes.io/projected/869febc8-e7d9-4723-bc87-567e08849a27-kube-api-access-56hkd\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.849835 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b7b81f14-560e-4a64-88c7-164fbb0b4f8b" (UID: 
"b7b81f14-560e-4a64-88c7-164fbb0b4f8b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.850082 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bafc9a8b-2cbe-465d-8055-e6c2675b80a4" (UID: "bafc9a8b-2cbe-465d-8055-e6c2675b80a4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.850498 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "64ea00b6-97bd-459b-ad43-bbfc5862cc4c" (UID: "64ea00b6-97bd-459b-ad43-bbfc5862cc4c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.850858 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/636cf333-497f-4fcf-9d2d-ebfe48c81d75-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "636cf333-497f-4fcf-9d2d-ebfe48c81d75" (UID: "636cf333-497f-4fcf-9d2d-ebfe48c81d75"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.852742 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/636cf333-497f-4fcf-9d2d-ebfe48c81d75-kube-api-access-7v89t" (OuterVolumeSpecName: "kube-api-access-7v89t") pod "636cf333-497f-4fcf-9d2d-ebfe48c81d75" (UID: "636cf333-497f-4fcf-9d2d-ebfe48c81d75"). InnerVolumeSpecName "kube-api-access-7v89t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.852897 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-kube-api-access-6hxp2" (OuterVolumeSpecName: "kube-api-access-6hxp2") pod "64ea00b6-97bd-459b-ad43-bbfc5862cc4c" (UID: "64ea00b6-97bd-459b-ad43-bbfc5862cc4c"). InnerVolumeSpecName "kube-api-access-6hxp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.853238 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-kube-api-access-bwxt2" (OuterVolumeSpecName: "kube-api-access-bwxt2") pod "bafc9a8b-2cbe-465d-8055-e6c2675b80a4" (UID: "bafc9a8b-2cbe-465d-8055-e6c2675b80a4"). InnerVolumeSpecName "kube-api-access-bwxt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.854232 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-kube-api-access-wdsk7" (OuterVolumeSpecName: "kube-api-access-wdsk7") pod "b7b81f14-560e-4a64-88c7-164fbb0b4f8b" (UID: "b7b81f14-560e-4a64-88c7-164fbb0b4f8b"). InnerVolumeSpecName "kube-api-access-wdsk7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.951914 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.951972 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/636cf333-497f-4fcf-9d2d-ebfe48c81d75-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.951986 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdsk7\" (UniqueName: \"kubernetes.io/projected/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-kube-api-access-wdsk7\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.952000 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.952012 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hxp2\" (UniqueName: \"kubernetes.io/projected/64ea00b6-97bd-459b-ad43-bbfc5862cc4c-kube-api-access-6hxp2\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.952023 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwxt2\" (UniqueName: \"kubernetes.io/projected/bafc9a8b-2cbe-465d-8055-e6c2675b80a4-kube-api-access-bwxt2\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.952035 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7v89t\" (UniqueName: \"kubernetes.io/projected/636cf333-497f-4fcf-9d2d-ebfe48c81d75-kube-api-access-7v89t\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:24 crc kubenswrapper[4885]: I0308 21:05:24.952047 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b7b81f14-560e-4a64-88c7-164fbb0b4f8b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.103007 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cd23-account-create-update-5qvdh" event={"ID":"64ea00b6-97bd-459b-ad43-bbfc5862cc4c","Type":"ContainerDied","Data":"66d955d9a60f35aac16847ce69e422cd51ab8223508959adf034f2e8417adab2"} Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.103053 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66d955d9a60f35aac16847ce69e422cd51ab8223508959adf034f2e8417adab2" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.103072 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-cd23-account-create-update-5qvdh" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.115935 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" event={"ID":"b7b81f14-560e-4a64-88c7-164fbb0b4f8b","Type":"ContainerDied","Data":"c6714401753337bb32708fcc881bfc6ecb5b029fe410a588e80ff9ca0fd71fc2"} Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.115976 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-eab3-account-create-update-8r4nd" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.115980 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6714401753337bb32708fcc881bfc6ecb5b029fe410a588e80ff9ca0fd71fc2" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.121308 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" event={"ID":"636cf333-497f-4fcf-9d2d-ebfe48c81d75","Type":"ContainerDied","Data":"10db64beeab162b765a2f5ea0faae7675a89e8d662c56dd5f2bab74ca4a60dc3"} Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.121370 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10db64beeab162b765a2f5ea0faae7675a89e8d662c56dd5f2bab74ca4a60dc3" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.121459 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-9f6d-account-create-update-jn2lc" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.124349 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-58ntc" event={"ID":"869febc8-e7d9-4723-bc87-567e08849a27","Type":"ContainerDied","Data":"a24514f8689fd17b2e912c8026a4da02c2996c25c6e1ecc2bda139d4ad6a64d4"} Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.124414 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a24514f8689fd17b2e912c8026a4da02c2996c25c6e1ecc2bda139d4ad6a64d4" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.124508 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-58ntc" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.127554 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qgblt" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.128223 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qgblt" event={"ID":"9a6e4793-0be0-4d9f-b96a-c8877648415e","Type":"ContainerDied","Data":"7c6b8a131d6cbfb22fe18be558dad0ece897fdb49c10f467670f61287fccfca6"} Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.128287 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c6b8a131d6cbfb22fe18be558dad0ece897fdb49c10f467670f61287fccfca6" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.132550 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nk6qt" event={"ID":"bafc9a8b-2cbe-465d-8055-e6c2675b80a4","Type":"ContainerDied","Data":"bce021303610d9138b7d5fffc427ae7f62f1891b34d48aef2d114c78af353642"} Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.132607 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bce021303610d9138b7d5fffc427ae7f62f1891b34d48aef2d114c78af353642" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.132688 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nk6qt" Mar 08 21:05:25 crc kubenswrapper[4885]: I0308 21:05:25.369313 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:05:25 crc kubenswrapper[4885]: E0308 21:05:25.369903 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.674600 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7pj4r"] Mar 08 21:05:26 crc kubenswrapper[4885]: E0308 21:05:26.675197 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a6e4793-0be0-4d9f-b96a-c8877648415e" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675209 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a6e4793-0be0-4d9f-b96a-c8877648415e" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: E0308 21:05:26.675220 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="636cf333-497f-4fcf-9d2d-ebfe48c81d75" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675226 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="636cf333-497f-4fcf-9d2d-ebfe48c81d75" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: E0308 21:05:26.675238 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869febc8-e7d9-4723-bc87-567e08849a27" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675245 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="869febc8-e7d9-4723-bc87-567e08849a27" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: E0308 21:05:26.675253 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bafc9a8b-2cbe-465d-8055-e6c2675b80a4" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675259 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bafc9a8b-2cbe-465d-8055-e6c2675b80a4" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: E0308 21:05:26.675266 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b81f14-560e-4a64-88c7-164fbb0b4f8b" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675271 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b81f14-560e-4a64-88c7-164fbb0b4f8b" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: E0308 21:05:26.675286 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ea00b6-97bd-459b-ad43-bbfc5862cc4c" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675292 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ea00b6-97bd-459b-ad43-bbfc5862cc4c" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675452 4885 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="636cf333-497f-4fcf-9d2d-ebfe48c81d75" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675463 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="869febc8-e7d9-4723-bc87-567e08849a27" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675472 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ea00b6-97bd-459b-ad43-bbfc5862cc4c" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675486 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bafc9a8b-2cbe-465d-8055-e6c2675b80a4" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675495 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b81f14-560e-4a64-88c7-164fbb0b4f8b" containerName="mariadb-account-create-update" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.675503 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a6e4793-0be0-4d9f-b96a-c8877648415e" containerName="mariadb-database-create" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.676528 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.693706 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7pj4r"] Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.700282 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.700504 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-h5r8g" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.700656 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.788948 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2n4w\" (UniqueName: \"kubernetes.io/projected/b06fac1b-774d-4b4d-afd9-58024d9e5903-kube-api-access-f2n4w\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.789015 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.789320 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-config-data\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.789389 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-scripts\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.890583 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-scripts\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.890675 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2n4w\" (UniqueName: \"kubernetes.io/projected/b06fac1b-774d-4b4d-afd9-58024d9e5903-kube-api-access-f2n4w\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.890716 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.890799 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-config-data\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.899583 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-scripts\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.899619 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.900280 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-config-data\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:26 crc kubenswrapper[4885]: I0308 21:05:26.908994 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2n4w\" (UniqueName: \"kubernetes.io/projected/b06fac1b-774d-4b4d-afd9-58024d9e5903-kube-api-access-f2n4w\") pod \"nova-cell0-conductor-db-sync-7pj4r\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:27 crc kubenswrapper[4885]: I0308 21:05:27.013722 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:27 crc kubenswrapper[4885]: I0308 21:05:27.468571 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7pj4r"] Mar 08 21:05:28 crc kubenswrapper[4885]: I0308 21:05:28.190059 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" event={"ID":"b06fac1b-774d-4b4d-afd9-58024d9e5903","Type":"ContainerStarted","Data":"8602697feac478750bd9bf6e693b70c9e3f1df0afea0deb7c2804af9bf248c24"} Mar 08 21:05:28 crc kubenswrapper[4885]: I0308 21:05:28.190121 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" event={"ID":"b06fac1b-774d-4b4d-afd9-58024d9e5903","Type":"ContainerStarted","Data":"2167c407393fa34506566b4f13f35923bd52439b3c613576ce23830c24be4dc9"} Mar 08 21:05:33 crc kubenswrapper[4885]: I0308 21:05:33.246069 4885 generic.go:334] "Generic (PLEG): container finished" podID="b06fac1b-774d-4b4d-afd9-58024d9e5903" containerID="8602697feac478750bd9bf6e693b70c9e3f1df0afea0deb7c2804af9bf248c24" exitCode=0 Mar 08 21:05:33 crc kubenswrapper[4885]: I0308 21:05:33.246216 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" event={"ID":"b06fac1b-774d-4b4d-afd9-58024d9e5903","Type":"ContainerDied","Data":"8602697feac478750bd9bf6e693b70c9e3f1df0afea0deb7c2804af9bf248c24"} Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.661341 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.846184 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-scripts\") pod \"b06fac1b-774d-4b4d-afd9-58024d9e5903\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.846317 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2n4w\" (UniqueName: \"kubernetes.io/projected/b06fac1b-774d-4b4d-afd9-58024d9e5903-kube-api-access-f2n4w\") pod \"b06fac1b-774d-4b4d-afd9-58024d9e5903\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.846340 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-config-data\") pod \"b06fac1b-774d-4b4d-afd9-58024d9e5903\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.846406 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-combined-ca-bundle\") pod \"b06fac1b-774d-4b4d-afd9-58024d9e5903\" (UID: \"b06fac1b-774d-4b4d-afd9-58024d9e5903\") " Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.852098 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-scripts" (OuterVolumeSpecName: "scripts") pod "b06fac1b-774d-4b4d-afd9-58024d9e5903" (UID: "b06fac1b-774d-4b4d-afd9-58024d9e5903"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.857331 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b06fac1b-774d-4b4d-afd9-58024d9e5903-kube-api-access-f2n4w" (OuterVolumeSpecName: "kube-api-access-f2n4w") pod "b06fac1b-774d-4b4d-afd9-58024d9e5903" (UID: "b06fac1b-774d-4b4d-afd9-58024d9e5903"). InnerVolumeSpecName "kube-api-access-f2n4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.873442 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b06fac1b-774d-4b4d-afd9-58024d9e5903" (UID: "b06fac1b-774d-4b4d-afd9-58024d9e5903"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.881261 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-config-data" (OuterVolumeSpecName: "config-data") pod "b06fac1b-774d-4b4d-afd9-58024d9e5903" (UID: "b06fac1b-774d-4b4d-afd9-58024d9e5903"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.948870 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.948912 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2n4w\" (UniqueName: \"kubernetes.io/projected/b06fac1b-774d-4b4d-afd9-58024d9e5903-kube-api-access-f2n4w\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.948970 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:34 crc kubenswrapper[4885]: I0308 21:05:34.948993 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b06fac1b-774d-4b4d-afd9-58024d9e5903-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.271877 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" event={"ID":"b06fac1b-774d-4b4d-afd9-58024d9e5903","Type":"ContainerDied","Data":"2167c407393fa34506566b4f13f35923bd52439b3c613576ce23830c24be4dc9"} Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.271969 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2167c407393fa34506566b4f13f35923bd52439b3c613576ce23830c24be4dc9" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.272016 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-7pj4r" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.400315 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:05:35 crc kubenswrapper[4885]: E0308 21:05:35.400901 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b06fac1b-774d-4b4d-afd9-58024d9e5903" containerName="nova-cell0-conductor-db-sync" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.400957 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b06fac1b-774d-4b4d-afd9-58024d9e5903" containerName="nova-cell0-conductor-db-sync" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.401276 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b06fac1b-774d-4b4d-afd9-58024d9e5903" containerName="nova-cell0-conductor-db-sync" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.402238 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.415476 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.446092 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-h5r8g" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.446972 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.566511 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.566681 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bp6b\" (UniqueName: \"kubernetes.io/projected/1dff0b58-ac0f-4d39-9910-f924fff8f816-kube-api-access-8bp6b\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.566871 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.672485 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.673140 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bp6b\" (UniqueName: \"kubernetes.io/projected/1dff0b58-ac0f-4d39-9910-f924fff8f816-kube-api-access-8bp6b\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: 
I0308 21:05:35.673222 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.678669 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.679521 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.698007 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bp6b\" (UniqueName: \"kubernetes.io/projected/1dff0b58-ac0f-4d39-9910-f924fff8f816-kube-api-access-8bp6b\") pod \"nova-cell0-conductor-0\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:35 crc kubenswrapper[4885]: I0308 21:05:35.764997 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:36 crc kubenswrapper[4885]: I0308 21:05:36.020204 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:05:36 crc kubenswrapper[4885]: W0308 21:05:36.030215 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dff0b58_ac0f_4d39_9910_f924fff8f816.slice/crio-31d1be971bb445599e9b2b87dbc985813f03ee86921b07147d64305072f4cfbe WatchSource:0}: Error finding container 31d1be971bb445599e9b2b87dbc985813f03ee86921b07147d64305072f4cfbe: Status 404 returned error can't find the container with id 31d1be971bb445599e9b2b87dbc985813f03ee86921b07147d64305072f4cfbe Mar 08 21:05:36 crc kubenswrapper[4885]: I0308 21:05:36.285969 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1dff0b58-ac0f-4d39-9910-f924fff8f816","Type":"ContainerStarted","Data":"8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647"} Mar 08 21:05:36 crc kubenswrapper[4885]: I0308 21:05:36.286037 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1dff0b58-ac0f-4d39-9910-f924fff8f816","Type":"ContainerStarted","Data":"31d1be971bb445599e9b2b87dbc985813f03ee86921b07147d64305072f4cfbe"} Mar 08 21:05:36 crc kubenswrapper[4885]: I0308 21:05:36.286461 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:39 crc kubenswrapper[4885]: I0308 21:05:39.377719 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:05:39 crc kubenswrapper[4885]: E0308 21:05:39.379484 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:05:45 crc kubenswrapper[4885]: I0308 21:05:45.809371 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 08 21:05:45 crc kubenswrapper[4885]: I0308 21:05:45.849898 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=10.849867266 podStartE2EDuration="10.849867266s" podCreationTimestamp="2026-03-08 21:05:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:36.312298208 +0000 UTC m=+5637.708352231" watchObservedRunningTime="2026-03-08 21:05:45.849867266 +0000 UTC m=+5647.245921329" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.302912 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vk668"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.306409 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.309548 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.309580 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.324184 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-config-data\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.324227 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-scripts\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.324281 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.324310 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8xxh\" (UniqueName: \"kubernetes.io/projected/a9535a5b-072e-4a1f-b9e4-89942ba9e800-kube-api-access-v8xxh\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.330418 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vk668"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.424107 4885 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.425453 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.428320 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-config-data\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.428387 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n8vd\" (UniqueName: \"kubernetes.io/projected/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-kube-api-access-7n8vd\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.428421 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-config-data\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.428447 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-scripts\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.428509 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.428531 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8xxh\" (UniqueName: \"kubernetes.io/projected/a9535a5b-072e-4a1f-b9e4-89942ba9e800-kube-api-access-v8xxh\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.428667 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.433832 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.434907 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: 
I0308 21:05:46.436577 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.437317 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-config-data\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.442852 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-scripts\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.449622 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8xxh\" (UniqueName: \"kubernetes.io/projected/a9535a5b-072e-4a1f-b9e4-89942ba9e800-kube-api-access-v8xxh\") pod \"nova-cell0-cell-mapping-vk668\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.525424 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.529447 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.529528 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-config-data\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.529566 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-config-data\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.529601 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n8vd\" (UniqueName: \"kubernetes.io/projected/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-kube-api-access-7n8vd\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.529669 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwr5z\" (UniqueName: \"kubernetes.io/projected/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-kube-api-access-pwr5z\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.529743 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-logs\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.529802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.536777 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.542349 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.543272 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-config-data\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.550592 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.560491 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n8vd\" (UniqueName: \"kubernetes.io/projected/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-kube-api-access-7n8vd\") pod \"nova-scheduler-0\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.567427 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.630717 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.630768 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-config-data\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.630835 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwr5z\" (UniqueName: \"kubernetes.io/projected/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-kube-api-access-pwr5z\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.630909 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-logs\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " 
pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.631382 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-logs\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.634737 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.649820 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-config-data\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.659317 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.675484 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwr5z\" (UniqueName: \"kubernetes.io/projected/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-kube-api-access-pwr5z\") pod \"nova-metadata-0\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.680111 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f86679947-h9j4z"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.681463 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.733123 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f86679947-h9j4z"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.803151 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.804619 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.817749 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.818750 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.826393 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.827358 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835515 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h92b5\" (UniqueName: \"kubernetes.io/projected/b854cb23-a7e2-4249-9d13-70599979ab86-kube-api-access-h92b5\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835565 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-nb\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835596 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835623 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-dns-svc\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835638 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b854cb23-a7e2-4249-9d13-70599979ab86-logs\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835655 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835675 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grkm9\" (UniqueName: \"kubernetes.io/projected/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-kube-api-access-grkm9\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835711 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dsgd\" (UniqueName: \"kubernetes.io/projected/7213b9f1-1c28-4e32-b68b-8f7464f38de0-kube-api-access-2dsgd\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835733 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-config\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835760 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-config-data\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835781 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.835799 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.843560 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.844434 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.879960 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.920362 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949594 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-nb\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949649 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949677 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-dns-svc\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949691 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b854cb23-a7e2-4249-9d13-70599979ab86-logs\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949709 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949738 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grkm9\" (UniqueName: \"kubernetes.io/projected/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-kube-api-access-grkm9\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949780 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dsgd\" (UniqueName: \"kubernetes.io/projected/7213b9f1-1c28-4e32-b68b-8f7464f38de0-kube-api-access-2dsgd\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949801 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-config\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949826 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-config-data\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949851 4885 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949870 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.949911 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h92b5\" (UniqueName: \"kubernetes.io/projected/b854cb23-a7e2-4249-9d13-70599979ab86-kube-api-access-h92b5\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.951029 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-nb\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.951884 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b854cb23-a7e2-4249-9d13-70599979ab86-logs\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.952202 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-dns-svc\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.953034 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-config\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.963532 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.966853 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.967483 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grkm9\" (UniqueName: \"kubernetes.io/projected/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-kube-api-access-grkm9\") pod \"dnsmasq-dns-7f86679947-h9j4z\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " 
pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.968381 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dsgd\" (UniqueName: \"kubernetes.io/projected/7213b9f1-1c28-4e32-b68b-8f7464f38de0-kube-api-access-2dsgd\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.968960 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h92b5\" (UniqueName: \"kubernetes.io/projected/b854cb23-a7e2-4249-9d13-70599979ab86-kube-api-access-h92b5\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.969099 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.969533 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:46 crc kubenswrapper[4885]: I0308 21:05:46.970092 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-config-data\") pod \"nova-api-0\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " pod="openstack/nova-api-0" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.035006 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.172096 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.214058 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.214719 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.340235 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vk668"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.396439 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f86679947-h9j4z"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.478147 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vk668" event={"ID":"a9535a5b-072e-4a1f-b9e4-89942ba9e800","Type":"ContainerStarted","Data":"1a67b003088892071c10f5b2efba45b5a66154636095afd9a5375dbb09036035"} Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.479401 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" event={"ID":"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b","Type":"ContainerStarted","Data":"4c375bccd82478c29984cb004edc82a922cd3f66f47be6d3e4038a2ff4cf6623"} Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.480270 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e08bdfc6-0196-4b23-b6a5-b0c947f646e7","Type":"ContainerStarted","Data":"5cf04ad54507a736672fee7faf37e9b754d2e6cb6f2076a1a5654158ee25a581"} Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.512859 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.598455 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ps8dx"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.599854 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.603780 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.603828 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.613567 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ps8dx"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.664977 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kqct\" (UniqueName: \"kubernetes.io/projected/eed9605f-3b77-4800-9534-6d8f2654f392-kube-api-access-2kqct\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.665289 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-config-data\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.665343 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-scripts\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " 
pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.665399 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.751607 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.766556 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-scripts\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.766669 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.766780 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kqct\" (UniqueName: \"kubernetes.io/projected/eed9605f-3b77-4800-9534-6d8f2654f392-kube-api-access-2kqct\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.766815 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-config-data\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.773791 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-config-data\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.783452 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.783849 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-scripts\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.784454 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kqct\" (UniqueName: 
\"kubernetes.io/projected/eed9605f-3b77-4800-9534-6d8f2654f392-kube-api-access-2kqct\") pod \"nova-cell1-conductor-db-sync-ps8dx\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:47 crc kubenswrapper[4885]: W0308 21:05:47.855715 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb854cb23_a7e2_4249_9d13_70599979ab86.slice/crio-ee96183d981f50b41a804b3c4254441ccd6031f6d37fbd5f621cb8a54a1629aa WatchSource:0}: Error finding container ee96183d981f50b41a804b3c4254441ccd6031f6d37fbd5f621cb8a54a1629aa: Status 404 returned error can't find the container with id ee96183d981f50b41a804b3c4254441ccd6031f6d37fbd5f621cb8a54a1629aa Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.875235 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:05:47 crc kubenswrapper[4885]: I0308 21:05:47.928243 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.412520 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ps8dx"] Mar 08 21:05:48 crc kubenswrapper[4885]: W0308 21:05:48.417716 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeed9605f_3b77_4800_9534_6d8f2654f392.slice/crio-81f2c2162fbe34376ed76eb604715dc493f84d670b5cc940b4777b75998fc893 WatchSource:0}: Error finding container 81f2c2162fbe34376ed76eb604715dc493f84d670b5cc940b4777b75998fc893: Status 404 returned error can't find the container with id 81f2c2162fbe34376ed76eb604715dc493f84d670b5cc940b4777b75998fc893 Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.502205 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a","Type":"ContainerStarted","Data":"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.502554 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a","Type":"ContainerStarted","Data":"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.502564 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a","Type":"ContainerStarted","Data":"3c2a0cba92f705d00f5c037b2bfdeb75772f15dbb417883adb8221811ddee0ac"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.504739 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vk668" event={"ID":"a9535a5b-072e-4a1f-b9e4-89942ba9e800","Type":"ContainerStarted","Data":"57d34301e8cc7f8e0b2d448fe0ecba13af188f594f144adc104fb3b5dabb2f60"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.514825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7213b9f1-1c28-4e32-b68b-8f7464f38de0","Type":"ContainerStarted","Data":"2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.514863 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"7213b9f1-1c28-4e32-b68b-8f7464f38de0","Type":"ContainerStarted","Data":"6877cb37b34c366b3176b008053204c71bb5733fa92d9537850aea7c85b6ca99"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.530301 4885 generic.go:334] "Generic (PLEG): container finished" podID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerID="364b7c9836de2ac4dcd0074d16339cb1e1fe0eee56d6ea6aba2ce5bd28ef8b4b" exitCode=0 Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.530377 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" event={"ID":"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b","Type":"ContainerDied","Data":"364b7c9836de2ac4dcd0074d16339cb1e1fe0eee56d6ea6aba2ce5bd28ef8b4b"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.537336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" event={"ID":"eed9605f-3b77-4800-9534-6d8f2654f392","Type":"ContainerStarted","Data":"81f2c2162fbe34376ed76eb604715dc493f84d670b5cc940b4777b75998fc893"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.539554 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e08bdfc6-0196-4b23-b6a5-b0c947f646e7","Type":"ContainerStarted","Data":"df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.544360 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.544343776 podStartE2EDuration="2.544343776s" podCreationTimestamp="2026-03-08 21:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:48.540667157 +0000 UTC m=+5649.936721190" watchObservedRunningTime="2026-03-08 21:05:48.544343776 +0000 UTC m=+5649.940397799" Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.561469 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b854cb23-a7e2-4249-9d13-70599979ab86","Type":"ContainerStarted","Data":"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.561511 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b854cb23-a7e2-4249-9d13-70599979ab86","Type":"ContainerStarted","Data":"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.561520 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b854cb23-a7e2-4249-9d13-70599979ab86","Type":"ContainerStarted","Data":"ee96183d981f50b41a804b3c4254441ccd6031f6d37fbd5f621cb8a54a1629aa"} Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.564400 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.56438232 podStartE2EDuration="2.56438232s" podCreationTimestamp="2026-03-08 21:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:48.562197862 +0000 UTC m=+5649.958251885" watchObservedRunningTime="2026-03-08 21:05:48.56438232 +0000 UTC m=+5649.960436343" Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.672972 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vk668" 
podStartSLOduration=2.672947046 podStartE2EDuration="2.672947046s" podCreationTimestamp="2026-03-08 21:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:48.592278564 +0000 UTC m=+5649.988332587" watchObservedRunningTime="2026-03-08 21:05:48.672947046 +0000 UTC m=+5650.069001069" Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.685230 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.685215624 podStartE2EDuration="2.685215624s" podCreationTimestamp="2026-03-08 21:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:48.633383841 +0000 UTC m=+5650.029437864" watchObservedRunningTime="2026-03-08 21:05:48.685215624 +0000 UTC m=+5650.081269637" Mar 08 21:05:48 crc kubenswrapper[4885]: I0308 21:05:48.703577 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.703556973 podStartE2EDuration="2.703556973s" podCreationTimestamp="2026-03-08 21:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:48.66446765 +0000 UTC m=+5650.060521673" watchObservedRunningTime="2026-03-08 21:05:48.703556973 +0000 UTC m=+5650.099610996" Mar 08 21:05:49 crc kubenswrapper[4885]: I0308 21:05:49.590132 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" event={"ID":"eed9605f-3b77-4800-9534-6d8f2654f392","Type":"ContainerStarted","Data":"9053532705caa4a801f382164c347679058c8a5255c223b315fac67e8c18e8ef"} Mar 08 21:05:49 crc kubenswrapper[4885]: I0308 21:05:49.618800 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" event={"ID":"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b","Type":"ContainerStarted","Data":"53391697a2f4f90c5ca24bdf716499111736339ebf0bad4e24ae2562e126d17f"} Mar 08 21:05:49 crc kubenswrapper[4885]: I0308 21:05:49.626962 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" podStartSLOduration=2.626949272 podStartE2EDuration="2.626949272s" podCreationTimestamp="2026-03-08 21:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:49.617708076 +0000 UTC m=+5651.013762109" watchObservedRunningTime="2026-03-08 21:05:49.626949272 +0000 UTC m=+5651.023003295" Mar 08 21:05:49 crc kubenswrapper[4885]: I0308 21:05:49.665386 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" podStartSLOduration=3.665369237 podStartE2EDuration="3.665369237s" podCreationTimestamp="2026-03-08 21:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:49.657342483 +0000 UTC m=+5651.053396506" watchObservedRunningTime="2026-03-08 21:05:49.665369237 +0000 UTC m=+5651.061423260" Mar 08 21:05:50 crc kubenswrapper[4885]: I0308 21:05:50.628369 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:51 crc kubenswrapper[4885]: I0308 21:05:51.639417 4885 generic.go:334] 
"Generic (PLEG): container finished" podID="eed9605f-3b77-4800-9534-6d8f2654f392" containerID="9053532705caa4a801f382164c347679058c8a5255c223b315fac67e8c18e8ef" exitCode=0 Mar 08 21:05:51 crc kubenswrapper[4885]: I0308 21:05:51.640363 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" event={"ID":"eed9605f-3b77-4800-9534-6d8f2654f392","Type":"ContainerDied","Data":"9053532705caa4a801f382164c347679058c8a5255c223b315fac67e8c18e8ef"} Mar 08 21:05:51 crc kubenswrapper[4885]: I0308 21:05:51.827440 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 21:05:51 crc kubenswrapper[4885]: I0308 21:05:51.921269 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:05:51 crc kubenswrapper[4885]: I0308 21:05:51.921356 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:05:52 crc kubenswrapper[4885]: I0308 21:05:52.215369 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:52 crc kubenswrapper[4885]: I0308 21:05:52.653439 4885 generic.go:334] "Generic (PLEG): container finished" podID="a9535a5b-072e-4a1f-b9e4-89942ba9e800" containerID="57d34301e8cc7f8e0b2d448fe0ecba13af188f594f144adc104fb3b5dabb2f60" exitCode=0 Mar 08 21:05:52 crc kubenswrapper[4885]: I0308 21:05:52.653555 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vk668" event={"ID":"a9535a5b-072e-4a1f-b9e4-89942ba9e800","Type":"ContainerDied","Data":"57d34301e8cc7f8e0b2d448fe0ecba13af188f594f144adc104fb3b5dabb2f60"} Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.151076 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.292327 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-config-data\") pod \"eed9605f-3b77-4800-9534-6d8f2654f392\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.292600 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kqct\" (UniqueName: \"kubernetes.io/projected/eed9605f-3b77-4800-9534-6d8f2654f392-kube-api-access-2kqct\") pod \"eed9605f-3b77-4800-9534-6d8f2654f392\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.292801 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-scripts\") pod \"eed9605f-3b77-4800-9534-6d8f2654f392\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.292911 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-combined-ca-bundle\") pod \"eed9605f-3b77-4800-9534-6d8f2654f392\" (UID: \"eed9605f-3b77-4800-9534-6d8f2654f392\") " Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.306028 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-scripts" (OuterVolumeSpecName: "scripts") pod "eed9605f-3b77-4800-9534-6d8f2654f392" (UID: "eed9605f-3b77-4800-9534-6d8f2654f392"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.306176 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eed9605f-3b77-4800-9534-6d8f2654f392-kube-api-access-2kqct" (OuterVolumeSpecName: "kube-api-access-2kqct") pod "eed9605f-3b77-4800-9534-6d8f2654f392" (UID: "eed9605f-3b77-4800-9534-6d8f2654f392"). InnerVolumeSpecName "kube-api-access-2kqct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.318824 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-config-data" (OuterVolumeSpecName: "config-data") pod "eed9605f-3b77-4800-9534-6d8f2654f392" (UID: "eed9605f-3b77-4800-9534-6d8f2654f392"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.347127 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eed9605f-3b77-4800-9534-6d8f2654f392" (UID: "eed9605f-3b77-4800-9534-6d8f2654f392"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.395945 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kqct\" (UniqueName: \"kubernetes.io/projected/eed9605f-3b77-4800-9534-6d8f2654f392-kube-api-access-2kqct\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.395992 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.396013 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.396033 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eed9605f-3b77-4800-9534-6d8f2654f392-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.672784 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.672837 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-ps8dx" event={"ID":"eed9605f-3b77-4800-9534-6d8f2654f392","Type":"ContainerDied","Data":"81f2c2162fbe34376ed76eb604715dc493f84d670b5cc940b4777b75998fc893"} Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.672864 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81f2c2162fbe34376ed76eb604715dc493f84d670b5cc940b4777b75998fc893" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.805575 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:05:53 crc kubenswrapper[4885]: E0308 21:05:53.806034 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eed9605f-3b77-4800-9534-6d8f2654f392" containerName="nova-cell1-conductor-db-sync" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.806055 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed9605f-3b77-4800-9534-6d8f2654f392" containerName="nova-cell1-conductor-db-sync" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.806345 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="eed9605f-3b77-4800-9534-6d8f2654f392" containerName="nova-cell1-conductor-db-sync" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.807110 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.809378 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7jsn\" (UniqueName: \"kubernetes.io/projected/69024278-2c5f-4862-ac44-e04663a0c4a5-kube-api-access-d7jsn\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.809423 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.809469 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.810058 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.829104 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.910862 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7jsn\" (UniqueName: \"kubernetes.io/projected/69024278-2c5f-4862-ac44-e04663a0c4a5-kube-api-access-d7jsn\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.910904 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.910938 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.914600 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.917739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:53 crc kubenswrapper[4885]: I0308 21:05:53.943150 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7jsn\" (UniqueName: \"kubernetes.io/projected/69024278-2c5f-4862-ac44-e04663a0c4a5-kube-api-access-d7jsn\") pod \"nova-cell1-conductor-0\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.022123 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.114845 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8xxh\" (UniqueName: \"kubernetes.io/projected/a9535a5b-072e-4a1f-b9e4-89942ba9e800-kube-api-access-v8xxh\") pod \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.115542 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-config-data\") pod \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.115583 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-scripts\") pod \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.115627 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-combined-ca-bundle\") pod \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\" (UID: \"a9535a5b-072e-4a1f-b9e4-89942ba9e800\") " Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.119133 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9535a5b-072e-4a1f-b9e4-89942ba9e800-kube-api-access-v8xxh" (OuterVolumeSpecName: "kube-api-access-v8xxh") pod "a9535a5b-072e-4a1f-b9e4-89942ba9e800" (UID: "a9535a5b-072e-4a1f-b9e4-89942ba9e800"). InnerVolumeSpecName "kube-api-access-v8xxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.120422 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-scripts" (OuterVolumeSpecName: "scripts") pod "a9535a5b-072e-4a1f-b9e4-89942ba9e800" (UID: "a9535a5b-072e-4a1f-b9e4-89942ba9e800"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.132906 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.141131 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9535a5b-072e-4a1f-b9e4-89942ba9e800" (UID: "a9535a5b-072e-4a1f-b9e4-89942ba9e800"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.157954 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-config-data" (OuterVolumeSpecName: "config-data") pod "a9535a5b-072e-4a1f-b9e4-89942ba9e800" (UID: "a9535a5b-072e-4a1f-b9e4-89942ba9e800"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.217526 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.217846 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.218065 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9535a5b-072e-4a1f-b9e4-89942ba9e800-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.218196 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8xxh\" (UniqueName: \"kubernetes.io/projected/a9535a5b-072e-4a1f-b9e4-89942ba9e800-kube-api-access-v8xxh\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.369096 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:05:54 crc kubenswrapper[4885]: E0308 21:05:54.369462 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.410021 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.688834 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69024278-2c5f-4862-ac44-e04663a0c4a5","Type":"ContainerStarted","Data":"b24462edde9f60cfa7555c270a546d933c909d490fc62394b9ed6e4a826084f2"} Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.689579 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.689601 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69024278-2c5f-4862-ac44-e04663a0c4a5","Type":"ContainerStarted","Data":"38f1ceae301c390ad8c08deda2ca09e48e527fb1b139f821cdcf653ea04147c0"} Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.697490 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vk668" event={"ID":"a9535a5b-072e-4a1f-b9e4-89942ba9e800","Type":"ContainerDied","Data":"1a67b003088892071c10f5b2efba45b5a66154636095afd9a5375dbb09036035"} Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.697556 4885 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a67b003088892071c10f5b2efba45b5a66154636095afd9a5375dbb09036035" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.697605 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vk668" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.715770 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.715748517 podStartE2EDuration="1.715748517s" podCreationTimestamp="2026-03-08 21:05:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:54.705464092 +0000 UTC m=+5656.101518125" watchObservedRunningTime="2026-03-08 21:05:54.715748517 +0000 UTC m=+5656.111802530" Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.909454 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.909861 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e08bdfc6-0196-4b23-b6a5-b0c947f646e7" containerName="nova-scheduler-scheduler" containerID="cri-o://df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2" gracePeriod=30 Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.919810 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.920479 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-api" containerID="cri-o://62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72" gracePeriod=30 Mar 08 21:05:54 crc kubenswrapper[4885]: I0308 21:05:54.920749 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-log" containerID="cri-o://c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de" gracePeriod=30 Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.001462 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.001718 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-log" containerID="cri-o://7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba" gracePeriod=30 Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.002284 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-metadata" containerID="cri-o://e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8" gracePeriod=30 Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.668541 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.688226 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.706715 4885 generic.go:334] "Generic (PLEG): container finished" podID="b854cb23-a7e2-4249-9d13-70599979ab86" containerID="62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72" exitCode=0 Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.706747 4885 generic.go:334] "Generic (PLEG): container finished" podID="b854cb23-a7e2-4249-9d13-70599979ab86" containerID="c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de" exitCode=143 Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.706757 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.706799 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b854cb23-a7e2-4249-9d13-70599979ab86","Type":"ContainerDied","Data":"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72"} Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.706854 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b854cb23-a7e2-4249-9d13-70599979ab86","Type":"ContainerDied","Data":"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de"} Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.706864 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b854cb23-a7e2-4249-9d13-70599979ab86","Type":"ContainerDied","Data":"ee96183d981f50b41a804b3c4254441ccd6031f6d37fbd5f621cb8a54a1629aa"} Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.706880 4885 scope.go:117] "RemoveContainer" containerID="62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.709140 4885 generic.go:334] "Generic (PLEG): container finished" podID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerID="e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8" exitCode=0 Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.709159 4885 generic.go:334] "Generic (PLEG): container finished" podID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerID="7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba" exitCode=143 Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.709169 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a","Type":"ContainerDied","Data":"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8"} Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.709204 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.709206 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a","Type":"ContainerDied","Data":"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba"} Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.709361 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a","Type":"ContainerDied","Data":"3c2a0cba92f705d00f5c037b2bfdeb75772f15dbb417883adb8221811ddee0ac"} Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.733868 4885 scope.go:117] "RemoveContainer" containerID="c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.755236 4885 scope.go:117] "RemoveContainer" containerID="62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72" Mar 08 21:05:55 crc kubenswrapper[4885]: E0308 21:05:55.757372 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72\": container with ID starting with 62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72 not found: ID does not exist" containerID="62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.757419 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72"} err="failed to get container status \"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72\": rpc error: code = NotFound desc = could not find container \"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72\": container with ID starting with 62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72 not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.757448 4885 scope.go:117] "RemoveContainer" containerID="c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de" Mar 08 21:05:55 crc kubenswrapper[4885]: E0308 21:05:55.758214 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de\": container with ID starting with c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de not found: ID does not exist" containerID="c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.758302 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de"} err="failed to get container status \"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de\": rpc error: code = NotFound desc = could not find container \"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de\": container with ID starting with c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.758336 4885 scope.go:117] "RemoveContainer" containerID="62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.758743 4885 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72"} err="failed to get container status \"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72\": rpc error: code = NotFound desc = could not find container \"62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72\": container with ID starting with 62eee57a762e7e5148eec9bafa2e3ed791e190a571e7dabb72c12f9d8b69ab72 not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.758797 4885 scope.go:117] "RemoveContainer" containerID="c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759158 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-logs\") pod \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759214 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-combined-ca-bundle\") pod \"b854cb23-a7e2-4249-9d13-70599979ab86\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759244 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-config-data\") pod \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759270 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-config-data\") pod \"b854cb23-a7e2-4249-9d13-70599979ab86\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759307 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h92b5\" (UniqueName: \"kubernetes.io/projected/b854cb23-a7e2-4249-9d13-70599979ab86-kube-api-access-h92b5\") pod \"b854cb23-a7e2-4249-9d13-70599979ab86\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759483 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-combined-ca-bundle\") pod \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759506 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwr5z\" (UniqueName: \"kubernetes.io/projected/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-kube-api-access-pwr5z\") pod \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\" (UID: \"3fae606b-66a3-440d-a3b7-7fb6ba2cd22a\") " Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.759543 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b854cb23-a7e2-4249-9d13-70599979ab86-logs\") pod \"b854cb23-a7e2-4249-9d13-70599979ab86\" (UID: \"b854cb23-a7e2-4249-9d13-70599979ab86\") " Mar 08 21:05:55 crc 
kubenswrapper[4885]: I0308 21:05:55.760198 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-logs" (OuterVolumeSpecName: "logs") pod "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" (UID: "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.760489 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b854cb23-a7e2-4249-9d13-70599979ab86-logs" (OuterVolumeSpecName: "logs") pod "b854cb23-a7e2-4249-9d13-70599979ab86" (UID: "b854cb23-a7e2-4249-9d13-70599979ab86"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.765072 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de"} err="failed to get container status \"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de\": rpc error: code = NotFound desc = could not find container \"c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de\": container with ID starting with c70a753cf030f9265f6fe955f27f09a3a2d8f81fb95e10280cde677a927483de not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.765113 4885 scope.go:117] "RemoveContainer" containerID="e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.765283 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b854cb23-a7e2-4249-9d13-70599979ab86-kube-api-access-h92b5" (OuterVolumeSpecName: "kube-api-access-h92b5") pod "b854cb23-a7e2-4249-9d13-70599979ab86" (UID: "b854cb23-a7e2-4249-9d13-70599979ab86"). InnerVolumeSpecName "kube-api-access-h92b5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.767137 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-kube-api-access-pwr5z" (OuterVolumeSpecName: "kube-api-access-pwr5z") pod "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" (UID: "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a"). InnerVolumeSpecName "kube-api-access-pwr5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.784300 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b854cb23-a7e2-4249-9d13-70599979ab86" (UID: "b854cb23-a7e2-4249-9d13-70599979ab86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.784396 4885 scope.go:117] "RemoveContainer" containerID="7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.785263 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-config-data" (OuterVolumeSpecName: "config-data") pod "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" (UID: "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.790826 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-config-data" (OuterVolumeSpecName: "config-data") pod "b854cb23-a7e2-4249-9d13-70599979ab86" (UID: "b854cb23-a7e2-4249-9d13-70599979ab86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.792299 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" (UID: "3fae606b-66a3-440d-a3b7-7fb6ba2cd22a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.800874 4885 scope.go:117] "RemoveContainer" containerID="e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8" Mar 08 21:05:55 crc kubenswrapper[4885]: E0308 21:05:55.801248 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8\": container with ID starting with e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8 not found: ID does not exist" containerID="e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.801278 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8"} err="failed to get container status \"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8\": rpc error: code = NotFound desc = could not find container \"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8\": container with ID starting with e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8 not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.801299 4885 scope.go:117] "RemoveContainer" containerID="7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba" Mar 08 21:05:55 crc kubenswrapper[4885]: E0308 21:05:55.801603 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba\": container with ID starting with 7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba not found: ID does not exist" containerID="7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.801631 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba"} err="failed to get container status \"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba\": rpc error: code = NotFound desc = could not find container \"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba\": container with ID starting with 7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.801648 4885 scope.go:117] "RemoveContainer" 
containerID="e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.801917 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8"} err="failed to get container status \"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8\": rpc error: code = NotFound desc = could not find container \"e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8\": container with ID starting with e4a3533afca0c3d82634f14a9888ced8d5dc2190a80bf0a93691ea1a8339e6b8 not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.801969 4885 scope.go:117] "RemoveContainer" containerID="7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.802257 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba"} err="failed to get container status \"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba\": rpc error: code = NotFound desc = could not find container \"7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba\": container with ID starting with 7e8c19e98379f26f54f8f632f0110ba3b1ef35c6e21f61a30afc6c549514beba not found: ID does not exist" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.862057 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.862078 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.862086 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b854cb23-a7e2-4249-9d13-70599979ab86-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.862094 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h92b5\" (UniqueName: \"kubernetes.io/projected/b854cb23-a7e2-4249-9d13-70599979ab86-kube-api-access-h92b5\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.862103 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.862111 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwr5z\" (UniqueName: \"kubernetes.io/projected/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-kube-api-access-pwr5z\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.862119 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b854cb23-a7e2-4249-9d13-70599979ab86-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:55 crc kubenswrapper[4885]: I0308 21:05:55.862127 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a-logs\") on node \"crc\" 
DevicePath \"\"" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.086854 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.100013 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.117819 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.123899 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.134864 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: E0308 21:05:56.135355 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-log" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135375 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-log" Mar 08 21:05:56 crc kubenswrapper[4885]: E0308 21:05:56.135394 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-api" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135403 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-api" Mar 08 21:05:56 crc kubenswrapper[4885]: E0308 21:05:56.135427 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-log" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135435 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-log" Mar 08 21:05:56 crc kubenswrapper[4885]: E0308 21:05:56.135460 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9535a5b-072e-4a1f-b9e4-89942ba9e800" containerName="nova-manage" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135468 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9535a5b-072e-4a1f-b9e4-89942ba9e800" containerName="nova-manage" Mar 08 21:05:56 crc kubenswrapper[4885]: E0308 21:05:56.135480 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-metadata" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135488 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-metadata" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135704 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-log" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135719 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-metadata" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135730 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9535a5b-072e-4a1f-b9e4-89942ba9e800" containerName="nova-manage" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135741 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" containerName="nova-api-api" Mar 
08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.135765 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" containerName="nova-metadata-log" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.136875 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.139800 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.158341 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.159955 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.162984 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.167729 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.167788 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-config-data\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.167835 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.167851 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-config-data\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.167981 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8a3be88-3b2d-4cd0-8987-443d67351acb-logs\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.168124 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfkcz\" (UniqueName: \"kubernetes.io/projected/d8a3be88-3b2d-4cd0-8987-443d67351acb-kube-api-access-cfkcz\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.168147 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d41f915-4ddc-4f84-b402-67e3ff310d0e-logs\") pod \"nova-metadata-0\" (UID: 
\"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.168168 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj25c\" (UniqueName: \"kubernetes.io/projected/8d41f915-4ddc-4f84-b402-67e3ff310d0e-kube-api-access-sj25c\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.178305 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.206524 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269608 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269649 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-config-data\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269694 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8a3be88-3b2d-4cd0-8987-443d67351acb-logs\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269738 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfkcz\" (UniqueName: \"kubernetes.io/projected/d8a3be88-3b2d-4cd0-8987-443d67351acb-kube-api-access-cfkcz\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269755 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d41f915-4ddc-4f84-b402-67e3ff310d0e-logs\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269773 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj25c\" (UniqueName: \"kubernetes.io/projected/8d41f915-4ddc-4f84-b402-67e3ff310d0e-kube-api-access-sj25c\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269803 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.269833 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-config-data\") pod \"nova-metadata-0\" (UID: 
\"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.270824 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d41f915-4ddc-4f84-b402-67e3ff310d0e-logs\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.271047 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8a3be88-3b2d-4cd0-8987-443d67351acb-logs\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.274064 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-config-data\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.274978 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-config-data\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.274994 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.275484 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.300600 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfkcz\" (UniqueName: \"kubernetes.io/projected/d8a3be88-3b2d-4cd0-8987-443d67351acb-kube-api-access-cfkcz\") pod \"nova-api-0\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.301711 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj25c\" (UniqueName: \"kubernetes.io/projected/8d41f915-4ddc-4f84-b402-67e3ff310d0e-kube-api-access-sj25c\") pod \"nova-metadata-0\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.461852 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.495306 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:05:56 crc kubenswrapper[4885]: W0308 21:05:56.939936 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d41f915_4ddc_4f84_b402_67e3ff310d0e.slice/crio-20924542c83be387f0ae030282c946d65a2f7a96ddb827f2a18a8e4ab1e8eddb WatchSource:0}: Error finding container 20924542c83be387f0ae030282c946d65a2f7a96ddb827f2a18a8e4ab1e8eddb: Status 404 returned error can't find the container with id 20924542c83be387f0ae030282c946d65a2f7a96ddb827f2a18a8e4ab1e8eddb Mar 08 21:05:56 crc kubenswrapper[4885]: I0308 21:05:56.941831 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.037223 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:05:57 crc kubenswrapper[4885]: W0308 21:05:57.038468 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8a3be88_3b2d_4cd0_8987_443d67351acb.slice/crio-8c02c0dd36a35b639dcbf4fdeb957cafcaf6477b4aefd990c59836fff31381b3 WatchSource:0}: Error finding container 8c02c0dd36a35b639dcbf4fdeb957cafcaf6477b4aefd990c59836fff31381b3: Status 404 returned error can't find the container with id 8c02c0dd36a35b639dcbf4fdeb957cafcaf6477b4aefd990c59836fff31381b3 Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.042244 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.103252 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dd79dd4c-22rq2"] Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.103581 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" podUID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerName="dnsmasq-dns" containerID="cri-o://4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8" gracePeriod=10 Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.215486 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.238281 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.382081 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fae606b-66a3-440d-a3b7-7fb6ba2cd22a" path="/var/lib/kubelet/pods/3fae606b-66a3-440d-a3b7-7fb6ba2cd22a/volumes" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.382707 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b854cb23-a7e2-4249-9d13-70599979ab86" path="/var/lib/kubelet/pods/b854cb23-a7e2-4249-9d13-70599979ab86/volumes" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.527178 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.696064 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-nb\") pod \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.696133 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-dns-svc\") pod \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.696191 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dkcc\" (UniqueName: \"kubernetes.io/projected/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-kube-api-access-7dkcc\") pod \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.696241 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-sb\") pod \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.696274 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-config\") pod \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\" (UID: \"a18bb9fd-f7a3-4935-9d89-26654d7e08c5\") " Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.706852 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-kube-api-access-7dkcc" (OuterVolumeSpecName: "kube-api-access-7dkcc") pod "a18bb9fd-f7a3-4935-9d89-26654d7e08c5" (UID: "a18bb9fd-f7a3-4935-9d89-26654d7e08c5"). InnerVolumeSpecName "kube-api-access-7dkcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.741603 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a18bb9fd-f7a3-4935-9d89-26654d7e08c5" (UID: "a18bb9fd-f7a3-4935-9d89-26654d7e08c5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.743422 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a18bb9fd-f7a3-4935-9d89-26654d7e08c5" (UID: "a18bb9fd-f7a3-4935-9d89-26654d7e08c5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.743817 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-config" (OuterVolumeSpecName: "config") pod "a18bb9fd-f7a3-4935-9d89-26654d7e08c5" (UID: "a18bb9fd-f7a3-4935-9d89-26654d7e08c5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.746021 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41f915-4ddc-4f84-b402-67e3ff310d0e","Type":"ContainerStarted","Data":"cdeaf3122ca8dbc663268a6c75437ef3b2d630cf202295db223a82e3e219dba9"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.746093 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41f915-4ddc-4f84-b402-67e3ff310d0e","Type":"ContainerStarted","Data":"1c9a277208821dbe45c51d736ad5778842b9c47f876f42497004e47858db7d7a"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.746107 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41f915-4ddc-4f84-b402-67e3ff310d0e","Type":"ContainerStarted","Data":"20924542c83be387f0ae030282c946d65a2f7a96ddb827f2a18a8e4ab1e8eddb"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.748610 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8a3be88-3b2d-4cd0-8987-443d67351acb","Type":"ContainerStarted","Data":"756c1aa575f487dbda9fdf021512017307ddd58458c8521dba14b321c91f8b89"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.748646 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8a3be88-3b2d-4cd0-8987-443d67351acb","Type":"ContainerStarted","Data":"145acadb2aaec8dcd5165a895f8909b8c1b38463cef4ef90b51c7692b91fbfd5"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.748658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8a3be88-3b2d-4cd0-8987-443d67351acb","Type":"ContainerStarted","Data":"8c02c0dd36a35b639dcbf4fdeb957cafcaf6477b4aefd990c59836fff31381b3"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.759319 4885 generic.go:334] "Generic (PLEG): container finished" podID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerID="4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8" exitCode=0 Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.759489 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a18bb9fd-f7a3-4935-9d89-26654d7e08c5" (UID: "a18bb9fd-f7a3-4935-9d89-26654d7e08c5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.759499 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" event={"ID":"a18bb9fd-f7a3-4935-9d89-26654d7e08c5","Type":"ContainerDied","Data":"4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.759559 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" event={"ID":"a18bb9fd-f7a3-4935-9d89-26654d7e08c5","Type":"ContainerDied","Data":"b86733ef92f0d474db415f2ffb90c760fd1a463fdc711ed1f41f539df4b11b99"} Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.759584 4885 scope.go:117] "RemoveContainer" containerID="4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.759990 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67dd79dd4c-22rq2" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.775957 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.777594 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.777572785 podStartE2EDuration="1.777572785s" podCreationTimestamp="2026-03-08 21:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:57.76537411 +0000 UTC m=+5659.161428153" watchObservedRunningTime="2026-03-08 21:05:57.777572785 +0000 UTC m=+5659.173626808" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.785444 4885 scope.go:117] "RemoveContainer" containerID="59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.790769 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.7907468560000002 podStartE2EDuration="1.790746856s" podCreationTimestamp="2026-03-08 21:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:05:57.786561455 +0000 UTC m=+5659.182615488" watchObservedRunningTime="2026-03-08 21:05:57.790746856 +0000 UTC m=+5659.186800879" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.798454 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.798490 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.798505 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dkcc\" (UniqueName: \"kubernetes.io/projected/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-kube-api-access-7dkcc\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.798516 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.798527 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a18bb9fd-f7a3-4935-9d89-26654d7e08c5-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.848126 4885 scope.go:117] "RemoveContainer" containerID="4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8" Mar 08 21:05:57 crc kubenswrapper[4885]: E0308 21:05:57.848486 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8\": container with ID starting with 4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8 not found: ID does not exist" containerID="4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8" Mar 08 21:05:57 crc kubenswrapper[4885]: 
I0308 21:05:57.848529 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8"} err="failed to get container status \"4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8\": rpc error: code = NotFound desc = could not find container \"4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8\": container with ID starting with 4957d1d1a97112165527f0ba72cb5d9c75e1dad24c50ce2e81359ceb6908a4c8 not found: ID does not exist" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.848547 4885 scope.go:117] "RemoveContainer" containerID="59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3" Mar 08 21:05:57 crc kubenswrapper[4885]: E0308 21:05:57.848904 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3\": container with ID starting with 59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3 not found: ID does not exist" containerID="59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.848940 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3"} err="failed to get container status \"59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3\": rpc error: code = NotFound desc = could not find container \"59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3\": container with ID starting with 59bf8d01f31ed5a6ee4ab8f55425f15958d8c2c19bb2cac4e05ed6ebbf4b45f3 not found: ID does not exist" Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.864814 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67dd79dd4c-22rq2"] Mar 08 21:05:57 crc kubenswrapper[4885]: I0308 21:05:57.871281 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67dd79dd4c-22rq2"] Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.209439 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.383414 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" path="/var/lib/kubelet/pods/a18bb9fd-f7a3-4935-9d89-26654d7e08c5/volumes" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.493889 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.662461 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-combined-ca-bundle\") pod \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.662565 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-config-data\") pod \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.662664 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n8vd\" (UniqueName: \"kubernetes.io/projected/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-kube-api-access-7n8vd\") pod \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\" (UID: \"e08bdfc6-0196-4b23-b6a5-b0c947f646e7\") " Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.674103 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-kube-api-access-7n8vd" (OuterVolumeSpecName: "kube-api-access-7n8vd") pod "e08bdfc6-0196-4b23-b6a5-b0c947f646e7" (UID: "e08bdfc6-0196-4b23-b6a5-b0c947f646e7"). InnerVolumeSpecName "kube-api-access-7n8vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.701249 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-config-data" (OuterVolumeSpecName: "config-data") pod "e08bdfc6-0196-4b23-b6a5-b0c947f646e7" (UID: "e08bdfc6-0196-4b23-b6a5-b0c947f646e7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.726137 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e08bdfc6-0196-4b23-b6a5-b0c947f646e7" (UID: "e08bdfc6-0196-4b23-b6a5-b0c947f646e7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.764591 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.765363 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.765434 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n8vd\" (UniqueName: \"kubernetes.io/projected/e08bdfc6-0196-4b23-b6a5-b0c947f646e7-kube-api-access-7n8vd\") on node \"crc\" DevicePath \"\"" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.781983 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-8crmt"] Mar 08 21:05:59 crc kubenswrapper[4885]: E0308 21:05:59.782763 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerName="init" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.782848 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerName="init" Mar 08 21:05:59 crc kubenswrapper[4885]: E0308 21:05:59.782954 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e08bdfc6-0196-4b23-b6a5-b0c947f646e7" containerName="nova-scheduler-scheduler" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.783027 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e08bdfc6-0196-4b23-b6a5-b0c947f646e7" containerName="nova-scheduler-scheduler" Mar 08 21:05:59 crc kubenswrapper[4885]: E0308 21:05:59.783149 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerName="dnsmasq-dns" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.783222 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerName="dnsmasq-dns" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.783523 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e08bdfc6-0196-4b23-b6a5-b0c947f646e7" containerName="nova-scheduler-scheduler" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.783616 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a18bb9fd-f7a3-4935-9d89-26654d7e08c5" containerName="dnsmasq-dns" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.784766 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.791379 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.796629 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.800545 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8crmt"] Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.818416 4885 generic.go:334] "Generic (PLEG): container finished" podID="e08bdfc6-0196-4b23-b6a5-b0c947f646e7" containerID="df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2" exitCode=0 Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.818480 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e08bdfc6-0196-4b23-b6a5-b0c947f646e7","Type":"ContainerDied","Data":"df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2"} Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.818515 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e08bdfc6-0196-4b23-b6a5-b0c947f646e7","Type":"ContainerDied","Data":"5cf04ad54507a736672fee7faf37e9b754d2e6cb6f2076a1a5654158ee25a581"} Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.818533 4885 scope.go:117] "RemoveContainer" containerID="df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.818690 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.864830 4885 scope.go:117] "RemoveContainer" containerID="df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2" Mar 08 21:05:59 crc kubenswrapper[4885]: E0308 21:05:59.869103 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2\": container with ID starting with df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2 not found: ID does not exist" containerID="df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.869149 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2"} err="failed to get container status \"df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2\": rpc error: code = NotFound desc = could not find container \"df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2\": container with ID starting with df9d39e7972a830ec439f92e8770ef752e329461e117449a12fedf1bf7476cf2 not found: ID does not exist" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.929999 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.953961 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.970957 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66fz6\" (UniqueName: 
\"kubernetes.io/projected/3405968f-173e-4ab2-a8ac-699fdaaad4d3-kube-api-access-66fz6\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.971038 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-scripts\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.971067 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.971100 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-config-data\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:05:59 crc kubenswrapper[4885]: I0308 21:05:59.993206 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:05:59.994715 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.005890 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.020813 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.072192 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66fz6\" (UniqueName: \"kubernetes.io/projected/3405968f-173e-4ab2-a8ac-699fdaaad4d3-kube-api-access-66fz6\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.072255 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-scripts\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.072290 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.072326 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-config-data\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: 
\"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.076310 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.076872 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-config-data\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.082448 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-scripts\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.092765 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66fz6\" (UniqueName: \"kubernetes.io/projected/3405968f-173e-4ab2-a8ac-699fdaaad4d3-kube-api-access-66fz6\") pod \"nova-cell1-cell-mapping-8crmt\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.134488 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550066-6hpw5"] Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.136425 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550066-6hpw5" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.140235 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.141353 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.141430 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.163637 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550066-6hpw5"] Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.171159 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.173266 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xxpc\" (UniqueName: \"kubernetes.io/projected/d0878726-55a7-4b21-95ba-4dda1491dfdd-kube-api-access-6xxpc\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.173447 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.173513 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-config-data\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.276004 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.276362 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-config-data\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.276538 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xxpc\" (UniqueName: \"kubernetes.io/projected/d0878726-55a7-4b21-95ba-4dda1491dfdd-kube-api-access-6xxpc\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.276638 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4whn\" (UniqueName: \"kubernetes.io/projected/ddbd1248-e534-4251-b5a6-0505b7710e6e-kube-api-access-d4whn\") pod \"auto-csr-approver-29550066-6hpw5\" (UID: \"ddbd1248-e534-4251-b5a6-0505b7710e6e\") " pod="openshift-infra/auto-csr-approver-29550066-6hpw5" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.281240 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.281264 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-config-data\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.303274 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xxpc\" (UniqueName: \"kubernetes.io/projected/d0878726-55a7-4b21-95ba-4dda1491dfdd-kube-api-access-6xxpc\") pod \"nova-scheduler-0\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.353418 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.377633 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4whn\" (UniqueName: \"kubernetes.io/projected/ddbd1248-e534-4251-b5a6-0505b7710e6e-kube-api-access-d4whn\") pod \"auto-csr-approver-29550066-6hpw5\" (UID: \"ddbd1248-e534-4251-b5a6-0505b7710e6e\") " pod="openshift-infra/auto-csr-approver-29550066-6hpw5" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.407577 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4whn\" (UniqueName: \"kubernetes.io/projected/ddbd1248-e534-4251-b5a6-0505b7710e6e-kube-api-access-d4whn\") pod \"auto-csr-approver-29550066-6hpw5\" (UID: \"ddbd1248-e534-4251-b5a6-0505b7710e6e\") " pod="openshift-infra/auto-csr-approver-29550066-6hpw5" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.461101 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550066-6hpw5" Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.719729 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8crmt"] Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.735294 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550066-6hpw5"] Mar 08 21:06:00 crc kubenswrapper[4885]: W0308 21:06:00.735415 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3405968f_173e_4ab2_a8ac_699fdaaad4d3.slice/crio-8aa86ae41dcf34ddf33de5b419b318cc82ebeca192c9184ed4b4e62fb7cb6132 WatchSource:0}: Error finding container 8aa86ae41dcf34ddf33de5b419b318cc82ebeca192c9184ed4b4e62fb7cb6132: Status 404 returned error can't find the container with id 8aa86ae41dcf34ddf33de5b419b318cc82ebeca192c9184ed4b4e62fb7cb6132 Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.820380 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.840834 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8crmt" event={"ID":"3405968f-173e-4ab2-a8ac-699fdaaad4d3","Type":"ContainerStarted","Data":"8aa86ae41dcf34ddf33de5b419b318cc82ebeca192c9184ed4b4e62fb7cb6132"} Mar 08 21:06:00 crc kubenswrapper[4885]: I0308 21:06:00.846185 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550066-6hpw5" event={"ID":"ddbd1248-e534-4251-b5a6-0505b7710e6e","Type":"ContainerStarted","Data":"f4a62a10f48b881adff6ba21994ad309b2fffdf31994bf17c62cbe92bb1d398d"} Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.400435 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e08bdfc6-0196-4b23-b6a5-b0c947f646e7" path="/var/lib/kubelet/pods/e08bdfc6-0196-4b23-b6a5-b0c947f646e7/volumes" Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.462692 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.462884 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.865585 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0878726-55a7-4b21-95ba-4dda1491dfdd","Type":"ContainerStarted","Data":"3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14"} Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.865658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0878726-55a7-4b21-95ba-4dda1491dfdd","Type":"ContainerStarted","Data":"fc7da7a57e0febad3191a8b8237c25b407f5b585b3c92502602c4310b7e751ee"} Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.868141 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8crmt" event={"ID":"3405968f-173e-4ab2-a8ac-699fdaaad4d3","Type":"ContainerStarted","Data":"8930c6f62095bcf638d08901a77c2574bea75083141756a22093eb3aac06abfe"} Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.898498 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.898468433 podStartE2EDuration="2.898468433s" podCreationTimestamp="2026-03-08 21:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:01.888602499 +0000 UTC m=+5663.284656522" watchObservedRunningTime="2026-03-08 21:06:01.898468433 +0000 UTC m=+5663.294522456" Mar 08 21:06:01 crc kubenswrapper[4885]: I0308 21:06:01.924335 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-8crmt" podStartSLOduration=2.924307781 podStartE2EDuration="2.924307781s" podCreationTimestamp="2026-03-08 21:05:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:01.908586212 +0000 UTC m=+5663.304640235" watchObservedRunningTime="2026-03-08 21:06:01.924307781 +0000 UTC m=+5663.320361834" Mar 08 21:06:02 crc kubenswrapper[4885]: I0308 21:06:02.880544 4885 generic.go:334] "Generic (PLEG): container finished" podID="ddbd1248-e534-4251-b5a6-0505b7710e6e" containerID="f4d740c9938b3b085cc1665a4c48f0e8e5909dace559f7eedf545ed929b6ffde" exitCode=0 Mar 08 21:06:02 crc kubenswrapper[4885]: I0308 21:06:02.880672 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550066-6hpw5" event={"ID":"ddbd1248-e534-4251-b5a6-0505b7710e6e","Type":"ContainerDied","Data":"f4d740c9938b3b085cc1665a4c48f0e8e5909dace559f7eedf545ed929b6ffde"} Mar 08 21:06:04 crc kubenswrapper[4885]: I0308 21:06:04.373008 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550066-6hpw5" Mar 08 21:06:04 crc kubenswrapper[4885]: I0308 21:06:04.475711 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4whn\" (UniqueName: \"kubernetes.io/projected/ddbd1248-e534-4251-b5a6-0505b7710e6e-kube-api-access-d4whn\") pod \"ddbd1248-e534-4251-b5a6-0505b7710e6e\" (UID: \"ddbd1248-e534-4251-b5a6-0505b7710e6e\") " Mar 08 21:06:04 crc kubenswrapper[4885]: I0308 21:06:04.488847 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddbd1248-e534-4251-b5a6-0505b7710e6e-kube-api-access-d4whn" (OuterVolumeSpecName: "kube-api-access-d4whn") pod "ddbd1248-e534-4251-b5a6-0505b7710e6e" (UID: "ddbd1248-e534-4251-b5a6-0505b7710e6e"). InnerVolumeSpecName "kube-api-access-d4whn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:04 crc kubenswrapper[4885]: I0308 21:06:04.579166 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4whn\" (UniqueName: \"kubernetes.io/projected/ddbd1248-e534-4251-b5a6-0505b7710e6e-kube-api-access-d4whn\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:04 crc kubenswrapper[4885]: I0308 21:06:04.912531 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550066-6hpw5" event={"ID":"ddbd1248-e534-4251-b5a6-0505b7710e6e","Type":"ContainerDied","Data":"f4a62a10f48b881adff6ba21994ad309b2fffdf31994bf17c62cbe92bb1d398d"} Mar 08 21:06:04 crc kubenswrapper[4885]: I0308 21:06:04.912596 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4a62a10f48b881adff6ba21994ad309b2fffdf31994bf17c62cbe92bb1d398d" Mar 08 21:06:04 crc kubenswrapper[4885]: I0308 21:06:04.912627 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550066-6hpw5" Mar 08 21:06:05 crc kubenswrapper[4885]: I0308 21:06:05.305650 4885 scope.go:117] "RemoveContainer" containerID="93dc9dbb2536460c751fb5259c99f80a1281e51794443a335faf96ba42cb4c59" Mar 08 21:06:05 crc kubenswrapper[4885]: I0308 21:06:05.353576 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 21:06:05 crc kubenswrapper[4885]: I0308 21:06:05.501852 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550060-ztwmh"] Mar 08 21:06:05 crc kubenswrapper[4885]: I0308 21:06:05.514980 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550060-ztwmh"] Mar 08 21:06:05 crc kubenswrapper[4885]: I0308 21:06:05.927584 4885 generic.go:334] "Generic (PLEG): container finished" podID="3405968f-173e-4ab2-a8ac-699fdaaad4d3" containerID="8930c6f62095bcf638d08901a77c2574bea75083141756a22093eb3aac06abfe" exitCode=0 Mar 08 21:06:05 crc kubenswrapper[4885]: I0308 21:06:05.927674 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8crmt" event={"ID":"3405968f-173e-4ab2-a8ac-699fdaaad4d3","Type":"ContainerDied","Data":"8930c6f62095bcf638d08901a77c2574bea75083141756a22093eb3aac06abfe"} Mar 08 21:06:06 crc kubenswrapper[4885]: I0308 21:06:06.463135 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:06:06 crc kubenswrapper[4885]: I0308 21:06:06.463610 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:06:06 crc kubenswrapper[4885]: I0308 21:06:06.495963 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 21:06:06 crc kubenswrapper[4885]: I0308 21:06:06.496062 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.358507 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.368243 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.392778 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0100b61-a97f-40b6-b8fd-91499667f3d9" path="/var/lib/kubelet/pods/a0100b61-a97f-40b6-b8fd-91499667f3d9/volumes" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.450019 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-config-data\") pod \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.450110 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-combined-ca-bundle\") pod \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.450227 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-scripts\") pod \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.450416 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66fz6\" (UniqueName: \"kubernetes.io/projected/3405968f-173e-4ab2-a8ac-699fdaaad4d3-kube-api-access-66fz6\") pod \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\" (UID: \"3405968f-173e-4ab2-a8ac-699fdaaad4d3\") " Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.457146 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-scripts" (OuterVolumeSpecName: "scripts") pod "3405968f-173e-4ab2-a8ac-699fdaaad4d3" (UID: "3405968f-173e-4ab2-a8ac-699fdaaad4d3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.461042 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3405968f-173e-4ab2-a8ac-699fdaaad4d3-kube-api-access-66fz6" (OuterVolumeSpecName: "kube-api-access-66fz6") pod "3405968f-173e-4ab2-a8ac-699fdaaad4d3" (UID: "3405968f-173e-4ab2-a8ac-699fdaaad4d3"). InnerVolumeSpecName "kube-api-access-66fz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.478851 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3405968f-173e-4ab2-a8ac-699fdaaad4d3" (UID: "3405968f-173e-4ab2-a8ac-699fdaaad4d3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.484044 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-config-data" (OuterVolumeSpecName: "config-data") pod "3405968f-173e-4ab2-a8ac-699fdaaad4d3" (UID: "3405968f-173e-4ab2-a8ac-699fdaaad4d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.552340 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.552386 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66fz6\" (UniqueName: \"kubernetes.io/projected/3405968f-173e-4ab2-a8ac-699fdaaad4d3-kube-api-access-66fz6\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.552401 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.552412 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3405968f-173e-4ab2-a8ac-699fdaaad4d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.628252 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.111:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.628377 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.110:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.628522 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.111:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.628272 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.110:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.950806 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"f2cd11e6d776229da9098efb4d94ce67906e2c52e2199ae80ec12db171f7eadf"} Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.953116 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8crmt" 
event={"ID":"3405968f-173e-4ab2-a8ac-699fdaaad4d3","Type":"ContainerDied","Data":"8aa86ae41dcf34ddf33de5b419b318cc82ebeca192c9184ed4b4e62fb7cb6132"} Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.953142 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aa86ae41dcf34ddf33de5b419b318cc82ebeca192c9184ed4b4e62fb7cb6132" Mar 08 21:06:07 crc kubenswrapper[4885]: I0308 21:06:07.953246 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8crmt" Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.197608 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.197856 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-log" containerID="cri-o://145acadb2aaec8dcd5165a895f8909b8c1b38463cef4ef90b51c7692b91fbfd5" gracePeriod=30 Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.198352 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-api" containerID="cri-o://756c1aa575f487dbda9fdf021512017307ddd58458c8521dba14b321c91f8b89" gracePeriod=30 Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.217832 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.218148 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="d0878726-55a7-4b21-95ba-4dda1491dfdd" containerName="nova-scheduler-scheduler" containerID="cri-o://3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14" gracePeriod=30 Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.231121 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.231511 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-log" containerID="cri-o://1c9a277208821dbe45c51d736ad5778842b9c47f876f42497004e47858db7d7a" gracePeriod=30 Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.231597 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-metadata" containerID="cri-o://cdeaf3122ca8dbc663268a6c75437ef3b2d630cf202295db223a82e3e219dba9" gracePeriod=30 Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.972868 4885 generic.go:334] "Generic (PLEG): container finished" podID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerID="145acadb2aaec8dcd5165a895f8909b8c1b38463cef4ef90b51c7692b91fbfd5" exitCode=143 Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.973149 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8a3be88-3b2d-4cd0-8987-443d67351acb","Type":"ContainerDied","Data":"145acadb2aaec8dcd5165a895f8909b8c1b38463cef4ef90b51c7692b91fbfd5"} Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.980595 4885 generic.go:334] "Generic (PLEG): container finished" podID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" 
containerID="1c9a277208821dbe45c51d736ad5778842b9c47f876f42497004e47858db7d7a" exitCode=143 Mar 08 21:06:08 crc kubenswrapper[4885]: I0308 21:06:08.980670 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41f915-4ddc-4f84-b402-67e3ff310d0e","Type":"ContainerDied","Data":"1c9a277208821dbe45c51d736ad5778842b9c47f876f42497004e47858db7d7a"} Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.549738 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.673938 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xxpc\" (UniqueName: \"kubernetes.io/projected/d0878726-55a7-4b21-95ba-4dda1491dfdd-kube-api-access-6xxpc\") pod \"d0878726-55a7-4b21-95ba-4dda1491dfdd\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.674583 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-combined-ca-bundle\") pod \"d0878726-55a7-4b21-95ba-4dda1491dfdd\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.674687 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-config-data\") pod \"d0878726-55a7-4b21-95ba-4dda1491dfdd\" (UID: \"d0878726-55a7-4b21-95ba-4dda1491dfdd\") " Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.682477 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0878726-55a7-4b21-95ba-4dda1491dfdd-kube-api-access-6xxpc" (OuterVolumeSpecName: "kube-api-access-6xxpc") pod "d0878726-55a7-4b21-95ba-4dda1491dfdd" (UID: "d0878726-55a7-4b21-95ba-4dda1491dfdd"). InnerVolumeSpecName "kube-api-access-6xxpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.710101 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0878726-55a7-4b21-95ba-4dda1491dfdd" (UID: "d0878726-55a7-4b21-95ba-4dda1491dfdd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.727458 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-config-data" (OuterVolumeSpecName: "config-data") pod "d0878726-55a7-4b21-95ba-4dda1491dfdd" (UID: "d0878726-55a7-4b21-95ba-4dda1491dfdd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.777556 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.777585 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0878726-55a7-4b21-95ba-4dda1491dfdd-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:12 crc kubenswrapper[4885]: I0308 21:06:12.777595 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xxpc\" (UniqueName: \"kubernetes.io/projected/d0878726-55a7-4b21-95ba-4dda1491dfdd-kube-api-access-6xxpc\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.034077 4885 generic.go:334] "Generic (PLEG): container finished" podID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerID="cdeaf3122ca8dbc663268a6c75437ef3b2d630cf202295db223a82e3e219dba9" exitCode=0 Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.034137 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41f915-4ddc-4f84-b402-67e3ff310d0e","Type":"ContainerDied","Data":"cdeaf3122ca8dbc663268a6c75437ef3b2d630cf202295db223a82e3e219dba9"} Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.037311 4885 generic.go:334] "Generic (PLEG): container finished" podID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerID="756c1aa575f487dbda9fdf021512017307ddd58458c8521dba14b321c91f8b89" exitCode=0 Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.037386 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8a3be88-3b2d-4cd0-8987-443d67351acb","Type":"ContainerDied","Data":"756c1aa575f487dbda9fdf021512017307ddd58458c8521dba14b321c91f8b89"} Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.039347 4885 generic.go:334] "Generic (PLEG): container finished" podID="d0878726-55a7-4b21-95ba-4dda1491dfdd" containerID="3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14" exitCode=0 Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.039384 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0878726-55a7-4b21-95ba-4dda1491dfdd","Type":"ContainerDied","Data":"3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14"} Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.039452 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.039484 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"d0878726-55a7-4b21-95ba-4dda1491dfdd","Type":"ContainerDied","Data":"fc7da7a57e0febad3191a8b8237c25b407f5b585b3c92502602c4310b7e751ee"} Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.039516 4885 scope.go:117] "RemoveContainer" containerID="3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.109137 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.125118 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.127642 4885 scope.go:117] "RemoveContainer" containerID="3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.130992 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:13 crc kubenswrapper[4885]: E0308 21:06:13.131498 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3405968f-173e-4ab2-a8ac-699fdaaad4d3" containerName="nova-manage" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.131517 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3405968f-173e-4ab2-a8ac-699fdaaad4d3" containerName="nova-manage" Mar 08 21:06:13 crc kubenswrapper[4885]: E0308 21:06:13.131546 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddbd1248-e534-4251-b5a6-0505b7710e6e" containerName="oc" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.131555 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddbd1248-e534-4251-b5a6-0505b7710e6e" containerName="oc" Mar 08 21:06:13 crc kubenswrapper[4885]: E0308 21:06:13.131573 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0878726-55a7-4b21-95ba-4dda1491dfdd" containerName="nova-scheduler-scheduler" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.131583 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0878726-55a7-4b21-95ba-4dda1491dfdd" containerName="nova-scheduler-scheduler" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.131826 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0878726-55a7-4b21-95ba-4dda1491dfdd" containerName="nova-scheduler-scheduler" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.131855 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3405968f-173e-4ab2-a8ac-699fdaaad4d3" containerName="nova-manage" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.131873 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddbd1248-e534-4251-b5a6-0505b7710e6e" containerName="oc" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.132842 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: E0308 21:06:13.133224 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14\": container with ID starting with 3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14 not found: ID does not exist" containerID="3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.133282 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14"} err="failed to get container status \"3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14\": rpc error: code = NotFound desc = could not find container \"3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14\": container with ID starting with 3629e351e5d6914a358479a07cf6fcc46f8cf5ad7f59c45af6b4317d8fc17a14 not found: ID does not exist" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.135691 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.137106 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.245376 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.251650 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.290978 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-config-data\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.291023 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sf6d\" (UniqueName: \"kubernetes.io/projected/6a29a091-3ebc-4dbb-b876-19892bedba02-kube-api-access-5sf6d\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.291246 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.383256 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0878726-55a7-4b21-95ba-4dda1491dfdd" path="/var/lib/kubelet/pods/d0878726-55a7-4b21-95ba-4dda1491dfdd/volumes" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392216 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-config-data\") pod \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " Mar 08 
21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392323 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj25c\" (UniqueName: \"kubernetes.io/projected/8d41f915-4ddc-4f84-b402-67e3ff310d0e-kube-api-access-sj25c\") pod \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392369 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d41f915-4ddc-4f84-b402-67e3ff310d0e-logs\") pod \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392432 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-combined-ca-bundle\") pod \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\" (UID: \"8d41f915-4ddc-4f84-b402-67e3ff310d0e\") " Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392464 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-combined-ca-bundle\") pod \"d8a3be88-3b2d-4cd0-8987-443d67351acb\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392535 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfkcz\" (UniqueName: \"kubernetes.io/projected/d8a3be88-3b2d-4cd0-8987-443d67351acb-kube-api-access-cfkcz\") pod \"d8a3be88-3b2d-4cd0-8987-443d67351acb\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392577 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8a3be88-3b2d-4cd0-8987-443d67351acb-logs\") pod \"d8a3be88-3b2d-4cd0-8987-443d67351acb\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392613 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-config-data\") pod \"d8a3be88-3b2d-4cd0-8987-443d67351acb\" (UID: \"d8a3be88-3b2d-4cd0-8987-443d67351acb\") " Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392874 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.392983 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-config-data\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.393007 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sf6d\" (UniqueName: \"kubernetes.io/projected/6a29a091-3ebc-4dbb-b876-19892bedba02-kube-api-access-5sf6d\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " 
pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.394365 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a3be88-3b2d-4cd0-8987-443d67351acb-logs" (OuterVolumeSpecName: "logs") pod "d8a3be88-3b2d-4cd0-8987-443d67351acb" (UID: "d8a3be88-3b2d-4cd0-8987-443d67351acb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.394611 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d41f915-4ddc-4f84-b402-67e3ff310d0e-logs" (OuterVolumeSpecName: "logs") pod "8d41f915-4ddc-4f84-b402-67e3ff310d0e" (UID: "8d41f915-4ddc-4f84-b402-67e3ff310d0e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.400424 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d41f915-4ddc-4f84-b402-67e3ff310d0e-kube-api-access-sj25c" (OuterVolumeSpecName: "kube-api-access-sj25c") pod "8d41f915-4ddc-4f84-b402-67e3ff310d0e" (UID: "8d41f915-4ddc-4f84-b402-67e3ff310d0e"). InnerVolumeSpecName "kube-api-access-sj25c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.400874 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a3be88-3b2d-4cd0-8987-443d67351acb-kube-api-access-cfkcz" (OuterVolumeSpecName: "kube-api-access-cfkcz") pod "d8a3be88-3b2d-4cd0-8987-443d67351acb" (UID: "d8a3be88-3b2d-4cd0-8987-443d67351acb"). InnerVolumeSpecName "kube-api-access-cfkcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.403188 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-config-data\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.413456 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.417403 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sf6d\" (UniqueName: \"kubernetes.io/projected/6a29a091-3ebc-4dbb-b876-19892bedba02-kube-api-access-5sf6d\") pod \"nova-scheduler-0\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " pod="openstack/nova-scheduler-0" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.427857 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-config-data" (OuterVolumeSpecName: "config-data") pod "8d41f915-4ddc-4f84-b402-67e3ff310d0e" (UID: "8d41f915-4ddc-4f84-b402-67e3ff310d0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.428197 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8d41f915-4ddc-4f84-b402-67e3ff310d0e" (UID: "8d41f915-4ddc-4f84-b402-67e3ff310d0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.429482 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-config-data" (OuterVolumeSpecName: "config-data") pod "d8a3be88-3b2d-4cd0-8987-443d67351acb" (UID: "d8a3be88-3b2d-4cd0-8987-443d67351acb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.440147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8a3be88-3b2d-4cd0-8987-443d67351acb" (UID: "d8a3be88-3b2d-4cd0-8987-443d67351acb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.494356 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfkcz\" (UniqueName: \"kubernetes.io/projected/d8a3be88-3b2d-4cd0-8987-443d67351acb-kube-api-access-cfkcz\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.494390 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8a3be88-3b2d-4cd0-8987-443d67351acb-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.494403 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.494417 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.494429 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj25c\" (UniqueName: \"kubernetes.io/projected/8d41f915-4ddc-4f84-b402-67e3ff310d0e-kube-api-access-sj25c\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.494441 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8d41f915-4ddc-4f84-b402-67e3ff310d0e-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.494453 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d41f915-4ddc-4f84-b402-67e3ff310d0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 21:06:13.494464 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8a3be88-3b2d-4cd0-8987-443d67351acb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:13 crc kubenswrapper[4885]: I0308 
21:06:13.539392 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.056140 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.056211 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"8d41f915-4ddc-4f84-b402-67e3ff310d0e","Type":"ContainerDied","Data":"20924542c83be387f0ae030282c946d65a2f7a96ddb827f2a18a8e4ab1e8eddb"} Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.056773 4885 scope.go:117] "RemoveContainer" containerID="cdeaf3122ca8dbc663268a6c75437ef3b2d630cf202295db223a82e3e219dba9" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.059275 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d8a3be88-3b2d-4cd0-8987-443d67351acb","Type":"ContainerDied","Data":"8c02c0dd36a35b639dcbf4fdeb957cafcaf6477b4aefd990c59836fff31381b3"} Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.059396 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: W0308 21:06:14.077375 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a29a091_3ebc_4dbb_b876_19892bedba02.slice/crio-3c576770d28108955fb30624b271f23df81e0516b1d08085cc88a640253a2d1e WatchSource:0}: Error finding container 3c576770d28108955fb30624b271f23df81e0516b1d08085cc88a640253a2d1e: Status 404 returned error can't find the container with id 3c576770d28108955fb30624b271f23df81e0516b1d08085cc88a640253a2d1e Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.077456 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.113417 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.122054 4885 scope.go:117] "RemoveContainer" containerID="1c9a277208821dbe45c51d736ad5778842b9c47f876f42497004e47858db7d7a" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.134832 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.152801 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.169725 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.185048 4885 scope.go:117] "RemoveContainer" containerID="756c1aa575f487dbda9fdf021512017307ddd58458c8521dba14b321c91f8b89" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.190734 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: E0308 21:06:14.191225 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-api" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191240 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-api" Mar 08 21:06:14 crc kubenswrapper[4885]: E0308 21:06:14.191267 4885 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-metadata" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191276 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-metadata" Mar 08 21:06:14 crc kubenswrapper[4885]: E0308 21:06:14.191290 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-log" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191299 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-log" Mar 08 21:06:14 crc kubenswrapper[4885]: E0308 21:06:14.191318 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-log" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191327 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-log" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191533 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-metadata" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191551 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-log" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191566 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" containerName="nova-api-api" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.191585 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" containerName="nova-metadata-log" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.192643 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.195347 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.202966 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.227992 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.229868 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.231520 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.237990 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.266058 4885 scope.go:117] "RemoveContainer" containerID="145acadb2aaec8dcd5165a895f8909b8c1b38463cef4ef90b51c7692b91fbfd5" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.321102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74c4d\" (UniqueName: \"kubernetes.io/projected/994b00da-2d97-4508-8f36-b517afab98e1-kube-api-access-74c4d\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.321154 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-config-data\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.321176 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/994b00da-2d97-4508-8f36-b517afab98e1-logs\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.321446 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422630 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422683 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-config-data\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422756 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpccf\" (UniqueName: \"kubernetes.io/projected/3195111b-b266-425b-82da-98f3d0a29f0e-kube-api-access-vpccf\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422782 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" 
Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422818 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3195111b-b266-425b-82da-98f3d0a29f0e-logs\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422854 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74c4d\" (UniqueName: \"kubernetes.io/projected/994b00da-2d97-4508-8f36-b517afab98e1-kube-api-access-74c4d\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422897 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-config-data\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.422941 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/994b00da-2d97-4508-8f36-b517afab98e1-logs\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.423570 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/994b00da-2d97-4508-8f36-b517afab98e1-logs\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.426958 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.429810 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-config-data\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.447465 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74c4d\" (UniqueName: \"kubernetes.io/projected/994b00da-2d97-4508-8f36-b517afab98e1-kube-api-access-74c4d\") pod \"nova-metadata-0\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.524662 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpccf\" (UniqueName: \"kubernetes.io/projected/3195111b-b266-425b-82da-98f3d0a29f0e-kube-api-access-vpccf\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.524894 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " 
pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.525029 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3195111b-b266-425b-82da-98f3d0a29f0e-logs\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.525197 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-config-data\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.525586 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3195111b-b266-425b-82da-98f3d0a29f0e-logs\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.528570 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.529164 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-config-data\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.547166 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpccf\" (UniqueName: \"kubernetes.io/projected/3195111b-b266-425b-82da-98f3d0a29f0e-kube-api-access-vpccf\") pod \"nova-api-0\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.567103 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.635459 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:06:14 crc kubenswrapper[4885]: I0308 21:06:14.992064 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.038898 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.073148 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a29a091-3ebc-4dbb-b876-19892bedba02","Type":"ContainerStarted","Data":"6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd"} Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.073202 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a29a091-3ebc-4dbb-b876-19892bedba02","Type":"ContainerStarted","Data":"3c576770d28108955fb30624b271f23df81e0516b1d08085cc88a640253a2d1e"} Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.075321 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"994b00da-2d97-4508-8f36-b517afab98e1","Type":"ContainerStarted","Data":"ed98b0990237ad316a273718be6c6f8f3198828e148541a9840c7e6321b7e7da"} Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.076252 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3195111b-b266-425b-82da-98f3d0a29f0e","Type":"ContainerStarted","Data":"77ebc86a187e7f428857986b384dc697b96b7685acfe2d360f9674aa240afe23"} Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.096368 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.096351423 podStartE2EDuration="2.096351423s" podCreationTimestamp="2026-03-08 21:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:15.08952535 +0000 UTC m=+5676.485579383" watchObservedRunningTime="2026-03-08 21:06:15.096351423 +0000 UTC m=+5676.492405446" Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.381631 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d41f915-4ddc-4f84-b402-67e3ff310d0e" path="/var/lib/kubelet/pods/8d41f915-4ddc-4f84-b402-67e3ff310d0e/volumes" Mar 08 21:06:15 crc kubenswrapper[4885]: I0308 21:06:15.383170 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8a3be88-3b2d-4cd0-8987-443d67351acb" path="/var/lib/kubelet/pods/d8a3be88-3b2d-4cd0-8987-443d67351acb/volumes" Mar 08 21:06:16 crc kubenswrapper[4885]: I0308 21:06:16.091082 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"994b00da-2d97-4508-8f36-b517afab98e1","Type":"ContainerStarted","Data":"0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49"} Mar 08 21:06:16 crc kubenswrapper[4885]: I0308 21:06:16.091151 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"994b00da-2d97-4508-8f36-b517afab98e1","Type":"ContainerStarted","Data":"9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b"} Mar 08 21:06:16 crc kubenswrapper[4885]: I0308 21:06:16.095801 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3195111b-b266-425b-82da-98f3d0a29f0e","Type":"ContainerStarted","Data":"2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895"} Mar 08 21:06:16 crc kubenswrapper[4885]: 
I0308 21:06:16.095866 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3195111b-b266-425b-82da-98f3d0a29f0e","Type":"ContainerStarted","Data":"d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324"} Mar 08 21:06:16 crc kubenswrapper[4885]: I0308 21:06:16.118475 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.118460096 podStartE2EDuration="2.118460096s" podCreationTimestamp="2026-03-08 21:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:16.113257777 +0000 UTC m=+5677.509311800" watchObservedRunningTime="2026-03-08 21:06:16.118460096 +0000 UTC m=+5677.514514109" Mar 08 21:06:16 crc kubenswrapper[4885]: I0308 21:06:16.162585 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.162569582 podStartE2EDuration="2.162569582s" podCreationTimestamp="2026-03-08 21:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:16.157199429 +0000 UTC m=+5677.553253462" watchObservedRunningTime="2026-03-08 21:06:16.162569582 +0000 UTC m=+5677.558623605" Mar 08 21:06:18 crc kubenswrapper[4885]: I0308 21:06:18.540246 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 21:06:19 crc kubenswrapper[4885]: I0308 21:06:19.569024 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:06:19 crc kubenswrapper[4885]: I0308 21:06:19.569075 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.607872 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5zrkw"] Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.611290 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.655147 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5zrkw"] Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.685147 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-catalog-content\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.685421 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-utilities\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.685710 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxp2w\" (UniqueName: \"kubernetes.io/projected/2a0b0e8c-3002-4dcd-9172-998602ca9be9-kube-api-access-bxp2w\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.787195 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxp2w\" (UniqueName: \"kubernetes.io/projected/2a0b0e8c-3002-4dcd-9172-998602ca9be9-kube-api-access-bxp2w\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.787300 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-catalog-content\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.787351 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-utilities\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.787900 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-utilities\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.787972 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-catalog-content\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.827796 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bxp2w\" (UniqueName: \"kubernetes.io/projected/2a0b0e8c-3002-4dcd-9172-998602ca9be9-kube-api-access-bxp2w\") pod \"redhat-operators-5zrkw\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:21 crc kubenswrapper[4885]: I0308 21:06:21.969723 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:22 crc kubenswrapper[4885]: I0308 21:06:22.471905 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5zrkw"] Mar 08 21:06:23 crc kubenswrapper[4885]: I0308 21:06:23.182544 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerID="4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277" exitCode=0 Mar 08 21:06:23 crc kubenswrapper[4885]: I0308 21:06:23.182706 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5zrkw" event={"ID":"2a0b0e8c-3002-4dcd-9172-998602ca9be9","Type":"ContainerDied","Data":"4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277"} Mar 08 21:06:23 crc kubenswrapper[4885]: I0308 21:06:23.183877 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5zrkw" event={"ID":"2a0b0e8c-3002-4dcd-9172-998602ca9be9","Type":"ContainerStarted","Data":"80e3a927b22816a9112bee41b9a53b04401331287a3f011a1dc86a4440d4689e"} Mar 08 21:06:23 crc kubenswrapper[4885]: I0308 21:06:23.540640 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 21:06:23 crc kubenswrapper[4885]: I0308 21:06:23.572196 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 21:06:24 crc kubenswrapper[4885]: I0308 21:06:24.196362 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5zrkw" event={"ID":"2a0b0e8c-3002-4dcd-9172-998602ca9be9","Type":"ContainerStarted","Data":"43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11"} Mar 08 21:06:24 crc kubenswrapper[4885]: I0308 21:06:24.261679 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 21:06:24 crc kubenswrapper[4885]: I0308 21:06:24.569197 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:06:24 crc kubenswrapper[4885]: I0308 21:06:24.569267 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:06:24 crc kubenswrapper[4885]: I0308 21:06:24.636194 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 21:06:24 crc kubenswrapper[4885]: I0308 21:06:24.636265 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 21:06:25 crc kubenswrapper[4885]: I0308 21:06:25.206965 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerID="43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11" exitCode=0 Mar 08 21:06:25 crc kubenswrapper[4885]: I0308 21:06:25.208420 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5zrkw" 
event={"ID":"2a0b0e8c-3002-4dcd-9172-998602ca9be9","Type":"ContainerDied","Data":"43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11"} Mar 08 21:06:25 crc kubenswrapper[4885]: I0308 21:06:25.651522 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.116:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:25 crc kubenswrapper[4885]: I0308 21:06:25.651579 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.116:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:25 crc kubenswrapper[4885]: I0308 21:06:25.733233 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.117:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:25 crc kubenswrapper[4885]: I0308 21:06:25.733649 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.117:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:06:26 crc kubenswrapper[4885]: I0308 21:06:26.218507 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5zrkw" event={"ID":"2a0b0e8c-3002-4dcd-9172-998602ca9be9","Type":"ContainerStarted","Data":"32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13"} Mar 08 21:06:26 crc kubenswrapper[4885]: I0308 21:06:26.249241 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5zrkw" podStartSLOduration=2.831514046 podStartE2EDuration="5.249222935s" podCreationTimestamp="2026-03-08 21:06:21 +0000 UTC" firstStartedPulling="2026-03-08 21:06:23.185078435 +0000 UTC m=+5684.581132458" lastFinishedPulling="2026-03-08 21:06:25.602787294 +0000 UTC m=+5686.998841347" observedRunningTime="2026-03-08 21:06:26.238776957 +0000 UTC m=+5687.634830980" watchObservedRunningTime="2026-03-08 21:06:26.249222935 +0000 UTC m=+5687.645276958" Mar 08 21:06:31 crc kubenswrapper[4885]: I0308 21:06:31.970797 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:31 crc kubenswrapper[4885]: I0308 21:06:31.971412 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:33 crc kubenswrapper[4885]: I0308 21:06:33.041702 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5zrkw" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="registry-server" probeResult="failure" output=< Mar 08 21:06:33 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 21:06:33 crc kubenswrapper[4885]: > Mar 08 21:06:34 crc kubenswrapper[4885]: I0308 21:06:34.570912 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 21:06:34 crc 
kubenswrapper[4885]: I0308 21:06:34.575975 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 21:06:34 crc kubenswrapper[4885]: I0308 21:06:34.577335 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 21:06:34 crc kubenswrapper[4885]: I0308 21:06:34.641103 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 21:06:34 crc kubenswrapper[4885]: I0308 21:06:34.641906 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 21:06:34 crc kubenswrapper[4885]: I0308 21:06:34.642078 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 21:06:34 crc kubenswrapper[4885]: I0308 21:06:34.644658 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.306720 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.309647 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.311054 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.581340 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cf6665877-kr6fn"] Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.582974 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.597309 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cf6665877-kr6fn"] Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.754482 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-dns-svc\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.754535 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-config\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.754638 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdc7k\" (UniqueName: \"kubernetes.io/projected/2ed9193c-0c46-47cb-af24-c8415837c19b-kube-api-access-pdc7k\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.754788 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-sb\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " 
pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.754844 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-nb\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.856780 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-dns-svc\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.856841 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-config\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.856868 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdc7k\" (UniqueName: \"kubernetes.io/projected/2ed9193c-0c46-47cb-af24-c8415837c19b-kube-api-access-pdc7k\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.856907 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-sb\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.856943 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-nb\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.858184 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-config\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.858422 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-dns-svc\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.858634 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-nb\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 
21:06:35.858635 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-sb\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.883972 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdc7k\" (UniqueName: \"kubernetes.io/projected/2ed9193c-0c46-47cb-af24-c8415837c19b-kube-api-access-pdc7k\") pod \"dnsmasq-dns-5cf6665877-kr6fn\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:35 crc kubenswrapper[4885]: I0308 21:06:35.920465 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:36 crc kubenswrapper[4885]: I0308 21:06:36.484029 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cf6665877-kr6fn"] Mar 08 21:06:37 crc kubenswrapper[4885]: I0308 21:06:37.336461 4885 generic.go:334] "Generic (PLEG): container finished" podID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerID="44db3e80d53ecaa3d76c25ae2231f68ed9a8e2480156df67bfa8787c436f51c1" exitCode=0 Mar 08 21:06:37 crc kubenswrapper[4885]: I0308 21:06:37.336576 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" event={"ID":"2ed9193c-0c46-47cb-af24-c8415837c19b","Type":"ContainerDied","Data":"44db3e80d53ecaa3d76c25ae2231f68ed9a8e2480156df67bfa8787c436f51c1"} Mar 08 21:06:37 crc kubenswrapper[4885]: I0308 21:06:37.337007 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" event={"ID":"2ed9193c-0c46-47cb-af24-c8415837c19b","Type":"ContainerStarted","Data":"13a5f96703742b15ab41ce5ca4bff51ff0ff5f629fdccca2879c9831c1547b90"} Mar 08 21:06:38 crc kubenswrapper[4885]: I0308 21:06:38.348356 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" event={"ID":"2ed9193c-0c46-47cb-af24-c8415837c19b","Type":"ContainerStarted","Data":"750d78661c34c956acafb525ed97e2c26316d92d4d7f7130df1013f1f9bc8ad8"} Mar 08 21:06:38 crc kubenswrapper[4885]: I0308 21:06:38.348804 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:38 crc kubenswrapper[4885]: I0308 21:06:38.373610 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" podStartSLOduration=3.373588332 podStartE2EDuration="3.373588332s" podCreationTimestamp="2026-03-08 21:06:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:38.367711015 +0000 UTC m=+5699.763765038" watchObservedRunningTime="2026-03-08 21:06:38.373588332 +0000 UTC m=+5699.769642355" Mar 08 21:06:42 crc kubenswrapper[4885]: I0308 21:06:42.041620 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:42 crc kubenswrapper[4885]: I0308 21:06:42.111441 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:42 crc kubenswrapper[4885]: I0308 21:06:42.281333 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-5zrkw"] Mar 08 21:06:43 crc kubenswrapper[4885]: I0308 21:06:43.396689 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5zrkw" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="registry-server" containerID="cri-o://32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13" gracePeriod=2 Mar 08 21:06:43 crc kubenswrapper[4885]: I0308 21:06:43.985187 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.148616 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-catalog-content\") pod \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.148742 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxp2w\" (UniqueName: \"kubernetes.io/projected/2a0b0e8c-3002-4dcd-9172-998602ca9be9-kube-api-access-bxp2w\") pod \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.148885 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-utilities\") pod \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\" (UID: \"2a0b0e8c-3002-4dcd-9172-998602ca9be9\") " Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.149700 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-utilities" (OuterVolumeSpecName: "utilities") pod "2a0b0e8c-3002-4dcd-9172-998602ca9be9" (UID: "2a0b0e8c-3002-4dcd-9172-998602ca9be9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.164273 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a0b0e8c-3002-4dcd-9172-998602ca9be9-kube-api-access-bxp2w" (OuterVolumeSpecName: "kube-api-access-bxp2w") pod "2a0b0e8c-3002-4dcd-9172-998602ca9be9" (UID: "2a0b0e8c-3002-4dcd-9172-998602ca9be9"). InnerVolumeSpecName "kube-api-access-bxp2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.250709 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxp2w\" (UniqueName: \"kubernetes.io/projected/2a0b0e8c-3002-4dcd-9172-998602ca9be9-kube-api-access-bxp2w\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.250748 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.292327 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a0b0e8c-3002-4dcd-9172-998602ca9be9" (UID: "2a0b0e8c-3002-4dcd-9172-998602ca9be9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.352109 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a0b0e8c-3002-4dcd-9172-998602ca9be9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.410558 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerID="32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13" exitCode=0 Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.410602 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5zrkw" event={"ID":"2a0b0e8c-3002-4dcd-9172-998602ca9be9","Type":"ContainerDied","Data":"32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13"} Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.410639 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5zrkw" event={"ID":"2a0b0e8c-3002-4dcd-9172-998602ca9be9","Type":"ContainerDied","Data":"80e3a927b22816a9112bee41b9a53b04401331287a3f011a1dc86a4440d4689e"} Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.410654 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5zrkw" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.410664 4885 scope.go:117] "RemoveContainer" containerID="32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.438262 4885 scope.go:117] "RemoveContainer" containerID="43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.452359 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5zrkw"] Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.459778 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5zrkw"] Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.478133 4885 scope.go:117] "RemoveContainer" containerID="4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.536877 4885 scope.go:117] "RemoveContainer" containerID="32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13" Mar 08 21:06:44 crc kubenswrapper[4885]: E0308 21:06:44.537269 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13\": container with ID starting with 32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13 not found: ID does not exist" containerID="32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.537410 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13"} err="failed to get container status \"32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13\": rpc error: code = NotFound desc = could not find container \"32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13\": container with ID starting with 32b29f4e5a93c674f5034273a72922629175901971a826a5f199f255fffc0b13 not found: ID does not exist" Mar 08 21:06:44 crc 
kubenswrapper[4885]: I0308 21:06:44.537494 4885 scope.go:117] "RemoveContainer" containerID="43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11" Mar 08 21:06:44 crc kubenswrapper[4885]: E0308 21:06:44.537882 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11\": container with ID starting with 43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11 not found: ID does not exist" containerID="43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.537984 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11"} err="failed to get container status \"43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11\": rpc error: code = NotFound desc = could not find container \"43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11\": container with ID starting with 43195d76ebf8587a845ea94be9a8454a4ea7224d1749d141168ff7404a35da11 not found: ID does not exist" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.538056 4885 scope.go:117] "RemoveContainer" containerID="4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277" Mar 08 21:06:44 crc kubenswrapper[4885]: E0308 21:06:44.538548 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277\": container with ID starting with 4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277 not found: ID does not exist" containerID="4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277" Mar 08 21:06:44 crc kubenswrapper[4885]: I0308 21:06:44.538674 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277"} err="failed to get container status \"4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277\": rpc error: code = NotFound desc = could not find container \"4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277\": container with ID starting with 4036513b5058b2bc9d41126feba9a5285ff367f788150d2d39b1cdcfe2347277 not found: ID does not exist" Mar 08 21:06:45 crc kubenswrapper[4885]: I0308 21:06:45.379054 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" path="/var/lib/kubelet/pods/2a0b0e8c-3002-4dcd-9172-998602ca9be9/volumes" Mar 08 21:06:45 crc kubenswrapper[4885]: I0308 21:06:45.922219 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.026301 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f86679947-h9j4z"] Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.026603 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" podUID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerName="dnsmasq-dns" containerID="cri-o://53391697a2f4f90c5ca24bdf716499111736339ebf0bad4e24ae2562e126d17f" gracePeriod=10 Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.435944 4885 generic.go:334] "Generic (PLEG): container finished" 
podID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerID="53391697a2f4f90c5ca24bdf716499111736339ebf0bad4e24ae2562e126d17f" exitCode=0 Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.436145 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" event={"ID":"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b","Type":"ContainerDied","Data":"53391697a2f4f90c5ca24bdf716499111736339ebf0bad4e24ae2562e126d17f"} Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.517258 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.589133 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-sb\") pod \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.589183 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-nb\") pod \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.589234 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-dns-svc\") pod \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.589293 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grkm9\" (UniqueName: \"kubernetes.io/projected/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-kube-api-access-grkm9\") pod \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.589421 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-config\") pod \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\" (UID: \"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b\") " Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.600629 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-kube-api-access-grkm9" (OuterVolumeSpecName: "kube-api-access-grkm9") pod "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" (UID: "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b"). InnerVolumeSpecName "kube-api-access-grkm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.635034 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" (UID: "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.650891 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" (UID: "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.663687 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-config" (OuterVolumeSpecName: "config") pod "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" (UID: "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.687667 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" (UID: "1b79cc1d-e5e7-4fb2-a33a-d3041e17384b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.691356 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.691386 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.691397 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.691409 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:46 crc kubenswrapper[4885]: I0308 21:06:46.691420 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grkm9\" (UniqueName: \"kubernetes.io/projected/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b-kube-api-access-grkm9\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:47 crc kubenswrapper[4885]: I0308 21:06:47.447114 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" event={"ID":"1b79cc1d-e5e7-4fb2-a33a-d3041e17384b","Type":"ContainerDied","Data":"4c375bccd82478c29984cb004edc82a922cd3f66f47be6d3e4038a2ff4cf6623"} Mar 08 21:06:47 crc kubenswrapper[4885]: I0308 21:06:47.447175 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f86679947-h9j4z" Mar 08 21:06:47 crc kubenswrapper[4885]: I0308 21:06:47.447178 4885 scope.go:117] "RemoveContainer" containerID="53391697a2f4f90c5ca24bdf716499111736339ebf0bad4e24ae2562e126d17f" Mar 08 21:06:47 crc kubenswrapper[4885]: I0308 21:06:47.473626 4885 scope.go:117] "RemoveContainer" containerID="364b7c9836de2ac4dcd0074d16339cb1e1fe0eee56d6ea6aba2ce5bd28ef8b4b" Mar 08 21:06:47 crc kubenswrapper[4885]: I0308 21:06:47.477121 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f86679947-h9j4z"] Mar 08 21:06:47 crc kubenswrapper[4885]: I0308 21:06:47.498044 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f86679947-h9j4z"] Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.350699 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-cclgv"] Mar 08 21:06:48 crc kubenswrapper[4885]: E0308 21:06:48.351797 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="extract-utilities" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.351823 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="extract-utilities" Mar 08 21:06:48 crc kubenswrapper[4885]: E0308 21:06:48.351839 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="extract-content" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.351848 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="extract-content" Mar 08 21:06:48 crc kubenswrapper[4885]: E0308 21:06:48.351860 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="registry-server" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.351868 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="registry-server" Mar 08 21:06:48 crc kubenswrapper[4885]: E0308 21:06:48.351892 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerName="dnsmasq-dns" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.351899 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerName="dnsmasq-dns" Mar 08 21:06:48 crc kubenswrapper[4885]: E0308 21:06:48.352409 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerName="init" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.352423 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerName="init" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.352629 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a0b0e8c-3002-4dcd-9172-998602ca9be9" containerName="registry-server" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.352652 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" containerName="dnsmasq-dns" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.353552 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.362931 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cclgv"] Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.425392 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96e29afc-72d1-4b29-9528-1ed61feed290-operator-scripts\") pod \"cinder-db-create-cclgv\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.425494 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtwh8\" (UniqueName: \"kubernetes.io/projected/96e29afc-72d1-4b29-9528-1ed61feed290-kube-api-access-mtwh8\") pod \"cinder-db-create-cclgv\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.480015 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-2e95-account-create-update-thqbx"] Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.481115 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.495038 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.499126 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2e95-account-create-update-thqbx"] Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.529558 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96e29afc-72d1-4b29-9528-1ed61feed290-operator-scripts\") pod \"cinder-db-create-cclgv\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.529607 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtwh8\" (UniqueName: \"kubernetes.io/projected/96e29afc-72d1-4b29-9528-1ed61feed290-kube-api-access-mtwh8\") pod \"cinder-db-create-cclgv\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.531262 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96e29afc-72d1-4b29-9528-1ed61feed290-operator-scripts\") pod \"cinder-db-create-cclgv\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.580693 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtwh8\" (UniqueName: \"kubernetes.io/projected/96e29afc-72d1-4b29-9528-1ed61feed290-kube-api-access-mtwh8\") pod \"cinder-db-create-cclgv\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.634698 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz558\" (UniqueName: \"kubernetes.io/projected/3159e4ac-64da-47ba-9c70-b23214e8b8ad-kube-api-access-vz558\") 
pod \"cinder-2e95-account-create-update-thqbx\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.635010 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3159e4ac-64da-47ba-9c70-b23214e8b8ad-operator-scripts\") pod \"cinder-2e95-account-create-update-thqbx\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.678966 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.736997 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz558\" (UniqueName: \"kubernetes.io/projected/3159e4ac-64da-47ba-9c70-b23214e8b8ad-kube-api-access-vz558\") pod \"cinder-2e95-account-create-update-thqbx\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.737071 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3159e4ac-64da-47ba-9c70-b23214e8b8ad-operator-scripts\") pod \"cinder-2e95-account-create-update-thqbx\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.737701 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3159e4ac-64da-47ba-9c70-b23214e8b8ad-operator-scripts\") pod \"cinder-2e95-account-create-update-thqbx\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.754330 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz558\" (UniqueName: \"kubernetes.io/projected/3159e4ac-64da-47ba-9c70-b23214e8b8ad-kube-api-access-vz558\") pod \"cinder-2e95-account-create-update-thqbx\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:48 crc kubenswrapper[4885]: I0308 21:06:48.800394 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:49 crc kubenswrapper[4885]: W0308 21:06:49.113722 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96e29afc_72d1_4b29_9528_1ed61feed290.slice/crio-0e7df792d25acaa25d83f037bb13acb74103e06430c54655ed73c3fa1fc654fa WatchSource:0}: Error finding container 0e7df792d25acaa25d83f037bb13acb74103e06430c54655ed73c3fa1fc654fa: Status 404 returned error can't find the container with id 0e7df792d25acaa25d83f037bb13acb74103e06430c54655ed73c3fa1fc654fa Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.119380 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cclgv"] Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.253810 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-2e95-account-create-update-thqbx"] Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.382689 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b79cc1d-e5e7-4fb2-a33a-d3041e17384b" path="/var/lib/kubelet/pods/1b79cc1d-e5e7-4fb2-a33a-d3041e17384b/volumes" Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.468439 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2e95-account-create-update-thqbx" event={"ID":"3159e4ac-64da-47ba-9c70-b23214e8b8ad","Type":"ContainerStarted","Data":"d5baab592079e81c7a0f9f2d2a048f773b79e232fa80526c94c402b1d3d147c9"} Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.468486 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2e95-account-create-update-thqbx" event={"ID":"3159e4ac-64da-47ba-9c70-b23214e8b8ad","Type":"ContainerStarted","Data":"2592b482160778d89ae89f082b4c5d807f6749aa7cd668c503d444d47635c41e"} Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.470769 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cclgv" event={"ID":"96e29afc-72d1-4b29-9528-1ed61feed290","Type":"ContainerStarted","Data":"488ce5cd91e5332723ee20f8e8bbaf7d336b87f5ba2cbf84286a0e234f08758e"} Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.470792 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cclgv" event={"ID":"96e29afc-72d1-4b29-9528-1ed61feed290","Type":"ContainerStarted","Data":"0e7df792d25acaa25d83f037bb13acb74103e06430c54655ed73c3fa1fc654fa"} Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.486948 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-2e95-account-create-update-thqbx" podStartSLOduration=1.48692895 podStartE2EDuration="1.48692895s" podCreationTimestamp="2026-03-08 21:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:49.486314444 +0000 UTC m=+5710.882368467" watchObservedRunningTime="2026-03-08 21:06:49.48692895 +0000 UTC m=+5710.882982973" Mar 08 21:06:49 crc kubenswrapper[4885]: I0308 21:06:49.508498 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-cclgv" podStartSLOduration=1.508480355 podStartE2EDuration="1.508480355s" podCreationTimestamp="2026-03-08 21:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:49.505351652 +0000 UTC m=+5710.901405675" 
watchObservedRunningTime="2026-03-08 21:06:49.508480355 +0000 UTC m=+5710.904534378" Mar 08 21:06:50 crc kubenswrapper[4885]: I0308 21:06:50.483599 4885 generic.go:334] "Generic (PLEG): container finished" podID="96e29afc-72d1-4b29-9528-1ed61feed290" containerID="488ce5cd91e5332723ee20f8e8bbaf7d336b87f5ba2cbf84286a0e234f08758e" exitCode=0 Mar 08 21:06:50 crc kubenswrapper[4885]: I0308 21:06:50.483658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cclgv" event={"ID":"96e29afc-72d1-4b29-9528-1ed61feed290","Type":"ContainerDied","Data":"488ce5cd91e5332723ee20f8e8bbaf7d336b87f5ba2cbf84286a0e234f08758e"} Mar 08 21:06:50 crc kubenswrapper[4885]: I0308 21:06:50.488299 4885 generic.go:334] "Generic (PLEG): container finished" podID="3159e4ac-64da-47ba-9c70-b23214e8b8ad" containerID="d5baab592079e81c7a0f9f2d2a048f773b79e232fa80526c94c402b1d3d147c9" exitCode=0 Mar 08 21:06:50 crc kubenswrapper[4885]: I0308 21:06:50.488380 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2e95-account-create-update-thqbx" event={"ID":"3159e4ac-64da-47ba-9c70-b23214e8b8ad","Type":"ContainerDied","Data":"d5baab592079e81c7a0f9f2d2a048f773b79e232fa80526c94c402b1d3d147c9"} Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.069334 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.075126 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.212077 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtwh8\" (UniqueName: \"kubernetes.io/projected/96e29afc-72d1-4b29-9528-1ed61feed290-kube-api-access-mtwh8\") pod \"96e29afc-72d1-4b29-9528-1ed61feed290\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.212155 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96e29afc-72d1-4b29-9528-1ed61feed290-operator-scripts\") pod \"96e29afc-72d1-4b29-9528-1ed61feed290\" (UID: \"96e29afc-72d1-4b29-9528-1ed61feed290\") " Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.212405 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz558\" (UniqueName: \"kubernetes.io/projected/3159e4ac-64da-47ba-9c70-b23214e8b8ad-kube-api-access-vz558\") pod \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.212441 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3159e4ac-64da-47ba-9c70-b23214e8b8ad-operator-scripts\") pod \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\" (UID: \"3159e4ac-64da-47ba-9c70-b23214e8b8ad\") " Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.213030 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96e29afc-72d1-4b29-9528-1ed61feed290-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "96e29afc-72d1-4b29-9528-1ed61feed290" (UID: "96e29afc-72d1-4b29-9528-1ed61feed290"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.213329 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3159e4ac-64da-47ba-9c70-b23214e8b8ad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3159e4ac-64da-47ba-9c70-b23214e8b8ad" (UID: "3159e4ac-64da-47ba-9c70-b23214e8b8ad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.213787 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3159e4ac-64da-47ba-9c70-b23214e8b8ad-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.213824 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/96e29afc-72d1-4b29-9528-1ed61feed290-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.219039 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3159e4ac-64da-47ba-9c70-b23214e8b8ad-kube-api-access-vz558" (OuterVolumeSpecName: "kube-api-access-vz558") pod "3159e4ac-64da-47ba-9c70-b23214e8b8ad" (UID: "3159e4ac-64da-47ba-9c70-b23214e8b8ad"). InnerVolumeSpecName "kube-api-access-vz558". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.224937 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96e29afc-72d1-4b29-9528-1ed61feed290-kube-api-access-mtwh8" (OuterVolumeSpecName: "kube-api-access-mtwh8") pod "96e29afc-72d1-4b29-9528-1ed61feed290" (UID: "96e29afc-72d1-4b29-9528-1ed61feed290"). InnerVolumeSpecName "kube-api-access-mtwh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.316301 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz558\" (UniqueName: \"kubernetes.io/projected/3159e4ac-64da-47ba-9c70-b23214e8b8ad-kube-api-access-vz558\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.316353 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtwh8\" (UniqueName: \"kubernetes.io/projected/96e29afc-72d1-4b29-9528-1ed61feed290-kube-api-access-mtwh8\") on node \"crc\" DevicePath \"\"" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.514684 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-2e95-account-create-update-thqbx" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.514681 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-2e95-account-create-update-thqbx" event={"ID":"3159e4ac-64da-47ba-9c70-b23214e8b8ad","Type":"ContainerDied","Data":"2592b482160778d89ae89f082b4c5d807f6749aa7cd668c503d444d47635c41e"} Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.514835 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2592b482160778d89ae89f082b4c5d807f6749aa7cd668c503d444d47635c41e" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.517028 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cclgv" event={"ID":"96e29afc-72d1-4b29-9528-1ed61feed290","Type":"ContainerDied","Data":"0e7df792d25acaa25d83f037bb13acb74103e06430c54655ed73c3fa1fc654fa"} Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.517080 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e7df792d25acaa25d83f037bb13acb74103e06430c54655ed73c3fa1fc654fa" Mar 08 21:06:52 crc kubenswrapper[4885]: I0308 21:06:52.517099 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cclgv" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.806846 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-4k8w9"] Mar 08 21:06:53 crc kubenswrapper[4885]: E0308 21:06:53.807642 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e29afc-72d1-4b29-9528-1ed61feed290" containerName="mariadb-database-create" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.807658 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e29afc-72d1-4b29-9528-1ed61feed290" containerName="mariadb-database-create" Mar 08 21:06:53 crc kubenswrapper[4885]: E0308 21:06:53.807688 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3159e4ac-64da-47ba-9c70-b23214e8b8ad" containerName="mariadb-account-create-update" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.807699 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3159e4ac-64da-47ba-9c70-b23214e8b8ad" containerName="mariadb-account-create-update" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.807962 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e29afc-72d1-4b29-9528-1ed61feed290" containerName="mariadb-database-create" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.807987 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3159e4ac-64da-47ba-9c70-b23214e8b8ad" containerName="mariadb-account-create-update" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.808738 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.811342 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.811991 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rczmx" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.812383 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.817479 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4k8w9"] Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.944772 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-etc-machine-id\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.944846 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-scripts\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.944887 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5lxr\" (UniqueName: \"kubernetes.io/projected/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-kube-api-access-b5lxr\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.944912 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-db-sync-config-data\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.945102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-config-data\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:53 crc kubenswrapper[4885]: I0308 21:06:53.945170 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-combined-ca-bundle\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.047024 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-scripts\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.047085 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-b5lxr\" (UniqueName: \"kubernetes.io/projected/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-kube-api-access-b5lxr\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.047136 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-db-sync-config-data\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.048294 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-config-data\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.048745 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-combined-ca-bundle\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.049280 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-etc-machine-id\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.049375 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-etc-machine-id\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.051654 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-scripts\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.052190 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-db-sync-config-data\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.053020 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-config-data\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.053591 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-combined-ca-bundle\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " 
pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.067483 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5lxr\" (UniqueName: \"kubernetes.io/projected/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-kube-api-access-b5lxr\") pod \"cinder-db-sync-4k8w9\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.145700 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:06:54 crc kubenswrapper[4885]: I0308 21:06:54.635325 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-4k8w9"] Mar 08 21:06:55 crc kubenswrapper[4885]: I0308 21:06:55.553819 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4k8w9" event={"ID":"6adbeb38-5e1d-43e0-a516-2cc65ad853aa","Type":"ContainerStarted","Data":"d64f534661786890c391e49d5e099a75e38f19a2ce6774e8bd719218475416e7"} Mar 08 21:06:55 crc kubenswrapper[4885]: I0308 21:06:55.554345 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4k8w9" event={"ID":"6adbeb38-5e1d-43e0-a516-2cc65ad853aa","Type":"ContainerStarted","Data":"d5c28a2881001ea956077fcf8b61adfbec3be96d16ab2bb26efabc1bd23cb7f3"} Mar 08 21:06:55 crc kubenswrapper[4885]: I0308 21:06:55.591898 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-4k8w9" podStartSLOduration=2.591871349 podStartE2EDuration="2.591871349s" podCreationTimestamp="2026-03-08 21:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:06:55.575320477 +0000 UTC m=+5716.971374550" watchObservedRunningTime="2026-03-08 21:06:55.591871349 +0000 UTC m=+5716.987925412" Mar 08 21:06:58 crc kubenswrapper[4885]: I0308 21:06:58.586079 4885 generic.go:334] "Generic (PLEG): container finished" podID="6adbeb38-5e1d-43e0-a516-2cc65ad853aa" containerID="d64f534661786890c391e49d5e099a75e38f19a2ce6774e8bd719218475416e7" exitCode=0 Mar 08 21:06:58 crc kubenswrapper[4885]: I0308 21:06:58.586204 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4k8w9" event={"ID":"6adbeb38-5e1d-43e0-a516-2cc65ad853aa","Type":"ContainerDied","Data":"d64f534661786890c391e49d5e099a75e38f19a2ce6774e8bd719218475416e7"} Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.002297 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.081381 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5lxr\" (UniqueName: \"kubernetes.io/projected/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-kube-api-access-b5lxr\") pod \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.081426 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-combined-ca-bundle\") pod \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.081562 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-scripts\") pod \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.081596 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-db-sync-config-data\") pod \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.081617 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-etc-machine-id\") pod \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.081695 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-config-data\") pod \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\" (UID: \"6adbeb38-5e1d-43e0-a516-2cc65ad853aa\") " Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.082071 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6adbeb38-5e1d-43e0-a516-2cc65ad853aa" (UID: "6adbeb38-5e1d-43e0-a516-2cc65ad853aa"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.082576 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.086646 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-scripts" (OuterVolumeSpecName: "scripts") pod "6adbeb38-5e1d-43e0-a516-2cc65ad853aa" (UID: "6adbeb38-5e1d-43e0-a516-2cc65ad853aa"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.087535 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-kube-api-access-b5lxr" (OuterVolumeSpecName: "kube-api-access-b5lxr") pod "6adbeb38-5e1d-43e0-a516-2cc65ad853aa" (UID: "6adbeb38-5e1d-43e0-a516-2cc65ad853aa"). InnerVolumeSpecName "kube-api-access-b5lxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.087831 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6adbeb38-5e1d-43e0-a516-2cc65ad853aa" (UID: "6adbeb38-5e1d-43e0-a516-2cc65ad853aa"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.108693 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6adbeb38-5e1d-43e0-a516-2cc65ad853aa" (UID: "6adbeb38-5e1d-43e0-a516-2cc65ad853aa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.125719 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-config-data" (OuterVolumeSpecName: "config-data") pod "6adbeb38-5e1d-43e0-a516-2cc65ad853aa" (UID: "6adbeb38-5e1d-43e0-a516-2cc65ad853aa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.184897 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.184952 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5lxr\" (UniqueName: \"kubernetes.io/projected/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-kube-api-access-b5lxr\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.184966 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.184975 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.184983 4885 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6adbeb38-5e1d-43e0-a516-2cc65ad853aa-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.626371 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-4k8w9" event={"ID":"6adbeb38-5e1d-43e0-a516-2cc65ad853aa","Type":"ContainerDied","Data":"d5c28a2881001ea956077fcf8b61adfbec3be96d16ab2bb26efabc1bd23cb7f3"} Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.626437 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5c28a2881001ea956077fcf8b61adfbec3be96d16ab2bb26efabc1bd23cb7f3" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.626549 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-4k8w9" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.978869 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75bb747dd7-kqx6n"] Mar 08 21:07:00 crc kubenswrapper[4885]: E0308 21:07:00.979555 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6adbeb38-5e1d-43e0-a516-2cc65ad853aa" containerName="cinder-db-sync" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.988912 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6adbeb38-5e1d-43e0-a516-2cc65ad853aa" containerName="cinder-db-sync" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.989295 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6adbeb38-5e1d-43e0-a516-2cc65ad853aa" containerName="cinder-db-sync" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.990283 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:00 crc kubenswrapper[4885]: I0308 21:07:00.994226 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bb747dd7-kqx6n"] Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.131467 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-config\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.131768 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.131799 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.131900 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-dns-svc\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.131965 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvqpp\" (UniqueName: \"kubernetes.io/projected/938eebde-2664-4ae3-8289-e378affb1274-kube-api-access-wvqpp\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.132300 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.133610 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.135454 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.135628 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.143033 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.147663 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-rczmx" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.150931 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.233934 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-config\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.233998 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rftcn\" (UniqueName: \"kubernetes.io/projected/a4a0f209-cabb-4b78-8e14-17625407e49d-kube-api-access-rftcn\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234027 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234051 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234077 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234104 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234119 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data-custom\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc 
kubenswrapper[4885]: I0308 21:07:01.234236 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-dns-svc\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234271 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-scripts\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234637 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvqpp\" (UniqueName: \"kubernetes.io/projected/938eebde-2664-4ae3-8289-e378affb1274-kube-api-access-wvqpp\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.235215 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a0f209-cabb-4b78-8e14-17625407e49d-logs\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.234898 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-config\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.235150 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-dns-svc\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.235019 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-nb\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.235362 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4a0f209-cabb-4b78-8e14-17625407e49d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.235611 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-sb\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.253666 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvqpp\" (UniqueName: 
\"kubernetes.io/projected/938eebde-2664-4ae3-8289-e378affb1274-kube-api-access-wvqpp\") pod \"dnsmasq-dns-75bb747dd7-kqx6n\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.309962 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.337398 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rftcn\" (UniqueName: \"kubernetes.io/projected/a4a0f209-cabb-4b78-8e14-17625407e49d-kube-api-access-rftcn\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.337769 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.337797 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.338466 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data-custom\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.338658 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-scripts\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.338705 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a0f209-cabb-4b78-8e14-17625407e49d-logs\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.338769 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4a0f209-cabb-4b78-8e14-17625407e49d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.338866 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4a0f209-cabb-4b78-8e14-17625407e49d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.339682 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a0f209-cabb-4b78-8e14-17625407e49d-logs\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " 
pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.342217 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-scripts\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.342776 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.343212 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data-custom\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.343307 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.360379 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rftcn\" (UniqueName: \"kubernetes.io/projected/a4a0f209-cabb-4b78-8e14-17625407e49d-kube-api-access-rftcn\") pod \"cinder-api-0\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.451764 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 21:07:01 crc kubenswrapper[4885]: W0308 21:07:01.867984 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod938eebde_2664_4ae3_8289_e378affb1274.slice/crio-52e08bc4fa9512096f0690af07df832fb34183cd6a6514d48ec0a3244015bcc9 WatchSource:0}: Error finding container 52e08bc4fa9512096f0690af07df832fb34183cd6a6514d48ec0a3244015bcc9: Status 404 returned error can't find the container with id 52e08bc4fa9512096f0690af07df832fb34183cd6a6514d48ec0a3244015bcc9 Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.869868 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75bb747dd7-kqx6n"] Mar 08 21:07:01 crc kubenswrapper[4885]: I0308 21:07:01.982218 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:02 crc kubenswrapper[4885]: I0308 21:07:02.652275 4885 generic.go:334] "Generic (PLEG): container finished" podID="938eebde-2664-4ae3-8289-e378affb1274" containerID="12fb871f5a239d7fdcc6ca3f845e422dfc2911258d85c6c9852f5cbe4d01cbdc" exitCode=0 Mar 08 21:07:02 crc kubenswrapper[4885]: I0308 21:07:02.652611 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" event={"ID":"938eebde-2664-4ae3-8289-e378affb1274","Type":"ContainerDied","Data":"12fb871f5a239d7fdcc6ca3f845e422dfc2911258d85c6c9852f5cbe4d01cbdc"} Mar 08 21:07:02 crc kubenswrapper[4885]: I0308 21:07:02.652658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" event={"ID":"938eebde-2664-4ae3-8289-e378affb1274","Type":"ContainerStarted","Data":"52e08bc4fa9512096f0690af07df832fb34183cd6a6514d48ec0a3244015bcc9"} Mar 08 21:07:02 crc kubenswrapper[4885]: I0308 21:07:02.655336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a0f209-cabb-4b78-8e14-17625407e49d","Type":"ContainerStarted","Data":"ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c"} Mar 08 21:07:02 crc kubenswrapper[4885]: I0308 21:07:02.655365 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a0f209-cabb-4b78-8e14-17625407e49d","Type":"ContainerStarted","Data":"73a221bfa90434743d951101b28f8fc6f753d32cb93f954ef81be7dadb45dff1"} Mar 08 21:07:03 crc kubenswrapper[4885]: I0308 21:07:03.664414 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" event={"ID":"938eebde-2664-4ae3-8289-e378affb1274","Type":"ContainerStarted","Data":"196df1785b0b84f41c7ec30c506906a12662281699b2188f8f32887c1ea78555"} Mar 08 21:07:03 crc kubenswrapper[4885]: I0308 21:07:03.664800 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:03 crc kubenswrapper[4885]: I0308 21:07:03.667004 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a0f209-cabb-4b78-8e14-17625407e49d","Type":"ContainerStarted","Data":"8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83"} Mar 08 21:07:03 crc kubenswrapper[4885]: I0308 21:07:03.667295 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 08 21:07:03 crc kubenswrapper[4885]: I0308 21:07:03.694845 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" podStartSLOduration=3.69482053 
podStartE2EDuration="3.69482053s" podCreationTimestamp="2026-03-08 21:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:03.682939804 +0000 UTC m=+5725.078993827" watchObservedRunningTime="2026-03-08 21:07:03.69482053 +0000 UTC m=+5725.090874563" Mar 08 21:07:03 crc kubenswrapper[4885]: I0308 21:07:03.716198 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.716180511 podStartE2EDuration="2.716180511s" podCreationTimestamp="2026-03-08 21:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:03.70679983 +0000 UTC m=+5725.102853843" watchObservedRunningTime="2026-03-08 21:07:03.716180511 +0000 UTC m=+5725.112234534" Mar 08 21:07:05 crc kubenswrapper[4885]: I0308 21:07:05.528583 4885 scope.go:117] "RemoveContainer" containerID="c0d8f0f8a4a0c8ee7bf5279891872ae208bc8c52a779f98dff22752b9bff60d5" Mar 08 21:07:11 crc kubenswrapper[4885]: I0308 21:07:11.312150 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:07:11 crc kubenswrapper[4885]: I0308 21:07:11.407498 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cf6665877-kr6fn"] Mar 08 21:07:11 crc kubenswrapper[4885]: I0308 21:07:11.407892 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" podUID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerName="dnsmasq-dns" containerID="cri-o://750d78661c34c956acafb525ed97e2c26316d92d4d7f7130df1013f1f9bc8ad8" gracePeriod=10 Mar 08 21:07:11 crc kubenswrapper[4885]: I0308 21:07:11.782225 4885 generic.go:334] "Generic (PLEG): container finished" podID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerID="750d78661c34c956acafb525ed97e2c26316d92d4d7f7130df1013f1f9bc8ad8" exitCode=0 Mar 08 21:07:11 crc kubenswrapper[4885]: I0308 21:07:11.782471 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" event={"ID":"2ed9193c-0c46-47cb-af24-c8415837c19b","Type":"ContainerDied","Data":"750d78661c34c956acafb525ed97e2c26316d92d4d7f7130df1013f1f9bc8ad8"} Mar 08 21:07:11 crc kubenswrapper[4885]: I0308 21:07:11.924810 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.119894 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-dns-svc\") pod \"2ed9193c-0c46-47cb-af24-c8415837c19b\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.119985 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-config\") pod \"2ed9193c-0c46-47cb-af24-c8415837c19b\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.120063 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdc7k\" (UniqueName: \"kubernetes.io/projected/2ed9193c-0c46-47cb-af24-c8415837c19b-kube-api-access-pdc7k\") pod \"2ed9193c-0c46-47cb-af24-c8415837c19b\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.120120 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-nb\") pod \"2ed9193c-0c46-47cb-af24-c8415837c19b\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.120144 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-sb\") pod \"2ed9193c-0c46-47cb-af24-c8415837c19b\" (UID: \"2ed9193c-0c46-47cb-af24-c8415837c19b\") " Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.127898 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ed9193c-0c46-47cb-af24-c8415837c19b-kube-api-access-pdc7k" (OuterVolumeSpecName: "kube-api-access-pdc7k") pod "2ed9193c-0c46-47cb-af24-c8415837c19b" (UID: "2ed9193c-0c46-47cb-af24-c8415837c19b"). InnerVolumeSpecName "kube-api-access-pdc7k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.171498 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2ed9193c-0c46-47cb-af24-c8415837c19b" (UID: "2ed9193c-0c46-47cb-af24-c8415837c19b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.172669 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-config" (OuterVolumeSpecName: "config") pod "2ed9193c-0c46-47cb-af24-c8415837c19b" (UID: "2ed9193c-0c46-47cb-af24-c8415837c19b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.178990 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2ed9193c-0c46-47cb-af24-c8415837c19b" (UID: "2ed9193c-0c46-47cb-af24-c8415837c19b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.210086 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2ed9193c-0c46-47cb-af24-c8415837c19b" (UID: "2ed9193c-0c46-47cb-af24-c8415837c19b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.222592 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdc7k\" (UniqueName: \"kubernetes.io/projected/2ed9193c-0c46-47cb-af24-c8415837c19b-kube-api-access-pdc7k\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.222627 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.222637 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.222647 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.222656 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ed9193c-0c46-47cb-af24-c8415837c19b-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.723844 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.724325 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-log" containerID="cri-o://9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b" gracePeriod=30 Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.724461 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-metadata" containerID="cri-o://0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49" gracePeriod=30 Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.744548 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.744781 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6a29a091-3ebc-4dbb-b876-19892bedba02" containerName="nova-scheduler-scheduler" containerID="cri-o://6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" gracePeriod=30 Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.755791 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.756060 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" 
containerName="nova-api-log" containerID="cri-o://d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324" gracePeriod=30 Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.756134 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-api" containerID="cri-o://2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895" gracePeriod=30 Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.768988 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.769193 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="7213b9f1-1c28-4e32-b68b-8f7464f38de0" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59" gracePeriod=30 Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.806402 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" event={"ID":"2ed9193c-0c46-47cb-af24-c8415837c19b","Type":"ContainerDied","Data":"13a5f96703742b15ab41ce5ca4bff51ff0ff5f629fdccca2879c9831c1547b90"} Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.806516 4885 scope.go:117] "RemoveContainer" containerID="750d78661c34c956acafb525ed97e2c26316d92d4d7f7130df1013f1f9bc8ad8" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.806625 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cf6665877-kr6fn" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.814125 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.814312 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="1dff0b58-ac0f-4d39-9910-f924fff8f816" containerName="nova-cell0-conductor-conductor" containerID="cri-o://8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647" gracePeriod=30 Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.827395 4885 scope.go:117] "RemoveContainer" containerID="44db3e80d53ecaa3d76c25ae2231f68ed9a8e2480156df67bfa8787c436f51c1" Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.885453 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5cf6665877-kr6fn"] Mar 08 21:07:12 crc kubenswrapper[4885]: I0308 21:07:12.894294 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5cf6665877-kr6fn"] Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.386005 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ed9193c-0c46-47cb-af24-c8415837c19b" path="/var/lib/kubelet/pods/2ed9193c-0c46-47cb-af24-c8415837c19b/volumes" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.401401 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.488630 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.545819 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.551675 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.552968 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.553011 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6a29a091-3ebc-4dbb-b876-19892bedba02" containerName="nova-scheduler-scheduler" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.644343 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dsgd\" (UniqueName: \"kubernetes.io/projected/7213b9f1-1c28-4e32-b68b-8f7464f38de0-kube-api-access-2dsgd\") pod \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.644439 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-config-data\") pod \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.644581 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-combined-ca-bundle\") pod \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\" (UID: \"7213b9f1-1c28-4e32-b68b-8f7464f38de0\") " Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.666489 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7213b9f1-1c28-4e32-b68b-8f7464f38de0-kube-api-access-2dsgd" (OuterVolumeSpecName: "kube-api-access-2dsgd") pod "7213b9f1-1c28-4e32-b68b-8f7464f38de0" (UID: "7213b9f1-1c28-4e32-b68b-8f7464f38de0"). InnerVolumeSpecName "kube-api-access-2dsgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.668534 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7213b9f1-1c28-4e32-b68b-8f7464f38de0" (UID: "7213b9f1-1c28-4e32-b68b-8f7464f38de0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.668566 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-config-data" (OuterVolumeSpecName: "config-data") pod "7213b9f1-1c28-4e32-b68b-8f7464f38de0" (UID: "7213b9f1-1c28-4e32-b68b-8f7464f38de0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.746328 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.748298 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7213b9f1-1c28-4e32-b68b-8f7464f38de0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.748364 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dsgd\" (UniqueName: \"kubernetes.io/projected/7213b9f1-1c28-4e32-b68b-8f7464f38de0-kube-api-access-2dsgd\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.817066 4885 generic.go:334] "Generic (PLEG): container finished" podID="994b00da-2d97-4508-8f36-b517afab98e1" containerID="9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b" exitCode=143 Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.817489 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"994b00da-2d97-4508-8f36-b517afab98e1","Type":"ContainerDied","Data":"9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b"} Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.820776 4885 generic.go:334] "Generic (PLEG): container finished" podID="3195111b-b266-425b-82da-98f3d0a29f0e" containerID="d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324" exitCode=143 Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.820847 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3195111b-b266-425b-82da-98f3d0a29f0e","Type":"ContainerDied","Data":"d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324"} Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.822609 4885 generic.go:334] "Generic (PLEG): container finished" podID="7213b9f1-1c28-4e32-b68b-8f7464f38de0" containerID="2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59" exitCode=0 Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.822633 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7213b9f1-1c28-4e32-b68b-8f7464f38de0","Type":"ContainerDied","Data":"2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59"} Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.822649 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7213b9f1-1c28-4e32-b68b-8f7464f38de0","Type":"ContainerDied","Data":"6877cb37b34c366b3176b008053204c71bb5733fa92d9537850aea7c85b6ca99"} Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.822665 4885 scope.go:117] "RemoveContainer" containerID="2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.822745 4885 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.856862 4885 scope.go:117] "RemoveContainer" containerID="2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59" Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.857284 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59\": container with ID starting with 2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59 not found: ID does not exist" containerID="2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.857310 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59"} err="failed to get container status \"2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59\": rpc error: code = NotFound desc = could not find container \"2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59\": container with ID starting with 2110e88399a37c06073901d1481f76cd9f7f6f1ef7ac7e44fbdd3a49369d8f59 not found: ID does not exist" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.877047 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.886218 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.899642 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.900043 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7213b9f1-1c28-4e32-b68b-8f7464f38de0" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.900059 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7213b9f1-1c28-4e32-b68b-8f7464f38de0" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.900083 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerName="init" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.900089 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerName="init" Mar 08 21:07:13 crc kubenswrapper[4885]: E0308 21:07:13.900100 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerName="dnsmasq-dns" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.900132 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerName="dnsmasq-dns" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.900301 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7213b9f1-1c28-4e32-b68b-8f7464f38de0" containerName="nova-cell1-novncproxy-novncproxy" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.900311 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ed9193c-0c46-47cb-af24-c8415837c19b" containerName="dnsmasq-dns" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.900881 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.903079 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.918848 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.952448 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2kr4\" (UniqueName: \"kubernetes.io/projected/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-kube-api-access-t2kr4\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.952507 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:13 crc kubenswrapper[4885]: I0308 21:07:13.952544 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.054703 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.054886 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2kr4\" (UniqueName: \"kubernetes.io/projected/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-kube-api-access-t2kr4\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.054953 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.058838 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.074007 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.082479 
4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2kr4\" (UniqueName: \"kubernetes.io/projected/b55b7c8a-8888-43e1-a593-d3a1f00cba4c-kube-api-access-t2kr4\") pod \"nova-cell1-novncproxy-0\" (UID: \"b55b7c8a-8888-43e1-a593-d3a1f00cba4c\") " pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.226478 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.697227 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.780200 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bp6b\" (UniqueName: \"kubernetes.io/projected/1dff0b58-ac0f-4d39-9910-f924fff8f816-kube-api-access-8bp6b\") pod \"1dff0b58-ac0f-4d39-9910-f924fff8f816\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.780262 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-config-data\") pod \"1dff0b58-ac0f-4d39-9910-f924fff8f816\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.780399 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-combined-ca-bundle\") pod \"1dff0b58-ac0f-4d39-9910-f924fff8f816\" (UID: \"1dff0b58-ac0f-4d39-9910-f924fff8f816\") " Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.787003 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dff0b58-ac0f-4d39-9910-f924fff8f816-kube-api-access-8bp6b" (OuterVolumeSpecName: "kube-api-access-8bp6b") pod "1dff0b58-ac0f-4d39-9910-f924fff8f816" (UID: "1dff0b58-ac0f-4d39-9910-f924fff8f816"). InnerVolumeSpecName "kube-api-access-8bp6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.808398 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-config-data" (OuterVolumeSpecName: "config-data") pod "1dff0b58-ac0f-4d39-9910-f924fff8f816" (UID: "1dff0b58-ac0f-4d39-9910-f924fff8f816"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.812419 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1dff0b58-ac0f-4d39-9910-f924fff8f816" (UID: "1dff0b58-ac0f-4d39-9910-f924fff8f816"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.851672 4885 generic.go:334] "Generic (PLEG): container finished" podID="1dff0b58-ac0f-4d39-9910-f924fff8f816" containerID="8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647" exitCode=0 Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.851777 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1dff0b58-ac0f-4d39-9910-f924fff8f816","Type":"ContainerDied","Data":"8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647"} Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.851807 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1dff0b58-ac0f-4d39-9910-f924fff8f816","Type":"ContainerDied","Data":"31d1be971bb445599e9b2b87dbc985813f03ee86921b07147d64305072f4cfbe"} Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.851845 4885 scope.go:117] "RemoveContainer" containerID="8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.852104 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.864785 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.889228 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.889285 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bp6b\" (UniqueName: \"kubernetes.io/projected/1dff0b58-ac0f-4d39-9910-f924fff8f816-kube-api-access-8bp6b\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.889311 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1dff0b58-ac0f-4d39-9910-f924fff8f816-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.890535 4885 scope.go:117] "RemoveContainer" containerID="8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647" Mar 08 21:07:14 crc kubenswrapper[4885]: E0308 21:07:14.891146 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647\": container with ID starting with 8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647 not found: ID does not exist" containerID="8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.891300 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647"} err="failed to get container status \"8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647\": rpc error: code = NotFound desc = could not find container \"8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647\": container with ID starting with 8c5208eef7308487edea8523eb42cc9f890d2e74b7efe4b67fd2a71e8ba46647 not found: ID does not exist" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 
21:07:14.898204 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.908287 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.917713 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:07:14 crc kubenswrapper[4885]: E0308 21:07:14.918399 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dff0b58-ac0f-4d39-9910-f924fff8f816" containerName="nova-cell0-conductor-conductor" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.918592 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dff0b58-ac0f-4d39-9910-f924fff8f816" containerName="nova-cell0-conductor-conductor" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.919023 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dff0b58-ac0f-4d39-9910-f924fff8f816" containerName="nova-cell0-conductor-conductor" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.921898 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.924833 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.950415 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.997427 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv9z6\" (UniqueName: \"kubernetes.io/projected/b042d37f-f908-40b8-88be-21798a9428f6-kube-api-access-tv9z6\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.997518 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:14 crc kubenswrapper[4885]: I0308 21:07:14.997581 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.099185 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv9z6\" (UniqueName: \"kubernetes.io/projected/b042d37f-f908-40b8-88be-21798a9428f6-kube-api-access-tv9z6\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.099280 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" 
Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.099342 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.105246 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.105400 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.117129 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv9z6\" (UniqueName: \"kubernetes.io/projected/b042d37f-f908-40b8-88be-21798a9428f6-kube-api-access-tv9z6\") pod \"nova-cell0-conductor-0\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.247871 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.385460 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dff0b58-ac0f-4d39-9910-f924fff8f816" path="/var/lib/kubelet/pods/1dff0b58-ac0f-4d39-9910-f924fff8f816/volumes" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.386579 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7213b9f1-1c28-4e32-b68b-8f7464f38de0" path="/var/lib/kubelet/pods/7213b9f1-1c28-4e32-b68b-8f7464f38de0/volumes" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.775626 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.870910 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b55b7c8a-8888-43e1-a593-d3a1f00cba4c","Type":"ContainerStarted","Data":"922183f45030e12d00e8b807707859ba7744efe0c6e294484c02f0e7e3f3c408"} Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.871000 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b55b7c8a-8888-43e1-a593-d3a1f00cba4c","Type":"ContainerStarted","Data":"65befdca19bb181cc8ab8864bc03f2df7d05201e6bf238aa9c4fd5e61334e69c"} Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.874044 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b042d37f-f908-40b8-88be-21798a9428f6","Type":"ContainerStarted","Data":"80d76197d607834e3f09a7896979af3b1a5308464484d654754ae837dd80a0ab"} Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.905152 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.905103398 podStartE2EDuration="2.905103398s" podCreationTimestamp="2026-03-08 21:07:13 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:15.885175556 +0000 UTC m=+5737.281229629" watchObservedRunningTime="2026-03-08 21:07:15.905103398 +0000 UTC m=+5737.301157451" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.910814 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.116:8775/\": read tcp 10.217.0.2:59796->10.217.1.116:8775: read: connection reset by peer" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.911144 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.116:8775/\": read tcp 10.217.0.2:59798->10.217.1.116:8775: read: connection reset by peer" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.933899 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.117:8774/\": read tcp 10.217.0.2:45742->10.217.1.117:8774: read: connection reset by peer" Mar 08 21:07:15 crc kubenswrapper[4885]: I0308 21:07:15.934387 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-api-0" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.117:8774/\": read tcp 10.217.0.2:45744->10.217.1.117:8774: read: connection reset by peer" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.010518 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.010709 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="69024278-2c5f-4862-ac44-e04663a0c4a5" containerName="nova-cell1-conductor-conductor" containerID="cri-o://b24462edde9f60cfa7555c270a546d933c909d490fc62394b9ed6e4a826084f2" gracePeriod=30 Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.313372 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.378513 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427233 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpccf\" (UniqueName: \"kubernetes.io/projected/3195111b-b266-425b-82da-98f3d0a29f0e-kube-api-access-vpccf\") pod \"3195111b-b266-425b-82da-98f3d0a29f0e\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427426 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-config-data\") pod \"3195111b-b266-425b-82da-98f3d0a29f0e\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427487 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-combined-ca-bundle\") pod \"3195111b-b266-425b-82da-98f3d0a29f0e\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427521 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-config-data\") pod \"994b00da-2d97-4508-8f36-b517afab98e1\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427543 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-combined-ca-bundle\") pod \"994b00da-2d97-4508-8f36-b517afab98e1\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427570 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74c4d\" (UniqueName: \"kubernetes.io/projected/994b00da-2d97-4508-8f36-b517afab98e1-kube-api-access-74c4d\") pod \"994b00da-2d97-4508-8f36-b517afab98e1\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427591 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3195111b-b266-425b-82da-98f3d0a29f0e-logs\") pod \"3195111b-b266-425b-82da-98f3d0a29f0e\" (UID: \"3195111b-b266-425b-82da-98f3d0a29f0e\") " Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.427616 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/994b00da-2d97-4508-8f36-b517afab98e1-logs\") pod \"994b00da-2d97-4508-8f36-b517afab98e1\" (UID: \"994b00da-2d97-4508-8f36-b517afab98e1\") " Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.428656 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/994b00da-2d97-4508-8f36-b517afab98e1-logs" (OuterVolumeSpecName: "logs") pod "994b00da-2d97-4508-8f36-b517afab98e1" (UID: "994b00da-2d97-4508-8f36-b517afab98e1"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.431237 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3195111b-b266-425b-82da-98f3d0a29f0e-logs" (OuterVolumeSpecName: "logs") pod "3195111b-b266-425b-82da-98f3d0a29f0e" (UID: "3195111b-b266-425b-82da-98f3d0a29f0e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.442705 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/994b00da-2d97-4508-8f36-b517afab98e1-kube-api-access-74c4d" (OuterVolumeSpecName: "kube-api-access-74c4d") pod "994b00da-2d97-4508-8f36-b517afab98e1" (UID: "994b00da-2d97-4508-8f36-b517afab98e1"). InnerVolumeSpecName "kube-api-access-74c4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.448973 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3195111b-b266-425b-82da-98f3d0a29f0e-kube-api-access-vpccf" (OuterVolumeSpecName: "kube-api-access-vpccf") pod "3195111b-b266-425b-82da-98f3d0a29f0e" (UID: "3195111b-b266-425b-82da-98f3d0a29f0e"). InnerVolumeSpecName "kube-api-access-vpccf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.472331 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3195111b-b266-425b-82da-98f3d0a29f0e" (UID: "3195111b-b266-425b-82da-98f3d0a29f0e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.480799 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-config-data" (OuterVolumeSpecName: "config-data") pod "994b00da-2d97-4508-8f36-b517afab98e1" (UID: "994b00da-2d97-4508-8f36-b517afab98e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.482530 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "994b00da-2d97-4508-8f36-b517afab98e1" (UID: "994b00da-2d97-4508-8f36-b517afab98e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.485176 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-config-data" (OuterVolumeSpecName: "config-data") pod "3195111b-b266-425b-82da-98f3d0a29f0e" (UID: "3195111b-b266-425b-82da-98f3d0a29f0e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530129 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530157 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/994b00da-2d97-4508-8f36-b517afab98e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530168 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74c4d\" (UniqueName: \"kubernetes.io/projected/994b00da-2d97-4508-8f36-b517afab98e1-kube-api-access-74c4d\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530180 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3195111b-b266-425b-82da-98f3d0a29f0e-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530188 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/994b00da-2d97-4508-8f36-b517afab98e1-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530200 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpccf\" (UniqueName: \"kubernetes.io/projected/3195111b-b266-425b-82da-98f3d0a29f0e-kube-api-access-vpccf\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530208 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.530231 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3195111b-b266-425b-82da-98f3d0a29f0e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.883383 4885 generic.go:334] "Generic (PLEG): container finished" podID="994b00da-2d97-4508-8f36-b517afab98e1" containerID="0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49" exitCode=0 Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.883468 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"994b00da-2d97-4508-8f36-b517afab98e1","Type":"ContainerDied","Data":"0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49"} Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.883501 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"994b00da-2d97-4508-8f36-b517afab98e1","Type":"ContainerDied","Data":"ed98b0990237ad316a273718be6c6f8f3198828e148541a9840c7e6321b7e7da"} Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.883472 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.883520 4885 scope.go:117] "RemoveContainer" containerID="0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.886502 4885 generic.go:334] "Generic (PLEG): container finished" podID="3195111b-b266-425b-82da-98f3d0a29f0e" containerID="2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895" exitCode=0 Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.886586 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3195111b-b266-425b-82da-98f3d0a29f0e","Type":"ContainerDied","Data":"2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895"} Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.886621 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3195111b-b266-425b-82da-98f3d0a29f0e","Type":"ContainerDied","Data":"77ebc86a187e7f428857986b384dc697b96b7685acfe2d360f9674aa240afe23"} Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.886693 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.915248 4885 scope.go:117] "RemoveContainer" containerID="9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.921189 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b042d37f-f908-40b8-88be-21798a9428f6","Type":"ContainerStarted","Data":"3eb13c978d994d153edf32828a7f7fe52fe2b797781fd84bb013541ebf4584bf"} Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.921248 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.946220 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.966984 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.979044 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.980297 4885 scope.go:117] "RemoveContainer" containerID="0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49" Mar 08 21:07:16 crc kubenswrapper[4885]: E0308 21:07:16.986261 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49\": container with ID starting with 0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49 not found: ID does not exist" containerID="0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.986310 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49"} err="failed to get container status \"0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49\": rpc error: code = NotFound desc = could not find container \"0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49\": container with ID starting with 
0ae8785a3a89acfb90ad8bee63d2b016a95ae259ac0ff6a7fa04363d093feb49 not found: ID does not exist" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.986333 4885 scope.go:117] "RemoveContainer" containerID="9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.994751 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:07:16 crc kubenswrapper[4885]: E0308 21:07:16.995050 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b\": container with ID starting with 9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b not found: ID does not exist" containerID="9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.995094 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b"} err="failed to get container status \"9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b\": rpc error: code = NotFound desc = could not find container \"9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b\": container with ID starting with 9090c0e8edc6ea9bbbf94db1a875b385955d05cb60af9b564e3bf629624b6a0b not found: ID does not exist" Mar 08 21:07:16 crc kubenswrapper[4885]: I0308 21:07:16.995116 4885 scope.go:117] "RemoveContainer" containerID="2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.021159 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=3.021143956 podStartE2EDuration="3.021143956s" podCreationTimestamp="2026-03-08 21:07:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:16.972272222 +0000 UTC m=+5738.368326245" watchObservedRunningTime="2026-03-08 21:07:17.021143956 +0000 UTC m=+5738.417197979" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026249 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:07:17 crc kubenswrapper[4885]: E0308 21:07:17.026631 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-log" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026642 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-log" Mar 08 21:07:17 crc kubenswrapper[4885]: E0308 21:07:17.026657 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-log" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026664 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-log" Mar 08 21:07:17 crc kubenswrapper[4885]: E0308 21:07:17.026674 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-api" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026680 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-api" 
Mar 08 21:07:17 crc kubenswrapper[4885]: E0308 21:07:17.026691 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-metadata" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026697 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-metadata" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026861 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-log" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026880 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-log" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026892 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" containerName="nova-api-api" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.026899 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="994b00da-2d97-4508-8f36-b517afab98e1" containerName="nova-metadata-metadata" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.052946 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.056377 4885 scope.go:117] "RemoveContainer" containerID="d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.067990 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.091212 4885 scope.go:117] "RemoveContainer" containerID="2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895" Mar 08 21:07:17 crc kubenswrapper[4885]: E0308 21:07:17.093427 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895\": container with ID starting with 2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895 not found: ID does not exist" containerID="2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.093553 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895"} err="failed to get container status \"2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895\": rpc error: code = NotFound desc = could not find container \"2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895\": container with ID starting with 2523b38d6bc4f4be98aa0ffbf6d7819e81e30cc6b9c051dce826561140f85895 not found: ID does not exist" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.093628 4885 scope.go:117] "RemoveContainer" containerID="d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324" Mar 08 21:07:17 crc kubenswrapper[4885]: E0308 21:07:17.098451 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324\": container with ID starting with d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324 not found: ID does not exist" 
containerID="d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.098478 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324"} err="failed to get container status \"d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324\": rpc error: code = NotFound desc = could not find container \"d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324\": container with ID starting with d1d5409633d6f5d5c8ac4093e997c8c3251ad08c5cd1cb555d6d70bb77704324 not found: ID does not exist" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.109689 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.119017 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.120549 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.122453 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.126851 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.154075 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.154147 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-logs\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.154275 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-config-data\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.154350 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56dvv\" (UniqueName: \"kubernetes.io/projected/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-kube-api-access-56dvv\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256121 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56dvv\" (UniqueName: \"kubernetes.io/projected/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-kube-api-access-56dvv\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256412 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-config-data\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256438 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256455 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-logs\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256506 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256529 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f008edbb-a92d-45b1-ab9d-a56978d20e75-logs\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256581 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-config-data\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.256610 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttvw5\" (UniqueName: \"kubernetes.io/projected/f008edbb-a92d-45b1-ab9d-a56978d20e75-kube-api-access-ttvw5\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.257053 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-logs\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.261434 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.265774 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-config-data\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.294632 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56dvv\" 
(UniqueName: \"kubernetes.io/projected/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-kube-api-access-56dvv\") pod \"nova-metadata-0\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.358194 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-config-data\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.358353 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.358428 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f008edbb-a92d-45b1-ab9d-a56978d20e75-logs\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.358615 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttvw5\" (UniqueName: \"kubernetes.io/projected/f008edbb-a92d-45b1-ab9d-a56978d20e75-kube-api-access-ttvw5\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.359387 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f008edbb-a92d-45b1-ab9d-a56978d20e75-logs\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.361247 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.365443 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-config-data\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.379123 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttvw5\" (UniqueName: \"kubernetes.io/projected/f008edbb-a92d-45b1-ab9d-a56978d20e75-kube-api-access-ttvw5\") pod \"nova-api-0\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.381238 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3195111b-b266-425b-82da-98f3d0a29f0e" path="/var/lib/kubelet/pods/3195111b-b266-425b-82da-98f3d0a29f0e/volumes" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.381981 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="994b00da-2d97-4508-8f36-b517afab98e1" path="/var/lib/kubelet/pods/994b00da-2d97-4508-8f36-b517afab98e1/volumes" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.394605 4885 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.435705 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.929764 4885 generic.go:334] "Generic (PLEG): container finished" podID="69024278-2c5f-4862-ac44-e04663a0c4a5" containerID="b24462edde9f60cfa7555c270a546d933c909d490fc62394b9ed6e4a826084f2" exitCode=0 Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.929993 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69024278-2c5f-4862-ac44-e04663a0c4a5","Type":"ContainerDied","Data":"b24462edde9f60cfa7555c270a546d933c909d490fc62394b9ed6e4a826084f2"} Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.930199 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"69024278-2c5f-4862-ac44-e04663a0c4a5","Type":"ContainerDied","Data":"38f1ceae301c390ad8c08deda2ca09e48e527fb1b139f821cdcf653ea04147c0"} Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.930224 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38f1ceae301c390ad8c08deda2ca09e48e527fb1b139f821cdcf653ea04147c0" Mar 08 21:07:17 crc kubenswrapper[4885]: I0308 21:07:17.941360 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.012499 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.024442 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.201779 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7jsn\" (UniqueName: \"kubernetes.io/projected/69024278-2c5f-4862-ac44-e04663a0c4a5-kube-api-access-d7jsn\") pod \"69024278-2c5f-4862-ac44-e04663a0c4a5\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.202484 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-config-data\") pod \"69024278-2c5f-4862-ac44-e04663a0c4a5\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.202648 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-combined-ca-bundle\") pod \"69024278-2c5f-4862-ac44-e04663a0c4a5\" (UID: \"69024278-2c5f-4862-ac44-e04663a0c4a5\") " Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.205964 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69024278-2c5f-4862-ac44-e04663a0c4a5-kube-api-access-d7jsn" (OuterVolumeSpecName: "kube-api-access-d7jsn") pod "69024278-2c5f-4862-ac44-e04663a0c4a5" (UID: "69024278-2c5f-4862-ac44-e04663a0c4a5"). InnerVolumeSpecName "kube-api-access-d7jsn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.233162 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69024278-2c5f-4862-ac44-e04663a0c4a5" (UID: "69024278-2c5f-4862-ac44-e04663a0c4a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.256980 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-config-data" (OuterVolumeSpecName: "config-data") pod "69024278-2c5f-4862-ac44-e04663a0c4a5" (UID: "69024278-2c5f-4862-ac44-e04663a0c4a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.304687 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7jsn\" (UniqueName: \"kubernetes.io/projected/69024278-2c5f-4862-ac44-e04663a0c4a5-kube-api-access-d7jsn\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.304722 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.304732 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69024278-2c5f-4862-ac44-e04663a0c4a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:18 crc kubenswrapper[4885]: E0308 21:07:18.542837 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 21:07:18 crc kubenswrapper[4885]: E0308 21:07:18.548904 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 21:07:18 crc kubenswrapper[4885]: E0308 21:07:18.549906 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 21:07:18 crc kubenswrapper[4885]: E0308 21:07:18.549989 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6a29a091-3ebc-4dbb-b876-19892bedba02" containerName="nova-scheduler-scheduler" Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.951190 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f008edbb-a92d-45b1-ab9d-a56978d20e75","Type":"ContainerStarted","Data":"cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be"} Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.951279 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f008edbb-a92d-45b1-ab9d-a56978d20e75","Type":"ContainerStarted","Data":"2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea"} Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.951320 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f008edbb-a92d-45b1-ab9d-a56978d20e75","Type":"ContainerStarted","Data":"e11dd41c5f061aeb15ad1b8571823004739345378aecc51a3efc7e9384ef84f7"} Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.970525 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a","Type":"ContainerStarted","Data":"88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc"} Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.970598 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a","Type":"ContainerStarted","Data":"37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1"} Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.970620 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a","Type":"ContainerStarted","Data":"d6b2799aab2f7b019fc1cf13b4754da3de421e26bd6b42741c010f54fac4b62f"} Mar 08 21:07:18 crc kubenswrapper[4885]: I0308 21:07:18.971835 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.001692 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.001668103 podStartE2EDuration="3.001668103s" podCreationTimestamp="2026-03-08 21:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:18.992784976 +0000 UTC m=+5740.388839019" watchObservedRunningTime="2026-03-08 21:07:19.001668103 +0000 UTC m=+5740.397722136" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.018137 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.018107962 podStartE2EDuration="3.018107962s" podCreationTimestamp="2026-03-08 21:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:19.0101548 +0000 UTC m=+5740.406208843" watchObservedRunningTime="2026-03-08 21:07:19.018107962 +0000 UTC m=+5740.414161995" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.049090 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.067939 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.079372 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:07:19 crc kubenswrapper[4885]: E0308 21:07:19.079973 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="69024278-2c5f-4862-ac44-e04663a0c4a5" containerName="nova-cell1-conductor-conductor" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.080004 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="69024278-2c5f-4862-ac44-e04663a0c4a5" containerName="nova-cell1-conductor-conductor" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.080280 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="69024278-2c5f-4862-ac44-e04663a0c4a5" containerName="nova-cell1-conductor-conductor" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.081191 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.087014 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.106010 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.117908 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.118150 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vfz6\" (UniqueName: \"kubernetes.io/projected/95d9d37c-0204-47e9-956d-d93f2dd1e94d-kube-api-access-5vfz6\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.118279 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.219447 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.219599 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vfz6\" (UniqueName: \"kubernetes.io/projected/95d9d37c-0204-47e9-956d-d93f2dd1e94d-kube-api-access-5vfz6\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.219634 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.224754 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.227417 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.231121 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.240893 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vfz6\" (UniqueName: \"kubernetes.io/projected/95d9d37c-0204-47e9-956d-d93f2dd1e94d-kube-api-access-5vfz6\") pod \"nova-cell1-conductor-0\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.380981 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69024278-2c5f-4862-ac44-e04663a0c4a5" path="/var/lib/kubelet/pods/69024278-2c5f-4862-ac44-e04663a0c4a5/volumes" Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.420738 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:19 crc kubenswrapper[4885]: W0308 21:07:19.919383 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95d9d37c_0204_47e9_956d_d93f2dd1e94d.slice/crio-4989a2ecee24254fe4aca7e76f1e2e40b733be1e70cbeec7d60131da54ab2fd4 WatchSource:0}: Error finding container 4989a2ecee24254fe4aca7e76f1e2e40b733be1e70cbeec7d60131da54ab2fd4: Status 404 returned error can't find the container with id 4989a2ecee24254fe4aca7e76f1e2e40b733be1e70cbeec7d60131da54ab2fd4 Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.924152 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:07:19 crc kubenswrapper[4885]: I0308 21:07:19.992012 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"95d9d37c-0204-47e9-956d-d93f2dd1e94d","Type":"ContainerStarted","Data":"4989a2ecee24254fe4aca7e76f1e2e40b733be1e70cbeec7d60131da54ab2fd4"} Mar 08 21:07:21 crc kubenswrapper[4885]: I0308 21:07:21.004165 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"95d9d37c-0204-47e9-956d-d93f2dd1e94d","Type":"ContainerStarted","Data":"a2acd6bf101df1fb3fec18f77f85983c836074e8e9c6420f407dfa0fe15f85c1"} Mar 08 21:07:21 crc kubenswrapper[4885]: I0308 21:07:21.005572 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:21 crc kubenswrapper[4885]: I0308 21:07:21.046444 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.046426583 podStartE2EDuration="2.046426583s" podCreationTimestamp="2026-03-08 21:07:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:21.034127365 +0000 UTC m=+5742.430181388" watchObservedRunningTime="2026-03-08 
21:07:21.046426583 +0000 UTC m=+5742.442480606" Mar 08 21:07:22 crc kubenswrapper[4885]: E0308 21:07:22.381712 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a29a091_3ebc_4dbb_b876_19892bedba02.slice/crio-conmon-6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd.scope\": RecentStats: unable to find data in memory cache]" Mar 08 21:07:22 crc kubenswrapper[4885]: I0308 21:07:22.396456 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:07:22 crc kubenswrapper[4885]: I0308 21:07:22.396890 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:07:22 crc kubenswrapper[4885]: I0308 21:07:22.829449 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:07:22 crc kubenswrapper[4885]: I0308 21:07:22.993187 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-config-data\") pod \"6a29a091-3ebc-4dbb-b876-19892bedba02\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " Mar 08 21:07:22 crc kubenswrapper[4885]: I0308 21:07:22.993665 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-combined-ca-bundle\") pod \"6a29a091-3ebc-4dbb-b876-19892bedba02\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " Mar 08 21:07:22 crc kubenswrapper[4885]: I0308 21:07:22.993793 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sf6d\" (UniqueName: \"kubernetes.io/projected/6a29a091-3ebc-4dbb-b876-19892bedba02-kube-api-access-5sf6d\") pod \"6a29a091-3ebc-4dbb-b876-19892bedba02\" (UID: \"6a29a091-3ebc-4dbb-b876-19892bedba02\") " Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.009030 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a29a091-3ebc-4dbb-b876-19892bedba02-kube-api-access-5sf6d" (OuterVolumeSpecName: "kube-api-access-5sf6d") pod "6a29a091-3ebc-4dbb-b876-19892bedba02" (UID: "6a29a091-3ebc-4dbb-b876-19892bedba02"). InnerVolumeSpecName "kube-api-access-5sf6d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.021890 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a29a091-3ebc-4dbb-b876-19892bedba02" (UID: "6a29a091-3ebc-4dbb-b876-19892bedba02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.027202 4885 generic.go:334] "Generic (PLEG): container finished" podID="6a29a091-3ebc-4dbb-b876-19892bedba02" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" exitCode=0 Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.027288 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.027368 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a29a091-3ebc-4dbb-b876-19892bedba02","Type":"ContainerDied","Data":"6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd"} Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.027433 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6a29a091-3ebc-4dbb-b876-19892bedba02","Type":"ContainerDied","Data":"3c576770d28108955fb30624b271f23df81e0516b1d08085cc88a640253a2d1e"} Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.027491 4885 scope.go:117] "RemoveContainer" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.034319 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-config-data" (OuterVolumeSpecName: "config-data") pod "6a29a091-3ebc-4dbb-b876-19892bedba02" (UID: "6a29a091-3ebc-4dbb-b876-19892bedba02"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.083184 4885 scope.go:117] "RemoveContainer" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" Mar 08 21:07:23 crc kubenswrapper[4885]: E0308 21:07:23.083665 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd\": container with ID starting with 6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd not found: ID does not exist" containerID="6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.083702 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd"} err="failed to get container status \"6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd\": rpc error: code = NotFound desc = could not find container \"6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd\": container with ID starting with 6509d7152b63a7980942aeabb400020ac7192f606fe1541f1a6a8cf8042f42bd not found: ID does not exist" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.096251 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.096293 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sf6d\" (UniqueName: \"kubernetes.io/projected/6a29a091-3ebc-4dbb-b876-19892bedba02-kube-api-access-5sf6d\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.096313 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a29a091-3ebc-4dbb-b876-19892bedba02-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.410304 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.437009 4885 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.446341 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:07:23 crc kubenswrapper[4885]: E0308 21:07:23.446941 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a29a091-3ebc-4dbb-b876-19892bedba02" containerName="nova-scheduler-scheduler" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.446965 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a29a091-3ebc-4dbb-b876-19892bedba02" containerName="nova-scheduler-scheduler" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.447203 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a29a091-3ebc-4dbb-b876-19892bedba02" containerName="nova-scheduler-scheduler" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.448515 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.451544 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.459099 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.610785 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.611000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-config-data\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.611083 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz7p5\" (UniqueName: \"kubernetes.io/projected/e5c41752-6a6f-4bbf-882f-a1e873cd225f-kube-api-access-lz7p5\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.713022 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.713405 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-config-data\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.713598 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz7p5\" (UniqueName: \"kubernetes.io/projected/e5c41752-6a6f-4bbf-882f-a1e873cd225f-kube-api-access-lz7p5\") pod \"nova-scheduler-0\" (UID: 
\"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.717551 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.725612 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-config-data\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.743267 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz7p5\" (UniqueName: \"kubernetes.io/projected/e5c41752-6a6f-4bbf-882f-a1e873cd225f-kube-api-access-lz7p5\") pod \"nova-scheduler-0\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " pod="openstack/nova-scheduler-0" Mar 08 21:07:23 crc kubenswrapper[4885]: I0308 21:07:23.764191 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:07:24 crc kubenswrapper[4885]: I0308 21:07:24.227114 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:24 crc kubenswrapper[4885]: I0308 21:07:24.242700 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:24 crc kubenswrapper[4885]: W0308 21:07:24.331110 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5c41752_6a6f_4bbf_882f_a1e873cd225f.slice/crio-ef5f617fb77787529ec827122c4f886ba5c0cc15c252f2142be0c62ea54a65b1 WatchSource:0}: Error finding container ef5f617fb77787529ec827122c4f886ba5c0cc15c252f2142be0c62ea54a65b1: Status 404 returned error can't find the container with id ef5f617fb77787529ec827122c4f886ba5c0cc15c252f2142be0c62ea54a65b1 Mar 08 21:07:24 crc kubenswrapper[4885]: I0308 21:07:24.333646 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:07:25 crc kubenswrapper[4885]: I0308 21:07:25.062240 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e5c41752-6a6f-4bbf-882f-a1e873cd225f","Type":"ContainerStarted","Data":"949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403"} Mar 08 21:07:25 crc kubenswrapper[4885]: I0308 21:07:25.062749 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e5c41752-6a6f-4bbf-882f-a1e873cd225f","Type":"ContainerStarted","Data":"ef5f617fb77787529ec827122c4f886ba5c0cc15c252f2142be0c62ea54a65b1"} Mar 08 21:07:25 crc kubenswrapper[4885]: I0308 21:07:25.077816 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 08 21:07:25 crc kubenswrapper[4885]: I0308 21:07:25.096519 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.096503522 podStartE2EDuration="2.096503522s" podCreationTimestamp="2026-03-08 21:07:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-08 21:07:25.085754815 +0000 UTC m=+5746.481808838" watchObservedRunningTime="2026-03-08 21:07:25.096503522 +0000 UTC m=+5746.492557545" Mar 08 21:07:25 crc kubenswrapper[4885]: I0308 21:07:25.284431 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 08 21:07:25 crc kubenswrapper[4885]: I0308 21:07:25.376743 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a29a091-3ebc-4dbb-b876-19892bedba02" path="/var/lib/kubelet/pods/6a29a091-3ebc-4dbb-b876-19892bedba02/volumes" Mar 08 21:07:27 crc kubenswrapper[4885]: I0308 21:07:27.395690 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:07:27 crc kubenswrapper[4885]: I0308 21:07:27.396008 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:07:27 crc kubenswrapper[4885]: I0308 21:07:27.437202 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 21:07:27 crc kubenswrapper[4885]: I0308 21:07:27.437255 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 21:07:28 crc kubenswrapper[4885]: I0308 21:07:28.436226 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.127:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:07:28 crc kubenswrapper[4885]: I0308 21:07:28.561187 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.128:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:07:28 crc kubenswrapper[4885]: I0308 21:07:28.561302 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.127:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:07:28 crc kubenswrapper[4885]: I0308 21:07:28.561392 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.128:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:07:28 crc kubenswrapper[4885]: I0308 21:07:28.764611 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 21:07:29 crc kubenswrapper[4885]: I0308 21:07:29.470938 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.044249 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.046963 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.050018 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.075322 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.160524 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9txnv\" (UniqueName: \"kubernetes.io/projected/13978f90-1bb0-4f18-9094-6bbbafc7dd21-kube-api-access-9txnv\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.161157 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.161276 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.161470 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13978f90-1bb0-4f18-9094-6bbbafc7dd21-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.161600 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-scripts\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.161944 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.264735 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9txnv\" (UniqueName: \"kubernetes.io/projected/13978f90-1bb0-4f18-9094-6bbbafc7dd21-kube-api-access-9txnv\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.265139 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.265352 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.265528 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13978f90-1bb0-4f18-9094-6bbbafc7dd21-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.265713 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-scripts\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.265900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.266489 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13978f90-1bb0-4f18-9094-6bbbafc7dd21-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.273368 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.274373 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-scripts\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.275223 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.276041 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.293315 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9txnv\" (UniqueName: \"kubernetes.io/projected/13978f90-1bb0-4f18-9094-6bbbafc7dd21-kube-api-access-9txnv\") pod \"cinder-scheduler-0\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 
crc kubenswrapper[4885]: I0308 21:07:31.378547 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 21:07:31 crc kubenswrapper[4885]: I0308 21:07:31.864675 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 21:07:31 crc kubenswrapper[4885]: W0308 21:07:31.903148 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13978f90_1bb0_4f18_9094_6bbbafc7dd21.slice/crio-0d1992fe6e0f61053dcf0864790c4f6f70e8a935b8f3ec31679a88218c2e2731 WatchSource:0}: Error finding container 0d1992fe6e0f61053dcf0864790c4f6f70e8a935b8f3ec31679a88218c2e2731: Status 404 returned error can't find the container with id 0d1992fe6e0f61053dcf0864790c4f6f70e8a935b8f3ec31679a88218c2e2731 Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.140677 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13978f90-1bb0-4f18-9094-6bbbafc7dd21","Type":"ContainerStarted","Data":"0d1992fe6e0f61053dcf0864790c4f6f70e8a935b8f3ec31679a88218c2e2731"} Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.300645 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.300912 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api-log" containerID="cri-o://ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c" gracePeriod=30 Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.300996 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api" containerID="cri-o://8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83" gracePeriod=30 Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.627407 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.629177 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.631592 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.674337 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.791869 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.791994 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792081 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792156 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792173 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792282 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-run\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792420 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bpjz\" (UniqueName: \"kubernetes.io/projected/954ec951-d955-4335-93bb-d43e59408ae3-kube-api-access-6bpjz\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792455 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc 
kubenswrapper[4885]: I0308 21:07:32.792485 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792525 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/954ec951-d955-4335-93bb-d43e59408ae3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792574 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792618 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792639 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792658 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792691 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.792834 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895188 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895545 4885 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895597 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895632 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895652 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895674 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-run\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895720 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bpjz\" (UniqueName: \"kubernetes.io/projected/954ec951-d955-4335-93bb-d43e59408ae3-kube-api-access-6bpjz\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895740 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895764 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895789 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/954ec951-d955-4335-93bb-d43e59408ae3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895814 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-scripts\") pod \"cinder-volume-volume1-0\" (UID: 
\"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895868 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895890 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895914 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.895979 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896021 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896265 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896371 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896418 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896483 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896687 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896745 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-run\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896773 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.896904 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.897332 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/954ec951-d955-4335-93bb-d43e59408ae3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.900104 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.901054 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.901200 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.902596 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/954ec951-d955-4335-93bb-d43e59408ae3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.910608 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/954ec951-d955-4335-93bb-d43e59408ae3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.914917 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bpjz\" (UniqueName: \"kubernetes.io/projected/954ec951-d955-4335-93bb-d43e59408ae3-kube-api-access-6bpjz\") pod \"cinder-volume-volume1-0\" (UID: \"954ec951-d955-4335-93bb-d43e59408ae3\") " pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:32 crc kubenswrapper[4885]: I0308 21:07:32.988045 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.153946 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13978f90-1bb0-4f18-9094-6bbbafc7dd21","Type":"ContainerStarted","Data":"9881abb2fc54dccce28e0bf4a5673dd83ed488ea4488884d6a3d9befedf24e2b"} Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.153998 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13978f90-1bb0-4f18-9094-6bbbafc7dd21","Type":"ContainerStarted","Data":"1fe3223e4f4132ee0aa233bdece638ddefaee60de254e8e01f53c8649e9e80d2"} Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.176807 4885 generic.go:334] "Generic (PLEG): container finished" podID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerID="ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c" exitCode=143 Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.177667 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a0f209-cabb-4b78-8e14-17625407e49d","Type":"ContainerDied","Data":"ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c"} Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.187906 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.187883926 podStartE2EDuration="2.187883926s" podCreationTimestamp="2026-03-08 21:07:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:33.183576341 +0000 UTC m=+5754.579630354" watchObservedRunningTime="2026-03-08 21:07:33.187883926 +0000 UTC m=+5754.583937959" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.255674 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.275485 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.275606 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.280856 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.383486 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405558 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-config-data-custom\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405597 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405620 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-run\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405636 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405670 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405687 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405710 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-ceph\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405741 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-scripts\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405833 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405867 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-dev\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405896 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d65tg\" (UniqueName: \"kubernetes.io/projected/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-kube-api-access-d65tg\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405953 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-sys\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.405980 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-nvme\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.406011 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-lib-modules\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.406060 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-config-data\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.406093 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.507884 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-dev\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.507943 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d65tg\" (UniqueName: \"kubernetes.io/projected/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-kube-api-access-d65tg\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 
08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.507967 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-sys\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.507985 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-nvme\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508014 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-lib-modules\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508040 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-config-data\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508069 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508097 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-config-data-custom\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508120 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508137 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508155 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-run\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508190 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " 
pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508205 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508225 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-ceph\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508254 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-scripts\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508271 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508344 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508377 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-dev\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508631 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-sys\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508672 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-nvme\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.508693 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-lib-modules\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.509365 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-run\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.509400 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.510057 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.510720 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.512047 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.515358 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-ceph\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.515461 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-scripts\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.518734 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-config-data-custom\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.519364 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-config-data\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.520018 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.524221 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d65tg\" (UniqueName: \"kubernetes.io/projected/5eb198a5-6241-48b0-bc8c-57ad764a1f3b-kube-api-access-d65tg\") pod \"cinder-backup-0\" (UID: \"5eb198a5-6241-48b0-bc8c-57ad764a1f3b\") " pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.642597 4885 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.765006 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 21:07:33 crc kubenswrapper[4885]: I0308 21:07:33.802217 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 08 21:07:34 crc kubenswrapper[4885]: I0308 21:07:34.201032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"954ec951-d955-4335-93bb-d43e59408ae3","Type":"ContainerStarted","Data":"e6925dbb4d4c2d2645ba7fe3b89359e6a91429076db5f03f49cb0e1ef3e714bb"} Mar 08 21:07:34 crc kubenswrapper[4885]: I0308 21:07:34.260287 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 21:07:34 crc kubenswrapper[4885]: I0308 21:07:34.340271 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Mar 08 21:07:34 crc kubenswrapper[4885]: W0308 21:07:34.458064 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5eb198a5_6241_48b0_bc8c_57ad764a1f3b.slice/crio-73d922692a4f325773d69803802351a5f0faa13123ce224b57b34937a18ac70a WatchSource:0}: Error finding container 73d922692a4f325773d69803802351a5f0faa13123ce224b57b34937a18ac70a: Status 404 returned error can't find the container with id 73d922692a4f325773d69803802351a5f0faa13123ce224b57b34937a18ac70a Mar 08 21:07:35 crc kubenswrapper[4885]: I0308 21:07:35.213766 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"954ec951-d955-4335-93bb-d43e59408ae3","Type":"ContainerStarted","Data":"f65dd3c1291428b7f45a33e87497faa12831a7ebb83ceff464a7e0aa5ae65bb1"} Mar 08 21:07:35 crc kubenswrapper[4885]: I0308 21:07:35.215604 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"954ec951-d955-4335-93bb-d43e59408ae3","Type":"ContainerStarted","Data":"19d24ae5f67c03e1da8f4df4964cafb82b3621b9126e14acec3d80f16e854646"} Mar 08 21:07:35 crc kubenswrapper[4885]: I0308 21:07:35.219556 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5eb198a5-6241-48b0-bc8c-57ad764a1f3b","Type":"ContainerStarted","Data":"73d922692a4f325773d69803802351a5f0faa13123ce224b57b34937a18ac70a"} Mar 08 21:07:35 crc kubenswrapper[4885]: I0308 21:07:35.260345 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.132550043 podStartE2EDuration="3.260320744s" podCreationTimestamp="2026-03-08 21:07:32 +0000 UTC" firstStartedPulling="2026-03-08 21:07:33.377973376 +0000 UTC m=+5754.774027399" lastFinishedPulling="2026-03-08 21:07:34.505744047 +0000 UTC m=+5755.901798100" observedRunningTime="2026-03-08 21:07:35.250433201 +0000 UTC m=+5756.646487224" watchObservedRunningTime="2026-03-08 21:07:35.260320744 +0000 UTC m=+5756.656374767" Mar 08 21:07:35 crc kubenswrapper[4885]: I0308 21:07:35.934234 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.063202 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data\") pod \"a4a0f209-cabb-4b78-8e14-17625407e49d\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.063535 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-scripts\") pod \"a4a0f209-cabb-4b78-8e14-17625407e49d\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.063562 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-combined-ca-bundle\") pod \"a4a0f209-cabb-4b78-8e14-17625407e49d\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.063620 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4a0f209-cabb-4b78-8e14-17625407e49d-etc-machine-id\") pod \"a4a0f209-cabb-4b78-8e14-17625407e49d\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.063652 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rftcn\" (UniqueName: \"kubernetes.io/projected/a4a0f209-cabb-4b78-8e14-17625407e49d-kube-api-access-rftcn\") pod \"a4a0f209-cabb-4b78-8e14-17625407e49d\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.063733 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a0f209-cabb-4b78-8e14-17625407e49d-logs\") pod \"a4a0f209-cabb-4b78-8e14-17625407e49d\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.063784 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data-custom\") pod \"a4a0f209-cabb-4b78-8e14-17625407e49d\" (UID: \"a4a0f209-cabb-4b78-8e14-17625407e49d\") " Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.064289 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4a0f209-cabb-4b78-8e14-17625407e49d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a4a0f209-cabb-4b78-8e14-17625407e49d" (UID: "a4a0f209-cabb-4b78-8e14-17625407e49d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.065019 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4a0f209-cabb-4b78-8e14-17625407e49d-logs" (OuterVolumeSpecName: "logs") pod "a4a0f209-cabb-4b78-8e14-17625407e49d" (UID: "a4a0f209-cabb-4b78-8e14-17625407e49d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.065714 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4a0f209-cabb-4b78-8e14-17625407e49d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.069401 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-scripts" (OuterVolumeSpecName: "scripts") pod "a4a0f209-cabb-4b78-8e14-17625407e49d" (UID: "a4a0f209-cabb-4b78-8e14-17625407e49d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.069502 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "a4a0f209-cabb-4b78-8e14-17625407e49d" (UID: "a4a0f209-cabb-4b78-8e14-17625407e49d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.071104 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4a0f209-cabb-4b78-8e14-17625407e49d-kube-api-access-rftcn" (OuterVolumeSpecName: "kube-api-access-rftcn") pod "a4a0f209-cabb-4b78-8e14-17625407e49d" (UID: "a4a0f209-cabb-4b78-8e14-17625407e49d"). InnerVolumeSpecName "kube-api-access-rftcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.100079 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4a0f209-cabb-4b78-8e14-17625407e49d" (UID: "a4a0f209-cabb-4b78-8e14-17625407e49d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.111227 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data" (OuterVolumeSpecName: "config-data") pod "a4a0f209-cabb-4b78-8e14-17625407e49d" (UID: "a4a0f209-cabb-4b78-8e14-17625407e49d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.168480 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.168521 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.168534 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.168547 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rftcn\" (UniqueName: \"kubernetes.io/projected/a4a0f209-cabb-4b78-8e14-17625407e49d-kube-api-access-rftcn\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.168562 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4a0f209-cabb-4b78-8e14-17625407e49d-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.168574 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4a0f209-cabb-4b78-8e14-17625407e49d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.231514 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5eb198a5-6241-48b0-bc8c-57ad764a1f3b","Type":"ContainerStarted","Data":"5602fc306631a848e72138dff41bdfb61a61f06959859effd5f35781d85b851a"} Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.231561 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"5eb198a5-6241-48b0-bc8c-57ad764a1f3b","Type":"ContainerStarted","Data":"d68b540453dd58e109792497aa02a9665490650e6b7a8937a0a76f89af9ce4fd"} Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.233694 4885 generic.go:334] "Generic (PLEG): container finished" podID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerID="8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83" exitCode=0 Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.234452 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.235671 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a0f209-cabb-4b78-8e14-17625407e49d","Type":"ContainerDied","Data":"8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83"} Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.235712 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4a0f209-cabb-4b78-8e14-17625407e49d","Type":"ContainerDied","Data":"73a221bfa90434743d951101b28f8fc6f753d32cb93f954ef81be7dadb45dff1"} Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.235728 4885 scope.go:117] "RemoveContainer" containerID="8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.276902 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.586009922 podStartE2EDuration="3.276885239s" podCreationTimestamp="2026-03-08 21:07:33 +0000 UTC" firstStartedPulling="2026-03-08 21:07:34.460991874 +0000 UTC m=+5755.857045907" lastFinishedPulling="2026-03-08 21:07:35.151867191 +0000 UTC m=+5756.547921224" observedRunningTime="2026-03-08 21:07:36.25813657 +0000 UTC m=+5757.654190623" watchObservedRunningTime="2026-03-08 21:07:36.276885239 +0000 UTC m=+5757.672939252" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.281025 4885 scope.go:117] "RemoveContainer" containerID="ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.291008 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.299403 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.315475 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:36 crc kubenswrapper[4885]: E0308 21:07:36.316667 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.316686 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api" Mar 08 21:07:36 crc kubenswrapper[4885]: E0308 21:07:36.316711 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api-log" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.316720 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api-log" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.317271 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.317295 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" containerName="cinder-api-log" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.319887 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.322304 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.335757 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.350002 4885 scope.go:117] "RemoveContainer" containerID="8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83" Mar 08 21:07:36 crc kubenswrapper[4885]: E0308 21:07:36.350716 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83\": container with ID starting with 8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83 not found: ID does not exist" containerID="8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.350753 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83"} err="failed to get container status \"8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83\": rpc error: code = NotFound desc = could not find container \"8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83\": container with ID starting with 8f9aaff4eecabd6066aa43c7b6cb4436b28f03d92baff6f4546fee193b16ee83 not found: ID does not exist" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.350780 4885 scope.go:117] "RemoveContainer" containerID="ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c" Mar 08 21:07:36 crc kubenswrapper[4885]: E0308 21:07:36.351847 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c\": container with ID starting with ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c not found: ID does not exist" containerID="ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.351869 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c"} err="failed to get container status \"ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c\": rpc error: code = NotFound desc = could not find container \"ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c\": container with ID starting with ec8f070433e9be72650047af24edf39d7e7ea5d4895804ab88bed0686796134c not found: ID does not exist" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.378624 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.481144 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/216289ea-1f99-4924-aa6b-9951b3b3840e-logs\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.481254 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-config-data\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.481291 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/216289ea-1f99-4924-aa6b-9951b3b3840e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.481317 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-config-data-custom\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.481343 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.481401 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-scripts\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.481501 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97ts9\" (UniqueName: \"kubernetes.io/projected/216289ea-1f99-4924-aa6b-9951b3b3840e-kube-api-access-97ts9\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.583006 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-config-data\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.583125 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/216289ea-1f99-4924-aa6b-9951b3b3840e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.583235 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/216289ea-1f99-4924-aa6b-9951b3b3840e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.583332 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-config-data-custom\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.584053 
4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.584134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-scripts\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.584234 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97ts9\" (UniqueName: \"kubernetes.io/projected/216289ea-1f99-4924-aa6b-9951b3b3840e-kube-api-access-97ts9\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.584379 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/216289ea-1f99-4924-aa6b-9951b3b3840e-logs\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.585291 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/216289ea-1f99-4924-aa6b-9951b3b3840e-logs\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.589245 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.589280 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-config-data\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.589957 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-config-data-custom\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.604855 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97ts9\" (UniqueName: \"kubernetes.io/projected/216289ea-1f99-4924-aa6b-9951b3b3840e-kube-api-access-97ts9\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.610163 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/216289ea-1f99-4924-aa6b-9951b3b3840e-scripts\") pod \"cinder-api-0\" (UID: \"216289ea-1f99-4924-aa6b-9951b3b3840e\") " pod="openstack/cinder-api-0" Mar 08 21:07:36 crc kubenswrapper[4885]: I0308 21:07:36.658254 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.170950 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.259879 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"216289ea-1f99-4924-aa6b-9951b3b3840e","Type":"ContainerStarted","Data":"26eadba5e089e69115d1c5ae218bbe8e990ab92af7d65ef8c3353b5cb9e20b6a"} Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.379991 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4a0f209-cabb-4b78-8e14-17625407e49d" path="/var/lib/kubelet/pods/a4a0f209-cabb-4b78-8e14-17625407e49d/volumes" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.397859 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.399963 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.402838 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.443751 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.444160 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.448206 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.448375 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 21:07:37 crc kubenswrapper[4885]: I0308 21:07:37.988910 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:38 crc kubenswrapper[4885]: I0308 21:07:38.272190 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"216289ea-1f99-4924-aa6b-9951b3b3840e","Type":"ContainerStarted","Data":"bd524370eea870813a7c3754ca1646805cc89fdc3e8f0f15f92f53813e1b0f05"} Mar 08 21:07:38 crc kubenswrapper[4885]: I0308 21:07:38.272636 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 21:07:38 crc kubenswrapper[4885]: I0308 21:07:38.277456 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 21:07:38 crc kubenswrapper[4885]: I0308 21:07:38.277602 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 21:07:38 crc kubenswrapper[4885]: I0308 21:07:38.643729 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Mar 08 21:07:39 crc kubenswrapper[4885]: I0308 21:07:39.289457 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"216289ea-1f99-4924-aa6b-9951b3b3840e","Type":"ContainerStarted","Data":"de854a5b5ca38c81d341cd7433e27205131779dbdd929e60f852088e1e045149"} Mar 08 21:07:39 crc kubenswrapper[4885]: I0308 21:07:39.290107 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 08 21:07:39 crc kubenswrapper[4885]: 
I0308 21:07:39.327370 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.327338955 podStartE2EDuration="3.327338955s" podCreationTimestamp="2026-03-08 21:07:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:39.319455705 +0000 UTC m=+5760.715509738" watchObservedRunningTime="2026-03-08 21:07:39.327338955 +0000 UTC m=+5760.723393008" Mar 08 21:07:41 crc kubenswrapper[4885]: I0308 21:07:41.663275 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 08 21:07:41 crc kubenswrapper[4885]: I0308 21:07:41.766798 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 21:07:42 crc kubenswrapper[4885]: I0308 21:07:42.349003 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="cinder-scheduler" containerID="cri-o://1fe3223e4f4132ee0aa233bdece638ddefaee60de254e8e01f53c8649e9e80d2" gracePeriod=30 Mar 08 21:07:42 crc kubenswrapper[4885]: I0308 21:07:42.349088 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="probe" containerID="cri-o://9881abb2fc54dccce28e0bf4a5673dd83ed488ea4488884d6a3d9befedf24e2b" gracePeriod=30 Mar 08 21:07:43 crc kubenswrapper[4885]: I0308 21:07:43.167722 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Mar 08 21:07:43 crc kubenswrapper[4885]: I0308 21:07:43.362545 4885 generic.go:334] "Generic (PLEG): container finished" podID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerID="9881abb2fc54dccce28e0bf4a5673dd83ed488ea4488884d6a3d9befedf24e2b" exitCode=0 Mar 08 21:07:43 crc kubenswrapper[4885]: I0308 21:07:43.362595 4885 generic.go:334] "Generic (PLEG): container finished" podID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerID="1fe3223e4f4132ee0aa233bdece638ddefaee60de254e8e01f53c8649e9e80d2" exitCode=0 Mar 08 21:07:43 crc kubenswrapper[4885]: I0308 21:07:43.362623 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13978f90-1bb0-4f18-9094-6bbbafc7dd21","Type":"ContainerDied","Data":"9881abb2fc54dccce28e0bf4a5673dd83ed488ea4488884d6a3d9befedf24e2b"} Mar 08 21:07:43 crc kubenswrapper[4885]: I0308 21:07:43.362661 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13978f90-1bb0-4f18-9094-6bbbafc7dd21","Type":"ContainerDied","Data":"1fe3223e4f4132ee0aa233bdece638ddefaee60de254e8e01f53c8649e9e80d2"} Mar 08 21:07:43 crc kubenswrapper[4885]: I0308 21:07:43.906091 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Mar 08 21:07:43 crc kubenswrapper[4885]: I0308 21:07:43.916700 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.056950 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-scripts\") pod \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.057022 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data\") pod \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.057120 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-combined-ca-bundle\") pod \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.057149 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9txnv\" (UniqueName: \"kubernetes.io/projected/13978f90-1bb0-4f18-9094-6bbbafc7dd21-kube-api-access-9txnv\") pod \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.057201 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13978f90-1bb0-4f18-9094-6bbbafc7dd21-etc-machine-id\") pod \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.057280 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data-custom\") pod \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\" (UID: \"13978f90-1bb0-4f18-9094-6bbbafc7dd21\") " Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.057584 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13978f90-1bb0-4f18-9094-6bbbafc7dd21-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "13978f90-1bb0-4f18-9094-6bbbafc7dd21" (UID: "13978f90-1bb0-4f18-9094-6bbbafc7dd21"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.057975 4885 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13978f90-1bb0-4f18-9094-6bbbafc7dd21-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.063827 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-scripts" (OuterVolumeSpecName: "scripts") pod "13978f90-1bb0-4f18-9094-6bbbafc7dd21" (UID: "13978f90-1bb0-4f18-9094-6bbbafc7dd21"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.067616 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "13978f90-1bb0-4f18-9094-6bbbafc7dd21" (UID: "13978f90-1bb0-4f18-9094-6bbbafc7dd21"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.075359 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13978f90-1bb0-4f18-9094-6bbbafc7dd21-kube-api-access-9txnv" (OuterVolumeSpecName: "kube-api-access-9txnv") pod "13978f90-1bb0-4f18-9094-6bbbafc7dd21" (UID: "13978f90-1bb0-4f18-9094-6bbbafc7dd21"). InnerVolumeSpecName "kube-api-access-9txnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.120474 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13978f90-1bb0-4f18-9094-6bbbafc7dd21" (UID: "13978f90-1bb0-4f18-9094-6bbbafc7dd21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.159477 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9txnv\" (UniqueName: \"kubernetes.io/projected/13978f90-1bb0-4f18-9094-6bbbafc7dd21-kube-api-access-9txnv\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.159797 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.159887 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.159993 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.165108 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data" (OuterVolumeSpecName: "config-data") pod "13978f90-1bb0-4f18-9094-6bbbafc7dd21" (UID: "13978f90-1bb0-4f18-9094-6bbbafc7dd21"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.261943 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13978f90-1bb0-4f18-9094-6bbbafc7dd21-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.394835 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"13978f90-1bb0-4f18-9094-6bbbafc7dd21","Type":"ContainerDied","Data":"0d1992fe6e0f61053dcf0864790c4f6f70e8a935b8f3ec31679a88218c2e2731"} Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.394905 4885 scope.go:117] "RemoveContainer" containerID="9881abb2fc54dccce28e0bf4a5673dd83ed488ea4488884d6a3d9befedf24e2b" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.395136 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.441551 4885 scope.go:117] "RemoveContainer" containerID="1fe3223e4f4132ee0aa233bdece638ddefaee60de254e8e01f53c8649e9e80d2" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.449414 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.481104 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.501430 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 21:07:44 crc kubenswrapper[4885]: E0308 21:07:44.501955 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="probe" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.501978 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="probe" Mar 08 21:07:44 crc kubenswrapper[4885]: E0308 21:07:44.501998 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="cinder-scheduler" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.502006 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="cinder-scheduler" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.502243 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="probe" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.502266 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" containerName="cinder-scheduler" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.507000 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.509816 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.533415 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.678572 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.678653 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-config-data\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.678686 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7620c463-ffe0-4d70-ba82-deaef34da248-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.678856 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-scripts\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.678909 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7zzw\" (UniqueName: \"kubernetes.io/projected/7620c463-ffe0-4d70-ba82-deaef34da248-kube-api-access-g7zzw\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.679122 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.780781 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.780880 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.780965 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-config-data\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.780997 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7620c463-ffe0-4d70-ba82-deaef34da248-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.781078 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-scripts\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.781113 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7zzw\" (UniqueName: \"kubernetes.io/projected/7620c463-ffe0-4d70-ba82-deaef34da248-kube-api-access-g7zzw\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.781781 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7620c463-ffe0-4d70-ba82-deaef34da248-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.785287 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.786228 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-config-data\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.793811 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-scripts\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.794408 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7620c463-ffe0-4d70-ba82-deaef34da248-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc kubenswrapper[4885]: I0308 21:07:44.810729 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7zzw\" (UniqueName: \"kubernetes.io/projected/7620c463-ffe0-4d70-ba82-deaef34da248-kube-api-access-g7zzw\") pod \"cinder-scheduler-0\" (UID: \"7620c463-ffe0-4d70-ba82-deaef34da248\") " pod="openstack/cinder-scheduler-0" Mar 08 21:07:44 crc 
kubenswrapper[4885]: I0308 21:07:44.829410 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 08 21:07:45 crc kubenswrapper[4885]: I0308 21:07:45.340696 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 08 21:07:45 crc kubenswrapper[4885]: W0308 21:07:45.349334 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7620c463_ffe0_4d70_ba82_deaef34da248.slice/crio-756037f3c75add192500cc5c007cc0c20b95aad2e7d6a133df86b5b8ee4a6efc WatchSource:0}: Error finding container 756037f3c75add192500cc5c007cc0c20b95aad2e7d6a133df86b5b8ee4a6efc: Status 404 returned error can't find the container with id 756037f3c75add192500cc5c007cc0c20b95aad2e7d6a133df86b5b8ee4a6efc Mar 08 21:07:45 crc kubenswrapper[4885]: I0308 21:07:45.381134 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13978f90-1bb0-4f18-9094-6bbbafc7dd21" path="/var/lib/kubelet/pods/13978f90-1bb0-4f18-9094-6bbbafc7dd21/volumes" Mar 08 21:07:45 crc kubenswrapper[4885]: I0308 21:07:45.415304 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7620c463-ffe0-4d70-ba82-deaef34da248","Type":"ContainerStarted","Data":"756037f3c75add192500cc5c007cc0c20b95aad2e7d6a133df86b5b8ee4a6efc"} Mar 08 21:07:46 crc kubenswrapper[4885]: I0308 21:07:46.432769 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7620c463-ffe0-4d70-ba82-deaef34da248","Type":"ContainerStarted","Data":"bba7159d14310c79ddeaa2597e896a5f6fe510b649b0b9206814fe055de3ce19"} Mar 08 21:07:47 crc kubenswrapper[4885]: I0308 21:07:47.454135 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7620c463-ffe0-4d70-ba82-deaef34da248","Type":"ContainerStarted","Data":"2b0ded14f6da19f693e60f9e596950aa09eff6442661efedfd0b38eff3938cd9"} Mar 08 21:07:47 crc kubenswrapper[4885]: I0308 21:07:47.486586 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.486565668 podStartE2EDuration="3.486565668s" podCreationTimestamp="2026-03-08 21:07:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:07:47.482029817 +0000 UTC m=+5768.878083850" watchObservedRunningTime="2026-03-08 21:07:47.486565668 +0000 UTC m=+5768.882619701" Mar 08 21:07:48 crc kubenswrapper[4885]: I0308 21:07:48.450484 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 08 21:07:49 crc kubenswrapper[4885]: I0308 21:07:49.829797 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 08 21:07:55 crc kubenswrapper[4885]: I0308 21:07:55.114894 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.154062 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550068-8fldh"] Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.156705 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550068-8fldh" Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.160234 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.160301 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.161773 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.170900 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550068-8fldh"] Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.301464 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tkxg\" (UniqueName: \"kubernetes.io/projected/2a1f5d5d-f061-4187-a9ed-720b291774e5-kube-api-access-5tkxg\") pod \"auto-csr-approver-29550068-8fldh\" (UID: \"2a1f5d5d-f061-4187-a9ed-720b291774e5\") " pod="openshift-infra/auto-csr-approver-29550068-8fldh" Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.403943 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tkxg\" (UniqueName: \"kubernetes.io/projected/2a1f5d5d-f061-4187-a9ed-720b291774e5-kube-api-access-5tkxg\") pod \"auto-csr-approver-29550068-8fldh\" (UID: \"2a1f5d5d-f061-4187-a9ed-720b291774e5\") " pod="openshift-infra/auto-csr-approver-29550068-8fldh" Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.435903 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tkxg\" (UniqueName: \"kubernetes.io/projected/2a1f5d5d-f061-4187-a9ed-720b291774e5-kube-api-access-5tkxg\") pod \"auto-csr-approver-29550068-8fldh\" (UID: \"2a1f5d5d-f061-4187-a9ed-720b291774e5\") " pod="openshift-infra/auto-csr-approver-29550068-8fldh" Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.490636 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550068-8fldh" Mar 08 21:08:00 crc kubenswrapper[4885]: I0308 21:08:00.994364 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550068-8fldh"] Mar 08 21:08:01 crc kubenswrapper[4885]: I0308 21:08:01.651677 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550068-8fldh" event={"ID":"2a1f5d5d-f061-4187-a9ed-720b291774e5","Type":"ContainerStarted","Data":"c524e7b36661c6d5ca64f259558c9eefe8e482b0d59e246fed7dbb689fd4fcae"} Mar 08 21:08:02 crc kubenswrapper[4885]: I0308 21:08:02.675160 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550068-8fldh" event={"ID":"2a1f5d5d-f061-4187-a9ed-720b291774e5","Type":"ContainerStarted","Data":"dcaa0b7048fe6fee9e8064fda1b6f6cbab5d7f0172b9d8b64c22f15e682b913a"} Mar 08 21:08:02 crc kubenswrapper[4885]: I0308 21:08:02.696903 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550068-8fldh" podStartSLOduration=1.563059212 podStartE2EDuration="2.696882095s" podCreationTimestamp="2026-03-08 21:08:00 +0000 UTC" firstStartedPulling="2026-03-08 21:08:00.999610964 +0000 UTC m=+5782.395665027" lastFinishedPulling="2026-03-08 21:08:02.133433867 +0000 UTC m=+5783.529487910" observedRunningTime="2026-03-08 21:08:02.692228131 +0000 UTC m=+5784.088282154" watchObservedRunningTime="2026-03-08 21:08:02.696882095 +0000 UTC m=+5784.092936128" Mar 08 21:08:03 crc kubenswrapper[4885]: I0308 21:08:03.688662 4885 generic.go:334] "Generic (PLEG): container finished" podID="2a1f5d5d-f061-4187-a9ed-720b291774e5" containerID="dcaa0b7048fe6fee9e8064fda1b6f6cbab5d7f0172b9d8b64c22f15e682b913a" exitCode=0 Mar 08 21:08:03 crc kubenswrapper[4885]: I0308 21:08:03.688756 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550068-8fldh" event={"ID":"2a1f5d5d-f061-4187-a9ed-720b291774e5","Type":"ContainerDied","Data":"dcaa0b7048fe6fee9e8064fda1b6f6cbab5d7f0172b9d8b64c22f15e682b913a"} Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.066955 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550068-8fldh" Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.207853 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tkxg\" (UniqueName: \"kubernetes.io/projected/2a1f5d5d-f061-4187-a9ed-720b291774e5-kube-api-access-5tkxg\") pod \"2a1f5d5d-f061-4187-a9ed-720b291774e5\" (UID: \"2a1f5d5d-f061-4187-a9ed-720b291774e5\") " Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.221212 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a1f5d5d-f061-4187-a9ed-720b291774e5-kube-api-access-5tkxg" (OuterVolumeSpecName: "kube-api-access-5tkxg") pod "2a1f5d5d-f061-4187-a9ed-720b291774e5" (UID: "2a1f5d5d-f061-4187-a9ed-720b291774e5"). InnerVolumeSpecName "kube-api-access-5tkxg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.310305 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tkxg\" (UniqueName: \"kubernetes.io/projected/2a1f5d5d-f061-4187-a9ed-720b291774e5-kube-api-access-5tkxg\") on node \"crc\" DevicePath \"\"" Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.712797 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550068-8fldh" event={"ID":"2a1f5d5d-f061-4187-a9ed-720b291774e5","Type":"ContainerDied","Data":"c524e7b36661c6d5ca64f259558c9eefe8e482b0d59e246fed7dbb689fd4fcae"} Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.712841 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c524e7b36661c6d5ca64f259558c9eefe8e482b0d59e246fed7dbb689fd4fcae" Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.712870 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550068-8fldh" Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.778389 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550062-kdw5m"] Mar 08 21:08:05 crc kubenswrapper[4885]: I0308 21:08:05.787197 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550062-kdw5m"] Mar 08 21:08:07 crc kubenswrapper[4885]: I0308 21:08:07.385511 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86555f65-5ef4-4c45-9ac3-9b561d985b57" path="/var/lib/kubelet/pods/86555f65-5ef4-4c45-9ac3-9b561d985b57/volumes" Mar 08 21:08:32 crc kubenswrapper[4885]: I0308 21:08:32.818687 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:08:32 crc kubenswrapper[4885]: I0308 21:08:32.819571 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:09:02 crc kubenswrapper[4885]: I0308 21:09:02.818102 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:09:02 crc kubenswrapper[4885]: I0308 21:09:02.820105 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:09:06 crc kubenswrapper[4885]: I0308 21:09:06.008032 4885 scope.go:117] "RemoveContainer" containerID="f10312aca3cd4fa64b1d669949edd0e9d6f21408d583e376501255867513b217" Mar 08 21:09:06 crc kubenswrapper[4885]: I0308 21:09:06.082007 4885 scope.go:117] "RemoveContainer" containerID="a8688238cdfd1dcfb7637a786ad101466774be95a20bd05d6000a2fb72881a5c" Mar 08 21:09:06 crc 
kubenswrapper[4885]: I0308 21:09:06.105664 4885 scope.go:117] "RemoveContainer" containerID="22e2cca09cd12eb836eb1e30ec93ff3eca0cfdcebc523a4eeae36a7ba702ee56" Mar 08 21:09:26 crc kubenswrapper[4885]: I0308 21:09:26.050376 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c81b-account-create-update-sqjkr"] Mar 08 21:09:26 crc kubenswrapper[4885]: I0308 21:09:26.059022 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-dkqb5"] Mar 08 21:09:26 crc kubenswrapper[4885]: I0308 21:09:26.066896 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c81b-account-create-update-sqjkr"] Mar 08 21:09:26 crc kubenswrapper[4885]: I0308 21:09:26.073670 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-dkqb5"] Mar 08 21:09:27 crc kubenswrapper[4885]: I0308 21:09:27.405632 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af0fb78-1571-4090-a0e4-009deb2915a5" path="/var/lib/kubelet/pods/2af0fb78-1571-4090-a0e4-009deb2915a5/volumes" Mar 08 21:09:27 crc kubenswrapper[4885]: I0308 21:09:27.406631 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dadcfb24-7e2e-42d4-b4da-4567105c11ad" path="/var/lib/kubelet/pods/dadcfb24-7e2e-42d4-b4da-4567105c11ad/volumes" Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.433207 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m7crp"] Mar 08 21:09:28 crc kubenswrapper[4885]: E0308 21:09:28.433808 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a1f5d5d-f061-4187-a9ed-720b291774e5" containerName="oc" Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.433829 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a1f5d5d-f061-4187-a9ed-720b291774e5" containerName="oc" Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.434215 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a1f5d5d-f061-4187-a9ed-720b291774e5" containerName="oc" Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.436498 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.451295 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7crp"] Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.614011 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xncnq\" (UniqueName: \"kubernetes.io/projected/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-kube-api-access-xncnq\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.614123 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-catalog-content\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.614207 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-utilities\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.715468 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xncnq\" (UniqueName: \"kubernetes.io/projected/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-kube-api-access-xncnq\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.715541 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-catalog-content\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.715606 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-utilities\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.716178 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-utilities\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.716775 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-catalog-content\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.755593 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-xncnq\" (UniqueName: \"kubernetes.io/projected/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-kube-api-access-xncnq\") pod \"redhat-marketplace-m7crp\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:28 crc kubenswrapper[4885]: I0308 21:09:28.772532 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:29 crc kubenswrapper[4885]: I0308 21:09:29.276517 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7crp"] Mar 08 21:09:29 crc kubenswrapper[4885]: I0308 21:09:29.741477 4885 generic.go:334] "Generic (PLEG): container finished" podID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerID="2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1" exitCode=0 Mar 08 21:09:29 crc kubenswrapper[4885]: I0308 21:09:29.741559 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7crp" event={"ID":"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23","Type":"ContainerDied","Data":"2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1"} Mar 08 21:09:29 crc kubenswrapper[4885]: I0308 21:09:29.741624 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7crp" event={"ID":"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23","Type":"ContainerStarted","Data":"84590b4da479b1b889ba9e8ed38b36c22bd738109e3b3292bd5c830ce0a9b1f6"} Mar 08 21:09:29 crc kubenswrapper[4885]: I0308 21:09:29.744488 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:09:30 crc kubenswrapper[4885]: I0308 21:09:30.754276 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7crp" event={"ID":"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23","Type":"ContainerStarted","Data":"37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b"} Mar 08 21:09:31 crc kubenswrapper[4885]: I0308 21:09:31.770692 4885 generic.go:334] "Generic (PLEG): container finished" podID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerID="37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b" exitCode=0 Mar 08 21:09:31 crc kubenswrapper[4885]: I0308 21:09:31.770790 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7crp" event={"ID":"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23","Type":"ContainerDied","Data":"37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b"} Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.646476 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5jmft"] Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.647954 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.650176 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.652089 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-kwwpv" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.666127 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-b6j88"] Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.668473 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.693592 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5jmft"] Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.711001 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-b6j88"] Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.787599 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7crp" event={"ID":"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23","Type":"ContainerStarted","Data":"632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf"} Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.795752 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-log-ovn\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.795823 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00348ab8-7686-4e8d-bada-3d9e32edca19-scripts\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.795861 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-lib\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.795945 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-log\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.795970 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-run\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.795991 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f9fbe86b-d12b-4122-93b5-4cd373fca82b-scripts\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.796036 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-run-ovn\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.796084 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-run\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.796113 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc5pq\" (UniqueName: \"kubernetes.io/projected/f9fbe86b-d12b-4122-93b5-4cd373fca82b-kube-api-access-qc5pq\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.796146 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrx56\" (UniqueName: \"kubernetes.io/projected/00348ab8-7686-4e8d-bada-3d9e32edca19-kube-api-access-lrx56\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.796173 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-etc-ovs\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.806573 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m7crp" podStartSLOduration=2.355334176 podStartE2EDuration="4.806554998s" podCreationTimestamp="2026-03-08 21:09:28 +0000 UTC" firstStartedPulling="2026-03-08 21:09:29.744202895 +0000 UTC m=+5871.140256928" lastFinishedPulling="2026-03-08 21:09:32.195423687 +0000 UTC m=+5873.591477750" observedRunningTime="2026-03-08 21:09:32.800663131 +0000 UTC m=+5874.196717154" watchObservedRunningTime="2026-03-08 21:09:32.806554998 +0000 UTC m=+5874.202609021" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.817998 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.818059 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.818102 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.818771 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2cd11e6d776229da9098efb4d94ce67906e2c52e2199ae80ec12db171f7eadf"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.818827 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://f2cd11e6d776229da9098efb4d94ce67906e2c52e2199ae80ec12db171f7eadf" gracePeriod=600 Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.897834 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-run-ovn\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.897913 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-run\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.897944 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc5pq\" (UniqueName: \"kubernetes.io/projected/f9fbe86b-d12b-4122-93b5-4cd373fca82b-kube-api-access-qc5pq\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.897968 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrx56\" (UniqueName: \"kubernetes.io/projected/00348ab8-7686-4e8d-bada-3d9e32edca19-kube-api-access-lrx56\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.897985 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-etc-ovs\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898068 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-log-ovn\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898100 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00348ab8-7686-4e8d-bada-3d9e32edca19-scripts\") pod 
\"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-lib\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898167 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-run\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898182 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-log\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898197 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9fbe86b-d12b-4122-93b5-4cd373fca82b-scripts\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898699 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-run\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898723 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-lib\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898774 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-etc-ovs\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898775 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-log-ovn\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898813 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/00348ab8-7686-4e8d-bada-3d9e32edca19-var-run-ovn\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.898813 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-log\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.899022 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9fbe86b-d12b-4122-93b5-4cd373fca82b-var-run\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.900132 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9fbe86b-d12b-4122-93b5-4cd373fca82b-scripts\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.901804 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/00348ab8-7686-4e8d-bada-3d9e32edca19-scripts\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.920376 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc5pq\" (UniqueName: \"kubernetes.io/projected/f9fbe86b-d12b-4122-93b5-4cd373fca82b-kube-api-access-qc5pq\") pod \"ovn-controller-ovs-b6j88\" (UID: \"f9fbe86b-d12b-4122-93b5-4cd373fca82b\") " pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.920500 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrx56\" (UniqueName: \"kubernetes.io/projected/00348ab8-7686-4e8d-bada-3d9e32edca19-kube-api-access-lrx56\") pod \"ovn-controller-5jmft\" (UID: \"00348ab8-7686-4e8d-bada-3d9e32edca19\") " pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.963032 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5jmft" Mar 08 21:09:32 crc kubenswrapper[4885]: I0308 21:09:32.992260 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.058026 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-nbq5w"] Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.069507 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-nbq5w"] Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.379491 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a21d9a63-6439-41e2-915d-9ffa3d014a30" path="/var/lib/kubelet/pods/a21d9a63-6439-41e2-915d-9ffa3d014a30/volumes" Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.447289 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5jmft"] Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.803028 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jmft" event={"ID":"00348ab8-7686-4e8d-bada-3d9e32edca19","Type":"ContainerStarted","Data":"0254d6259023a959e1f163389eb4a2580a3f8bc33c2bc4bb81ff66bfc41e343d"} Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.806366 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="f2cd11e6d776229da9098efb4d94ce67906e2c52e2199ae80ec12db171f7eadf" exitCode=0 Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.806739 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"f2cd11e6d776229da9098efb4d94ce67906e2c52e2199ae80ec12db171f7eadf"} Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.806765 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"} Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.806964 4885 scope.go:117] "RemoveContainer" containerID="088f12c9fd9f4d010f8f310de67aba924cadeda302548d21523693996a55b080" Mar 08 21:09:33 crc kubenswrapper[4885]: I0308 21:09:33.910591 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-b6j88"] Mar 08 21:09:33 crc kubenswrapper[4885]: W0308 21:09:33.921549 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9fbe86b_d12b_4122_93b5_4cd373fca82b.slice/crio-9900ad53bc89302945bf0323cbd47d6d4a3850a40f20eab215864e96284d898e WatchSource:0}: Error finding container 9900ad53bc89302945bf0323cbd47d6d4a3850a40f20eab215864e96284d898e: Status 404 returned error can't find the container with id 9900ad53bc89302945bf0323cbd47d6d4a3850a40f20eab215864e96284d898e Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.210496 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-rzmvz"] Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.211957 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.214769 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.227885 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rzmvz"] Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.330692 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2db23198-8297-4e77-aed3-78ca89d5e6f8-config\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.330760 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2db23198-8297-4e77-aed3-78ca89d5e6f8-ovs-rundir\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.330803 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4bhl\" (UniqueName: \"kubernetes.io/projected/2db23198-8297-4e77-aed3-78ca89d5e6f8-kube-api-access-s4bhl\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.330880 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2db23198-8297-4e77-aed3-78ca89d5e6f8-ovn-rundir\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.432718 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2db23198-8297-4e77-aed3-78ca89d5e6f8-config\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.432781 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2db23198-8297-4e77-aed3-78ca89d5e6f8-ovs-rundir\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.432827 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4bhl\" (UniqueName: \"kubernetes.io/projected/2db23198-8297-4e77-aed3-78ca89d5e6f8-kube-api-access-s4bhl\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.432946 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2db23198-8297-4e77-aed3-78ca89d5e6f8-ovn-rundir\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " 
pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.433216 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2db23198-8297-4e77-aed3-78ca89d5e6f8-ovs-rundir\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.433218 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2db23198-8297-4e77-aed3-78ca89d5e6f8-ovn-rundir\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.433548 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2db23198-8297-4e77-aed3-78ca89d5e6f8-config\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.452672 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4bhl\" (UniqueName: \"kubernetes.io/projected/2db23198-8297-4e77-aed3-78ca89d5e6f8-kube-api-access-s4bhl\") pod \"ovn-controller-metrics-rzmvz\" (UID: \"2db23198-8297-4e77-aed3-78ca89d5e6f8\") " pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.543114 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-rzmvz" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.817222 4885 generic.go:334] "Generic (PLEG): container finished" podID="f9fbe86b-d12b-4122-93b5-4cd373fca82b" containerID="5ce551c19a49288b39bac620717716e0b01f78076da4b4df87c526bdf4dbf80b" exitCode=0 Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.817367 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-b6j88" event={"ID":"f9fbe86b-d12b-4122-93b5-4cd373fca82b","Type":"ContainerDied","Data":"5ce551c19a49288b39bac620717716e0b01f78076da4b4df87c526bdf4dbf80b"} Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.817612 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-b6j88" event={"ID":"f9fbe86b-d12b-4122-93b5-4cd373fca82b","Type":"ContainerStarted","Data":"9900ad53bc89302945bf0323cbd47d6d4a3850a40f20eab215864e96284d898e"} Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.822543 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jmft" event={"ID":"00348ab8-7686-4e8d-bada-3d9e32edca19","Type":"ContainerStarted","Data":"68e12085e6f3582286197eb951027095349dfadab80a051924dc07c9275dd729"} Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.822613 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-5jmft" Mar 08 21:09:34 crc kubenswrapper[4885]: I0308 21:09:34.863729 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-5jmft" podStartSLOduration=2.863711219 podStartE2EDuration="2.863711219s" podCreationTimestamp="2026-03-08 21:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:09:34.855285694 +0000 UTC 
m=+5876.251339717" watchObservedRunningTime="2026-03-08 21:09:34.863711219 +0000 UTC m=+5876.259765242" Mar 08 21:09:35 crc kubenswrapper[4885]: I0308 21:09:35.012176 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-rzmvz"] Mar 08 21:09:35 crc kubenswrapper[4885]: W0308 21:09:35.039240 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2db23198_8297_4e77_aed3_78ca89d5e6f8.slice/crio-d218ea3785829b24171b5edfa68f1015d00eff87aa9cc7f3a7ebf8769c4439da WatchSource:0}: Error finding container d218ea3785829b24171b5edfa68f1015d00eff87aa9cc7f3a7ebf8769c4439da: Status 404 returned error can't find the container with id d218ea3785829b24171b5edfa68f1015d00eff87aa9cc7f3a7ebf8769c4439da Mar 08 21:09:35 crc kubenswrapper[4885]: I0308 21:09:35.840803 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rzmvz" event={"ID":"2db23198-8297-4e77-aed3-78ca89d5e6f8","Type":"ContainerStarted","Data":"6930581ae0fe282030d39ffb7d4844bfa5f1e0254849a14c42e00f74481c136d"} Mar 08 21:09:35 crc kubenswrapper[4885]: I0308 21:09:35.841783 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-rzmvz" event={"ID":"2db23198-8297-4e77-aed3-78ca89d5e6f8","Type":"ContainerStarted","Data":"d218ea3785829b24171b5edfa68f1015d00eff87aa9cc7f3a7ebf8769c4439da"} Mar 08 21:09:35 crc kubenswrapper[4885]: I0308 21:09:35.844144 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-b6j88" event={"ID":"f9fbe86b-d12b-4122-93b5-4cd373fca82b","Type":"ContainerStarted","Data":"32de8c9ba548f56a7ef576caa8ec0e4d92ea98d64cd06b8e68fb4b8b001e1f76"} Mar 08 21:09:35 crc kubenswrapper[4885]: I0308 21:09:35.844179 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-b6j88" event={"ID":"f9fbe86b-d12b-4122-93b5-4cd373fca82b","Type":"ContainerStarted","Data":"c2e033f0ed41790128c8a1c9bef8266f92b56cf1c5071cf6bd8265b11e1129d6"} Mar 08 21:09:35 crc kubenswrapper[4885]: I0308 21:09:35.844475 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:09:35 crc kubenswrapper[4885]: I0308 21:09:35.865607 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-rzmvz" podStartSLOduration=1.865583922 podStartE2EDuration="1.865583922s" podCreationTimestamp="2026-03-08 21:09:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:09:35.858946085 +0000 UTC m=+5877.255000108" watchObservedRunningTime="2026-03-08 21:09:35.865583922 +0000 UTC m=+5877.261637965" Mar 08 21:09:35 crc kubenswrapper[4885]: I0308 21:09:35.905334 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-b6j88" podStartSLOduration=3.905308372 podStartE2EDuration="3.905308372s" podCreationTimestamp="2026-03-08 21:09:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:09:35.890912778 +0000 UTC m=+5877.286966801" watchObservedRunningTime="2026-03-08 21:09:35.905308372 +0000 UTC m=+5877.301362395" Mar 08 21:09:36 crc kubenswrapper[4885]: I0308 21:09:36.860416 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-b6j88" Mar 08 
21:09:38 crc kubenswrapper[4885]: I0308 21:09:38.772738 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:38 crc kubenswrapper[4885]: I0308 21:09:38.773068 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:39 crc kubenswrapper[4885]: I0308 21:09:39.835572 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-m7crp" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="registry-server" probeResult="failure" output=< Mar 08 21:09:39 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 21:09:39 crc kubenswrapper[4885]: > Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.670762 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dmxzc"] Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.674965 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.687595 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-catalog-content\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.687644 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjrg2\" (UniqueName: \"kubernetes.io/projected/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-kube-api-access-cjrg2\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.687794 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-utilities\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.690403 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dmxzc"] Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.790047 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-utilities\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.790157 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-catalog-content\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.790179 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjrg2\" (UniqueName: 
\"kubernetes.io/projected/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-kube-api-access-cjrg2\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.790644 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-utilities\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.790807 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-catalog-content\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:41 crc kubenswrapper[4885]: I0308 21:09:41.813463 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjrg2\" (UniqueName: \"kubernetes.io/projected/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-kube-api-access-cjrg2\") pod \"certified-operators-dmxzc\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.011905 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.222631 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-l5ssn"] Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.225523 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-l5ssn" Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.237061 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-l5ssn"] Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.304085 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt2tb\" (UniqueName: \"kubernetes.io/projected/93292c62-a3f4-439e-98fd-85ff17958f38-kube-api-access-nt2tb\") pod \"octavia-db-create-l5ssn\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") " pod="openstack/octavia-db-create-l5ssn" Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.304219 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93292c62-a3f4-439e-98fd-85ff17958f38-operator-scripts\") pod \"octavia-db-create-l5ssn\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") " pod="openstack/octavia-db-create-l5ssn" Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.396508 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dmxzc"] Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.408086 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93292c62-a3f4-439e-98fd-85ff17958f38-operator-scripts\") pod \"octavia-db-create-l5ssn\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") " pod="openstack/octavia-db-create-l5ssn" Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.408184 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt2tb\" (UniqueName: \"kubernetes.io/projected/93292c62-a3f4-439e-98fd-85ff17958f38-kube-api-access-nt2tb\") pod \"octavia-db-create-l5ssn\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") " pod="openstack/octavia-db-create-l5ssn" Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.408776 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93292c62-a3f4-439e-98fd-85ff17958f38-operator-scripts\") pod \"octavia-db-create-l5ssn\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") " pod="openstack/octavia-db-create-l5ssn" Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.450638 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt2tb\" (UniqueName: \"kubernetes.io/projected/93292c62-a3f4-439e-98fd-85ff17958f38-kube-api-access-nt2tb\") pod \"octavia-db-create-l5ssn\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") " pod="openstack/octavia-db-create-l5ssn" Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.577217 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-l5ssn" Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.931157 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerID="c4c2ca7970045efef5435938f6bb44bd5446c5ce53a852adfb403510ee1a79c2" exitCode=0 Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.931216 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmxzc" event={"ID":"ac39ed11-7986-48ee-adf8-aa9e4b653bb7","Type":"ContainerDied","Data":"c4c2ca7970045efef5435938f6bb44bd5446c5ce53a852adfb403510ee1a79c2"} Mar 08 21:09:42 crc kubenswrapper[4885]: I0308 21:09:42.931548 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmxzc" event={"ID":"ac39ed11-7986-48ee-adf8-aa9e4b653bb7","Type":"ContainerStarted","Data":"475981f5e367a38b12958173cb4464d8a6a0beff487ca2a30c7e10f1682ef6fa"} Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.124999 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-l5ssn"] Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.796557 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-208d-account-create-update-22tjb"] Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.798373 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-208d-account-create-update-22tjb" Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.803044 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.808854 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-208d-account-create-update-22tjb"] Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.944284 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-operator-scripts\") pod \"octavia-208d-account-create-update-22tjb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") " pod="openstack/octavia-208d-account-create-update-22tjb" Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.944403 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4l2x\" (UniqueName: \"kubernetes.io/projected/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-kube-api-access-p4l2x\") pod \"octavia-208d-account-create-update-22tjb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") " pod="openstack/octavia-208d-account-create-update-22tjb" Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.948183 4885 generic.go:334] "Generic (PLEG): container finished" podID="93292c62-a3f4-439e-98fd-85ff17958f38" containerID="4f877a81b26ff73a856887aee5ba6b11b65e1a2c9a19d4946db0e527c8b1dfc9" exitCode=0 Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.948246 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-l5ssn" event={"ID":"93292c62-a3f4-439e-98fd-85ff17958f38","Type":"ContainerDied","Data":"4f877a81b26ff73a856887aee5ba6b11b65e1a2c9a19d4946db0e527c8b1dfc9"} Mar 08 21:09:43 crc kubenswrapper[4885]: I0308 21:09:43.948269 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-l5ssn" 
event={"ID":"93292c62-a3f4-439e-98fd-85ff17958f38","Type":"ContainerStarted","Data":"176bc362a836ac09c93678d8e5b76cb9e4fa35bd9c97f372ab52656583346472"} Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.046623 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4l2x\" (UniqueName: \"kubernetes.io/projected/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-kube-api-access-p4l2x\") pod \"octavia-208d-account-create-update-22tjb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") " pod="openstack/octavia-208d-account-create-update-22tjb" Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.046803 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-operator-scripts\") pod \"octavia-208d-account-create-update-22tjb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") " pod="openstack/octavia-208d-account-create-update-22tjb" Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.047553 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-operator-scripts\") pod \"octavia-208d-account-create-update-22tjb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") " pod="openstack/octavia-208d-account-create-update-22tjb" Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.078433 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4l2x\" (UniqueName: \"kubernetes.io/projected/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-kube-api-access-p4l2x\") pod \"octavia-208d-account-create-update-22tjb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") " pod="openstack/octavia-208d-account-create-update-22tjb" Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.176076 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-208d-account-create-update-22tjb" Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.672496 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-208d-account-create-update-22tjb"] Mar 08 21:09:44 crc kubenswrapper[4885]: W0308 21:09:44.683006 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef43c9be_bb15_4927_b520_fe1b5ea3cabb.slice/crio-5f8b30563255dd7cbc3c53076d105a1f3adfe9407c0a9864fcf52c478ab38e92 WatchSource:0}: Error finding container 5f8b30563255dd7cbc3c53076d105a1f3adfe9407c0a9864fcf52c478ab38e92: Status 404 returned error can't find the container with id 5f8b30563255dd7cbc3c53076d105a1f3adfe9407c0a9864fcf52c478ab38e92 Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.969207 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerID="85ace4bbc263d67af4ff24cc59994a076cc980df82df1e2ae92a9834af20ce31" exitCode=0 Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.969314 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmxzc" event={"ID":"ac39ed11-7986-48ee-adf8-aa9e4b653bb7","Type":"ContainerDied","Data":"85ace4bbc263d67af4ff24cc59994a076cc980df82df1e2ae92a9834af20ce31"} Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.976063 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-208d-account-create-update-22tjb" event={"ID":"ef43c9be-bb15-4927-b520-fe1b5ea3cabb","Type":"ContainerStarted","Data":"52641a15b7d3eed6bc15113db82cacc9c6bd5304460efbff7b7427e47ea8d579"} Mar 08 21:09:44 crc kubenswrapper[4885]: I0308 21:09:44.976162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-208d-account-create-update-22tjb" event={"ID":"ef43c9be-bb15-4927-b520-fe1b5ea3cabb","Type":"ContainerStarted","Data":"5f8b30563255dd7cbc3c53076d105a1f3adfe9407c0a9864fcf52c478ab38e92"} Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.018736 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-208d-account-create-update-22tjb" podStartSLOduration=2.018714176 podStartE2EDuration="2.018714176s" podCreationTimestamp="2026-03-08 21:09:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:09:45.017214116 +0000 UTC m=+5886.413268159" watchObservedRunningTime="2026-03-08 21:09:45.018714176 +0000 UTC m=+5886.414768209" Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.394246 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-l5ssn" Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.580198 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93292c62-a3f4-439e-98fd-85ff17958f38-operator-scripts\") pod \"93292c62-a3f4-439e-98fd-85ff17958f38\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") " Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.580513 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nt2tb\" (UniqueName: \"kubernetes.io/projected/93292c62-a3f4-439e-98fd-85ff17958f38-kube-api-access-nt2tb\") pod \"93292c62-a3f4-439e-98fd-85ff17958f38\" (UID: \"93292c62-a3f4-439e-98fd-85ff17958f38\") " Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.581300 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93292c62-a3f4-439e-98fd-85ff17958f38-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "93292c62-a3f4-439e-98fd-85ff17958f38" (UID: "93292c62-a3f4-439e-98fd-85ff17958f38"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.592112 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93292c62-a3f4-439e-98fd-85ff17958f38-kube-api-access-nt2tb" (OuterVolumeSpecName: "kube-api-access-nt2tb") pod "93292c62-a3f4-439e-98fd-85ff17958f38" (UID: "93292c62-a3f4-439e-98fd-85ff17958f38"). InnerVolumeSpecName "kube-api-access-nt2tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.682684 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/93292c62-a3f4-439e-98fd-85ff17958f38-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:45 crc kubenswrapper[4885]: I0308 21:09:45.682722 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nt2tb\" (UniqueName: \"kubernetes.io/projected/93292c62-a3f4-439e-98fd-85ff17958f38-kube-api-access-nt2tb\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:46 crc kubenswrapper[4885]: I0308 21:09:46.015371 4885 generic.go:334] "Generic (PLEG): container finished" podID="ef43c9be-bb15-4927-b520-fe1b5ea3cabb" containerID="52641a15b7d3eed6bc15113db82cacc9c6bd5304460efbff7b7427e47ea8d579" exitCode=0 Mar 08 21:09:46 crc kubenswrapper[4885]: I0308 21:09:46.015510 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-208d-account-create-update-22tjb" event={"ID":"ef43c9be-bb15-4927-b520-fe1b5ea3cabb","Type":"ContainerDied","Data":"52641a15b7d3eed6bc15113db82cacc9c6bd5304460efbff7b7427e47ea8d579"} Mar 08 21:09:46 crc kubenswrapper[4885]: I0308 21:09:46.021526 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmxzc" event={"ID":"ac39ed11-7986-48ee-adf8-aa9e4b653bb7","Type":"ContainerStarted","Data":"da2f1a91c9bb51280241448d964496931abf663a64970bb68efa8e74c760d038"} Mar 08 21:09:46 crc kubenswrapper[4885]: I0308 21:09:46.025479 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-l5ssn" Mar 08 21:09:46 crc kubenswrapper[4885]: I0308 21:09:46.026488 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-l5ssn" event={"ID":"93292c62-a3f4-439e-98fd-85ff17958f38","Type":"ContainerDied","Data":"176bc362a836ac09c93678d8e5b76cb9e4fa35bd9c97f372ab52656583346472"} Mar 08 21:09:46 crc kubenswrapper[4885]: I0308 21:09:46.026570 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="176bc362a836ac09c93678d8e5b76cb9e4fa35bd9c97f372ab52656583346472" Mar 08 21:09:46 crc kubenswrapper[4885]: I0308 21:09:46.054944 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dmxzc" podStartSLOduration=2.596190973 podStartE2EDuration="5.054903474s" podCreationTimestamp="2026-03-08 21:09:41 +0000 UTC" firstStartedPulling="2026-03-08 21:09:42.937113293 +0000 UTC m=+5884.333167316" lastFinishedPulling="2026-03-08 21:09:45.395825754 +0000 UTC m=+5886.791879817" observedRunningTime="2026-03-08 21:09:46.052908551 +0000 UTC m=+5887.448962574" watchObservedRunningTime="2026-03-08 21:09:46.054903474 +0000 UTC m=+5887.450957507" Mar 08 21:09:47 crc kubenswrapper[4885]: I0308 21:09:47.411987 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-208d-account-create-update-22tjb" Mar 08 21:09:47 crc kubenswrapper[4885]: I0308 21:09:47.417314 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-operator-scripts\") pod \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") " Mar 08 21:09:47 crc kubenswrapper[4885]: I0308 21:09:47.417374 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4l2x\" (UniqueName: \"kubernetes.io/projected/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-kube-api-access-p4l2x\") pod \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\" (UID: \"ef43c9be-bb15-4927-b520-fe1b5ea3cabb\") " Mar 08 21:09:47 crc kubenswrapper[4885]: I0308 21:09:47.418275 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ef43c9be-bb15-4927-b520-fe1b5ea3cabb" (UID: "ef43c9be-bb15-4927-b520-fe1b5ea3cabb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:09:47 crc kubenswrapper[4885]: I0308 21:09:47.422034 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-kube-api-access-p4l2x" (OuterVolumeSpecName: "kube-api-access-p4l2x") pod "ef43c9be-bb15-4927-b520-fe1b5ea3cabb" (UID: "ef43c9be-bb15-4927-b520-fe1b5ea3cabb"). InnerVolumeSpecName "kube-api-access-p4l2x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:09:47 crc kubenswrapper[4885]: I0308 21:09:47.519944 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:47 crc kubenswrapper[4885]: I0308 21:09:47.519975 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4l2x\" (UniqueName: \"kubernetes.io/projected/ef43c9be-bb15-4927-b520-fe1b5ea3cabb-kube-api-access-p4l2x\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:48 crc kubenswrapper[4885]: I0308 21:09:48.052225 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-208d-account-create-update-22tjb" event={"ID":"ef43c9be-bb15-4927-b520-fe1b5ea3cabb","Type":"ContainerDied","Data":"5f8b30563255dd7cbc3c53076d105a1f3adfe9407c0a9864fcf52c478ab38e92"} Mar 08 21:09:48 crc kubenswrapper[4885]: I0308 21:09:48.052293 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f8b30563255dd7cbc3c53076d105a1f3adfe9407c0a9864fcf52c478ab38e92" Mar 08 21:09:48 crc kubenswrapper[4885]: I0308 21:09:48.052366 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-208d-account-create-update-22tjb" Mar 08 21:09:48 crc kubenswrapper[4885]: I0308 21:09:48.068654 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-bqsgq"] Mar 08 21:09:48 crc kubenswrapper[4885]: I0308 21:09:48.078518 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-bqsgq"] Mar 08 21:09:48 crc kubenswrapper[4885]: I0308 21:09:48.849806 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:48 crc kubenswrapper[4885]: I0308 21:09:48.934166 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.108698 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7crp"] Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.398023 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bd0921d-5173-43dd-ac53-0ec3417dce77" path="/var/lib/kubelet/pods/8bd0921d-5173-43dd-ac53-0ec3417dce77/volumes" Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.513126 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-kg2st"] Mar 08 21:09:49 crc kubenswrapper[4885]: E0308 21:09:49.513766 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93292c62-a3f4-439e-98fd-85ff17958f38" containerName="mariadb-database-create" Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.513796 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="93292c62-a3f4-439e-98fd-85ff17958f38" containerName="mariadb-database-create" Mar 08 21:09:49 crc kubenswrapper[4885]: E0308 21:09:49.513839 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef43c9be-bb15-4927-b520-fe1b5ea3cabb" containerName="mariadb-account-create-update" Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.513852 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef43c9be-bb15-4927-b520-fe1b5ea3cabb" containerName="mariadb-account-create-update" Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.514219 4885 
memory_manager.go:354] "RemoveStaleState removing state" podUID="93292c62-a3f4-439e-98fd-85ff17958f38" containerName="mariadb-database-create" Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.514262 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef43c9be-bb15-4927-b520-fe1b5ea3cabb" containerName="mariadb-account-create-update" Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.515034 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-kg2st" Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.524229 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-kg2st"] Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.568443 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4d6415b-d535-426e-a500-cd8e25255bde-operator-scripts\") pod \"octavia-persistence-db-create-kg2st\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " pod="openstack/octavia-persistence-db-create-kg2st" Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.568610 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c46l5\" (UniqueName: \"kubernetes.io/projected/c4d6415b-d535-426e-a500-cd8e25255bde-kube-api-access-c46l5\") pod \"octavia-persistence-db-create-kg2st\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " pod="openstack/octavia-persistence-db-create-kg2st" Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.670367 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c46l5\" (UniqueName: \"kubernetes.io/projected/c4d6415b-d535-426e-a500-cd8e25255bde-kube-api-access-c46l5\") pod \"octavia-persistence-db-create-kg2st\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " pod="openstack/octavia-persistence-db-create-kg2st" Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.670865 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4d6415b-d535-426e-a500-cd8e25255bde-operator-scripts\") pod \"octavia-persistence-db-create-kg2st\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " pod="openstack/octavia-persistence-db-create-kg2st" Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.672034 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4d6415b-d535-426e-a500-cd8e25255bde-operator-scripts\") pod \"octavia-persistence-db-create-kg2st\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " pod="openstack/octavia-persistence-db-create-kg2st" Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.700727 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c46l5\" (UniqueName: \"kubernetes.io/projected/c4d6415b-d535-426e-a500-cd8e25255bde-kube-api-access-c46l5\") pod \"octavia-persistence-db-create-kg2st\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " pod="openstack/octavia-persistence-db-create-kg2st" Mar 08 21:09:49 crc kubenswrapper[4885]: I0308 21:09:49.840720 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-kg2st" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.070995 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-70a3-account-create-update-kvgcv"] Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.072217 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-70a3-account-create-update-kvgcv" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.074501 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.075805 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m7crp" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="registry-server" containerID="cri-o://632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf" gracePeriod=2 Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.084436 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-70a3-account-create-update-kvgcv"] Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.179461 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9ftx\" (UniqueName: \"kubernetes.io/projected/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-kube-api-access-h9ftx\") pod \"octavia-70a3-account-create-update-kvgcv\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " pod="openstack/octavia-70a3-account-create-update-kvgcv" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.179639 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-operator-scripts\") pod \"octavia-70a3-account-create-update-kvgcv\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " pod="openstack/octavia-70a3-account-create-update-kvgcv" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.281792 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-operator-scripts\") pod \"octavia-70a3-account-create-update-kvgcv\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " pod="openstack/octavia-70a3-account-create-update-kvgcv" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.281860 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9ftx\" (UniqueName: \"kubernetes.io/projected/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-kube-api-access-h9ftx\") pod \"octavia-70a3-account-create-update-kvgcv\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " pod="openstack/octavia-70a3-account-create-update-kvgcv" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.283395 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-operator-scripts\") pod \"octavia-70a3-account-create-update-kvgcv\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " pod="openstack/octavia-70a3-account-create-update-kvgcv" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.299766 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9ftx\" (UniqueName: 
\"kubernetes.io/projected/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-kube-api-access-h9ftx\") pod \"octavia-70a3-account-create-update-kvgcv\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " pod="openstack/octavia-70a3-account-create-update-kvgcv" Mar 08 21:09:50 crc kubenswrapper[4885]: W0308 21:09:50.334162 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4d6415b_d535_426e_a500_cd8e25255bde.slice/crio-0459c0879a8816cfd6bd141776109255047e8e53100e246fb60e76cb40fb78a2 WatchSource:0}: Error finding container 0459c0879a8816cfd6bd141776109255047e8e53100e246fb60e76cb40fb78a2: Status 404 returned error can't find the container with id 0459c0879a8816cfd6bd141776109255047e8e53100e246fb60e76cb40fb78a2 Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.336544 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-kg2st"] Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.417017 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-70a3-account-create-update-kvgcv" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.503967 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.588501 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xncnq\" (UniqueName: \"kubernetes.io/projected/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-kube-api-access-xncnq\") pod \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.588867 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-utilities\") pod \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.588913 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-catalog-content\") pod \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\" (UID: \"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23\") " Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.589819 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-utilities" (OuterVolumeSpecName: "utilities") pod "bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" (UID: "bfcf343f-f5c8-46e9-a3be-95ddaa56bf23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.598438 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-kube-api-access-xncnq" (OuterVolumeSpecName: "kube-api-access-xncnq") pod "bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" (UID: "bfcf343f-f5c8-46e9-a3be-95ddaa56bf23"). InnerVolumeSpecName "kube-api-access-xncnq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.618123 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" (UID: "bfcf343f-f5c8-46e9-a3be-95ddaa56bf23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.691239 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xncnq\" (UniqueName: \"kubernetes.io/projected/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-kube-api-access-xncnq\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.691282 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.691292 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:50 crc kubenswrapper[4885]: I0308 21:09:50.890114 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-70a3-account-create-update-kvgcv"] Mar 08 21:09:50 crc kubenswrapper[4885]: W0308 21:09:50.902969 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddfd3421_88b0_49f2_b94e_fe31c3b5c12f.slice/crio-b9038abbfd9fa45044d554a369aed3835055887a0d11fad9aefb8310028c8da2 WatchSource:0}: Error finding container b9038abbfd9fa45044d554a369aed3835055887a0d11fad9aefb8310028c8da2: Status 404 returned error can't find the container with id b9038abbfd9fa45044d554a369aed3835055887a0d11fad9aefb8310028c8da2 Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.093620 4885 generic.go:334] "Generic (PLEG): container finished" podID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerID="632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf" exitCode=0 Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.093706 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7crp" event={"ID":"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23","Type":"ContainerDied","Data":"632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf"} Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.095085 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m7crp" event={"ID":"bfcf343f-f5c8-46e9-a3be-95ddaa56bf23","Type":"ContainerDied","Data":"84590b4da479b1b889ba9e8ed38b36c22bd738109e3b3292bd5c830ce0a9b1f6"} Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.093728 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m7crp" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.095114 4885 scope.go:117] "RemoveContainer" containerID="632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.099020 4885 generic.go:334] "Generic (PLEG): container finished" podID="c4d6415b-d535-426e-a500-cd8e25255bde" containerID="10f01dfd93c84f82b0e33850f2cd43179983bc59ab2fd73179f62505bcc743de" exitCode=0 Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.099167 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-kg2st" event={"ID":"c4d6415b-d535-426e-a500-cd8e25255bde","Type":"ContainerDied","Data":"10f01dfd93c84f82b0e33850f2cd43179983bc59ab2fd73179f62505bcc743de"} Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.099191 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-kg2st" event={"ID":"c4d6415b-d535-426e-a500-cd8e25255bde","Type":"ContainerStarted","Data":"0459c0879a8816cfd6bd141776109255047e8e53100e246fb60e76cb40fb78a2"} Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.101296 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-70a3-account-create-update-kvgcv" event={"ID":"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f","Type":"ContainerStarted","Data":"b9038abbfd9fa45044d554a369aed3835055887a0d11fad9aefb8310028c8da2"} Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.126540 4885 scope.go:117] "RemoveContainer" containerID="37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.142034 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-70a3-account-create-update-kvgcv" podStartSLOduration=1.142010954 podStartE2EDuration="1.142010954s" podCreationTimestamp="2026-03-08 21:09:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:09:51.135535681 +0000 UTC m=+5892.531589724" watchObservedRunningTime="2026-03-08 21:09:51.142010954 +0000 UTC m=+5892.538064987" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.162448 4885 scope.go:117] "RemoveContainer" containerID="2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.172190 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7crp"] Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.184093 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m7crp"] Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.210529 4885 scope.go:117] "RemoveContainer" containerID="632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf" Mar 08 21:09:51 crc kubenswrapper[4885]: E0308 21:09:51.211339 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf\": container with ID starting with 632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf not found: ID does not exist" containerID="632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.211390 4885 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf"} err="failed to get container status \"632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf\": rpc error: code = NotFound desc = could not find container \"632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf\": container with ID starting with 632cd1a82919b9a3346be79937bd250722d1a3990135b361dfdc2adf1004d1bf not found: ID does not exist" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.211424 4885 scope.go:117] "RemoveContainer" containerID="37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b" Mar 08 21:09:51 crc kubenswrapper[4885]: E0308 21:09:51.211945 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b\": container with ID starting with 37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b not found: ID does not exist" containerID="37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.211996 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b"} err="failed to get container status \"37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b\": rpc error: code = NotFound desc = could not find container \"37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b\": container with ID starting with 37ddc53f132d1062cf80648be255333993946324bdf04001326cd0fa641dec9b not found: ID does not exist" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.212057 4885 scope.go:117] "RemoveContainer" containerID="2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1" Mar 08 21:09:51 crc kubenswrapper[4885]: E0308 21:09:51.212491 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1\": container with ID starting with 2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1 not found: ID does not exist" containerID="2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.212531 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1"} err="failed to get container status \"2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1\": rpc error: code = NotFound desc = could not find container \"2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1\": container with ID starting with 2ea65e969b7073d98a06fdbf71a526394a13f8f941e8dce3983252283c6f2dc1 not found: ID does not exist" Mar 08 21:09:51 crc kubenswrapper[4885]: I0308 21:09:51.387209 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" path="/var/lib/kubelet/pods/bfcf343f-f5c8-46e9-a3be-95ddaa56bf23/volumes" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.012081 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.012145 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.095824 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.121518 4885 generic.go:334] "Generic (PLEG): container finished" podID="ddfd3421-88b0-49f2-b94e-fe31c3b5c12f" containerID="787b1783aeddf609e5e191b59369719fad9abea1b0e98367db13c4196466f2fe" exitCode=0 Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.121722 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-70a3-account-create-update-kvgcv" event={"ID":"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f","Type":"ContainerDied","Data":"787b1783aeddf609e5e191b59369719fad9abea1b0e98367db13c4196466f2fe"} Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.211600 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.538527 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-kg2st" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.633092 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c46l5\" (UniqueName: \"kubernetes.io/projected/c4d6415b-d535-426e-a500-cd8e25255bde-kube-api-access-c46l5\") pod \"c4d6415b-d535-426e-a500-cd8e25255bde\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.633219 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4d6415b-d535-426e-a500-cd8e25255bde-operator-scripts\") pod \"c4d6415b-d535-426e-a500-cd8e25255bde\" (UID: \"c4d6415b-d535-426e-a500-cd8e25255bde\") " Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.635116 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4d6415b-d535-426e-a500-cd8e25255bde-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4d6415b-d535-426e-a500-cd8e25255bde" (UID: "c4d6415b-d535-426e-a500-cd8e25255bde"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.639889 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4d6415b-d535-426e-a500-cd8e25255bde-kube-api-access-c46l5" (OuterVolumeSpecName: "kube-api-access-c46l5") pod "c4d6415b-d535-426e-a500-cd8e25255bde" (UID: "c4d6415b-d535-426e-a500-cd8e25255bde"). InnerVolumeSpecName "kube-api-access-c46l5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.735894 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c46l5\" (UniqueName: \"kubernetes.io/projected/c4d6415b-d535-426e-a500-cd8e25255bde-kube-api-access-c46l5\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:52 crc kubenswrapper[4885]: I0308 21:09:52.735986 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4d6415b-d535-426e-a500-cd8e25255bde-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.140047 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-kg2st" event={"ID":"c4d6415b-d535-426e-a500-cd8e25255bde","Type":"ContainerDied","Data":"0459c0879a8816cfd6bd141776109255047e8e53100e246fb60e76cb40fb78a2"} Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.140401 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0459c0879a8816cfd6bd141776109255047e8e53100e246fb60e76cb40fb78a2" Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.140332 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-kg2st" Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.512850 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dmxzc"] Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.570266 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-70a3-account-create-update-kvgcv" Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.651762 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9ftx\" (UniqueName: \"kubernetes.io/projected/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-kube-api-access-h9ftx\") pod \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.651821 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-operator-scripts\") pod \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\" (UID: \"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f\") " Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.652462 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ddfd3421-88b0-49f2-b94e-fe31c3b5c12f" (UID: "ddfd3421-88b0-49f2-b94e-fe31c3b5c12f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.661122 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-kube-api-access-h9ftx" (OuterVolumeSpecName: "kube-api-access-h9ftx") pod "ddfd3421-88b0-49f2-b94e-fe31c3b5c12f" (UID: "ddfd3421-88b0-49f2-b94e-fe31c3b5c12f"). InnerVolumeSpecName "kube-api-access-h9ftx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.753437 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9ftx\" (UniqueName: \"kubernetes.io/projected/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-kube-api-access-h9ftx\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:53 crc kubenswrapper[4885]: I0308 21:09:53.753485 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:54 crc kubenswrapper[4885]: I0308 21:09:54.156164 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-70a3-account-create-update-kvgcv" Mar 08 21:09:54 crc kubenswrapper[4885]: I0308 21:09:54.156187 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-70a3-account-create-update-kvgcv" event={"ID":"ddfd3421-88b0-49f2-b94e-fe31c3b5c12f","Type":"ContainerDied","Data":"b9038abbfd9fa45044d554a369aed3835055887a0d11fad9aefb8310028c8da2"} Mar 08 21:09:54 crc kubenswrapper[4885]: I0308 21:09:54.156266 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9038abbfd9fa45044d554a369aed3835055887a0d11fad9aefb8310028c8da2" Mar 08 21:09:54 crc kubenswrapper[4885]: I0308 21:09:54.156645 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dmxzc" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="registry-server" containerID="cri-o://da2f1a91c9bb51280241448d964496931abf663a64970bb68efa8e74c760d038" gracePeriod=2 Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.174550 4885 generic.go:334] "Generic (PLEG): container finished" podID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerID="da2f1a91c9bb51280241448d964496931abf663a64970bb68efa8e74c760d038" exitCode=0 Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.174770 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmxzc" event={"ID":"ac39ed11-7986-48ee-adf8-aa9e4b653bb7","Type":"ContainerDied","Data":"da2f1a91c9bb51280241448d964496931abf663a64970bb68efa8e74c760d038"} Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.175658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dmxzc" event={"ID":"ac39ed11-7986-48ee-adf8-aa9e4b653bb7","Type":"ContainerDied","Data":"475981f5e367a38b12958173cb4464d8a6a0beff487ca2a30c7e10f1682ef6fa"} Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.175740 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="475981f5e367a38b12958173cb4464d8a6a0beff487ca2a30c7e10f1682ef6fa" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.215533 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.284881 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-utilities\") pod \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.285487 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-catalog-content\") pod \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.285825 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjrg2\" (UniqueName: \"kubernetes.io/projected/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-kube-api-access-cjrg2\") pod \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\" (UID: \"ac39ed11-7986-48ee-adf8-aa9e4b653bb7\") " Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.286826 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-utilities" (OuterVolumeSpecName: "utilities") pod "ac39ed11-7986-48ee-adf8-aa9e4b653bb7" (UID: "ac39ed11-7986-48ee-adf8-aa9e4b653bb7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.303183 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-kube-api-access-cjrg2" (OuterVolumeSpecName: "kube-api-access-cjrg2") pod "ac39ed11-7986-48ee-adf8-aa9e4b653bb7" (UID: "ac39ed11-7986-48ee-adf8-aa9e4b653bb7"). InnerVolumeSpecName "kube-api-access-cjrg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.388345 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac39ed11-7986-48ee-adf8-aa9e4b653bb7" (UID: "ac39ed11-7986-48ee-adf8-aa9e4b653bb7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.388759 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.388797 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjrg2\" (UniqueName: \"kubernetes.io/projected/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-kube-api-access-cjrg2\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.388810 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac39ed11-7986-48ee-adf8-aa9e4b653bb7-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492247 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-754cf98f97-rw6hg"] Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492697 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="registry-server" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492719 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="registry-server" Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492735 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddfd3421-88b0-49f2-b94e-fe31c3b5c12f" containerName="mariadb-account-create-update" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492745 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddfd3421-88b0-49f2-b94e-fe31c3b5c12f" containerName="mariadb-account-create-update" Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492766 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="registry-server" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492774 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="registry-server" Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492783 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d6415b-d535-426e-a500-cd8e25255bde" containerName="mariadb-database-create" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492792 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d6415b-d535-426e-a500-cd8e25255bde" containerName="mariadb-database-create" Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492805 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="extract-utilities" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492815 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="extract-utilities" Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492832 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="extract-content" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492839 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="extract-content" Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492855 4885 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="extract-utilities" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492864 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="extract-utilities" Mar 08 21:09:55 crc kubenswrapper[4885]: E0308 21:09:55.492885 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="extract-content" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.492893 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="extract-content" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.493465 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d6415b-d535-426e-a500-cd8e25255bde" containerName="mariadb-database-create" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.493488 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddfd3421-88b0-49f2-b94e-fe31c3b5c12f" containerName="mariadb-account-create-update" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.493511 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfcf343f-f5c8-46e9-a3be-95ddaa56bf23" containerName="registry-server" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.493522 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" containerName="registry-server" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.495125 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.499436 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.499650 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.500452 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-xz7gf" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.519222 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-754cf98f97-rw6hg"] Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.592169 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-scripts\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.592234 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-config-data\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.592261 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/4fef7207-0a04-4fb4-af9e-d9efcd13226f-octavia-run\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " 
pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.592279 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-combined-ca-bundle\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.592315 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4fef7207-0a04-4fb4-af9e-d9efcd13226f-config-data-merged\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.693985 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-scripts\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.694057 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-config-data\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.694086 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/4fef7207-0a04-4fb4-af9e-d9efcd13226f-octavia-run\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.694102 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-combined-ca-bundle\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.694137 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4fef7207-0a04-4fb4-af9e-d9efcd13226f-config-data-merged\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.694599 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4fef7207-0a04-4fb4-af9e-d9efcd13226f-config-data-merged\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.695362 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/4fef7207-0a04-4fb4-af9e-d9efcd13226f-octavia-run\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " 
pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.699219 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-scripts\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.699241 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-combined-ca-bundle\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.699595 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fef7207-0a04-4fb4-af9e-d9efcd13226f-config-data\") pod \"octavia-api-754cf98f97-rw6hg\" (UID: \"4fef7207-0a04-4fb4-af9e-d9efcd13226f\") " pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:55 crc kubenswrapper[4885]: I0308 21:09:55.819238 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:09:56 crc kubenswrapper[4885]: I0308 21:09:56.183295 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dmxzc" Mar 08 21:09:56 crc kubenswrapper[4885]: I0308 21:09:56.233969 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dmxzc"] Mar 08 21:09:56 crc kubenswrapper[4885]: I0308 21:09:56.260759 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dmxzc"] Mar 08 21:09:56 crc kubenswrapper[4885]: E0308 21:09:56.309471 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac39ed11_7986_48ee_adf8_aa9e4b653bb7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac39ed11_7986_48ee_adf8_aa9e4b653bb7.slice/crio-475981f5e367a38b12958173cb4464d8a6a0beff487ca2a30c7e10f1682ef6fa\": RecentStats: unable to find data in memory cache]" Mar 08 21:09:56 crc kubenswrapper[4885]: I0308 21:09:56.456575 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-754cf98f97-rw6hg"] Mar 08 21:09:56 crc kubenswrapper[4885]: W0308 21:09:56.467900 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4fef7207_0a04_4fb4_af9e_d9efcd13226f.slice/crio-97cf67887692bf125409a55f9c01db4e747f1c1495e7111c1df09e9027829ed5 WatchSource:0}: Error finding container 97cf67887692bf125409a55f9c01db4e747f1c1495e7111c1df09e9027829ed5: Status 404 returned error can't find the container with id 97cf67887692bf125409a55f9c01db4e747f1c1495e7111c1df09e9027829ed5 Mar 08 21:09:57 crc kubenswrapper[4885]: I0308 21:09:57.192409 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-754cf98f97-rw6hg" event={"ID":"4fef7207-0a04-4fb4-af9e-d9efcd13226f","Type":"ContainerStarted","Data":"97cf67887692bf125409a55f9c01db4e747f1c1495e7111c1df09e9027829ed5"} Mar 08 21:09:57 crc kubenswrapper[4885]: I0308 21:09:57.377664 4885 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac39ed11-7986-48ee-adf8-aa9e4b653bb7" path="/var/lib/kubelet/pods/ac39ed11-7986-48ee-adf8-aa9e4b653bb7/volumes" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.144952 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550070-gjrwt"] Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.146313 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550070-gjrwt" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.148557 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.148703 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.149758 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.163818 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550070-gjrwt"] Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.302940 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s94lp\" (UniqueName: \"kubernetes.io/projected/4497ae4b-d188-4afa-9546-11fbe209a9a7-kube-api-access-s94lp\") pod \"auto-csr-approver-29550070-gjrwt\" (UID: \"4497ae4b-d188-4afa-9546-11fbe209a9a7\") " pod="openshift-infra/auto-csr-approver-29550070-gjrwt" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.404529 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s94lp\" (UniqueName: \"kubernetes.io/projected/4497ae4b-d188-4afa-9546-11fbe209a9a7-kube-api-access-s94lp\") pod \"auto-csr-approver-29550070-gjrwt\" (UID: \"4497ae4b-d188-4afa-9546-11fbe209a9a7\") " pod="openshift-infra/auto-csr-approver-29550070-gjrwt" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.444685 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s94lp\" (UniqueName: \"kubernetes.io/projected/4497ae4b-d188-4afa-9546-11fbe209a9a7-kube-api-access-s94lp\") pod \"auto-csr-approver-29550070-gjrwt\" (UID: \"4497ae4b-d188-4afa-9546-11fbe209a9a7\") " pod="openshift-infra/auto-csr-approver-29550070-gjrwt" Mar 08 21:10:00 crc kubenswrapper[4885]: I0308 21:10:00.476693 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550070-gjrwt" Mar 08 21:10:06 crc kubenswrapper[4885]: W0308 21:10:06.133739 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4497ae4b_d188_4afa_9546_11fbe209a9a7.slice/crio-25dbb246034629d7ee3627972c12042fdd0fd5f9f8ec1f9db01e012fed97a898 WatchSource:0}: Error finding container 25dbb246034629d7ee3627972c12042fdd0fd5f9f8ec1f9db01e012fed97a898: Status 404 returned error can't find the container with id 25dbb246034629d7ee3627972c12042fdd0fd5f9f8ec1f9db01e012fed97a898 Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.135763 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550070-gjrwt"] Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.201986 4885 scope.go:117] "RemoveContainer" containerID="b2ba1b445c0bfbdc509da995c43b1467221966fc77b2d2c35df9edb0c74ad904" Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.236990 4885 scope.go:117] "RemoveContainer" containerID="7dd39daad27eae124834c42bda6676c4988f5e52cc85f87170d8845fcdc1c6e4" Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.280129 4885 scope.go:117] "RemoveContainer" containerID="3210dc1871dd8ae46bd14950976c866628de438227db3aa55b84daa5b1afb3d6" Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.300721 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-754cf98f97-rw6hg" event={"ID":"4fef7207-0a04-4fb4-af9e-d9efcd13226f","Type":"ContainerStarted","Data":"0193a69efe647b7c037ecf1f168e6419f8bff47ccc9aa4f9943fd876b33a7028"} Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.303465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550070-gjrwt" event={"ID":"4497ae4b-d188-4afa-9546-11fbe209a9a7","Type":"ContainerStarted","Data":"25dbb246034629d7ee3627972c12042fdd0fd5f9f8ec1f9db01e012fed97a898"} Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.335453 4885 scope.go:117] "RemoveContainer" containerID="1ebba46867f58104b9cbcc29fb91d1649cf147537f67ba19ec589a45bcb62ce8" Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.356865 4885 scope.go:117] "RemoveContainer" containerID="2a92dad5038281e9a909493af60718e80efbed4b0a30aad4e3ed0e4f55630488" Mar 08 21:10:06 crc kubenswrapper[4885]: I0308 21:10:06.379166 4885 scope.go:117] "RemoveContainer" containerID="de07ff485f289c819ea06e6137cf2c359f8f6dec75a1a6a503e4f8a88ac8bac6" Mar 08 21:10:07 crc kubenswrapper[4885]: I0308 21:10:07.327783 4885 generic.go:334] "Generic (PLEG): container finished" podID="4fef7207-0a04-4fb4-af9e-d9efcd13226f" containerID="0193a69efe647b7c037ecf1f168e6419f8bff47ccc9aa4f9943fd876b33a7028" exitCode=0 Mar 08 21:10:07 crc kubenswrapper[4885]: I0308 21:10:07.327884 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-754cf98f97-rw6hg" event={"ID":"4fef7207-0a04-4fb4-af9e-d9efcd13226f","Type":"ContainerDied","Data":"0193a69efe647b7c037ecf1f168e6419f8bff47ccc9aa4f9943fd876b33a7028"} Mar 08 21:10:07 crc kubenswrapper[4885]: I0308 21:10:07.328339 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:10:07 crc kubenswrapper[4885]: I0308 21:10:07.328352 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:10:07 crc kubenswrapper[4885]: I0308 21:10:07.328361 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/octavia-api-754cf98f97-rw6hg" event={"ID":"4fef7207-0a04-4fb4-af9e-d9efcd13226f","Type":"ContainerStarted","Data":"6ae10a00f19c61478c4a9ec9eb56415ace500c6485074ba5682ba6415e59ce20"} Mar 08 21:10:07 crc kubenswrapper[4885]: I0308 21:10:07.328375 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-754cf98f97-rw6hg" event={"ID":"4fef7207-0a04-4fb4-af9e-d9efcd13226f","Type":"ContainerStarted","Data":"6079820a59a3b2673729c7a6d46f50629c38688217d9ad16fcbf0cc730077bab"} Mar 08 21:10:07 crc kubenswrapper[4885]: I0308 21:10:07.347227 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-754cf98f97-rw6hg" podStartSLOduration=2.94644817 podStartE2EDuration="12.347207898s" podCreationTimestamp="2026-03-08 21:09:55 +0000 UTC" firstStartedPulling="2026-03-08 21:09:56.469926757 +0000 UTC m=+5897.865980780" lastFinishedPulling="2026-03-08 21:10:05.870686475 +0000 UTC m=+5907.266740508" observedRunningTime="2026-03-08 21:10:07.34540255 +0000 UTC m=+5908.741456573" watchObservedRunningTime="2026-03-08 21:10:07.347207898 +0000 UTC m=+5908.743261931" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.008279 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-5jmft" podUID="00348ab8-7686-4e8d-bada-3d9e32edca19" containerName="ovn-controller" probeResult="failure" output=< Mar 08 21:10:08 crc kubenswrapper[4885]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 08 21:10:08 crc kubenswrapper[4885]: > Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.034300 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.040646 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-b6j88" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.150001 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-5jmft-config-8xgln"] Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.151063 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.153390 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.161205 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5jmft-config-8xgln"] Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.177690 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-log-ovn\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.177747 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-scripts\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.177811 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run-ovn\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.177981 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.178063 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67s7m\" (UniqueName: \"kubernetes.io/projected/ae101502-72b5-4462-b83f-cb263bcda010-kube-api-access-67s7m\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.178112 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-additional-scripts\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.279664 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.279737 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67s7m\" (UniqueName: 
\"kubernetes.io/projected/ae101502-72b5-4462-b83f-cb263bcda010-kube-api-access-67s7m\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.279781 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-additional-scripts\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.279869 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-log-ovn\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.279893 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-scripts\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.279957 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run-ovn\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.280006 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.280166 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run-ovn\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.280167 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-log-ovn\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.280892 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-additional-scripts\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.282489 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-scripts\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.310111 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67s7m\" (UniqueName: \"kubernetes.io/projected/ae101502-72b5-4462-b83f-cb263bcda010-kube-api-access-67s7m\") pod \"ovn-controller-5jmft-config-8xgln\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.339465 4885 generic.go:334] "Generic (PLEG): container finished" podID="4497ae4b-d188-4afa-9546-11fbe209a9a7" containerID="81e3094200cf292808dcdd9d841162dea6305875cbf7d44e4dda3138e170a8d5" exitCode=0 Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.339511 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550070-gjrwt" event={"ID":"4497ae4b-d188-4afa-9546-11fbe209a9a7","Type":"ContainerDied","Data":"81e3094200cf292808dcdd9d841162dea6305875cbf7d44e4dda3138e170a8d5"} Mar 08 21:10:08 crc kubenswrapper[4885]: I0308 21:10:08.524831 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:09 crc kubenswrapper[4885]: W0308 21:10:09.072177 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae101502_72b5_4462_b83f_cb263bcda010.slice/crio-6f3fa9fbf41c8e4a15a75a35fbe898f60f2edc6332bf2cfe62ad1194335026a8 WatchSource:0}: Error finding container 6f3fa9fbf41c8e4a15a75a35fbe898f60f2edc6332bf2cfe62ad1194335026a8: Status 404 returned error can't find the container with id 6f3fa9fbf41c8e4a15a75a35fbe898f60f2edc6332bf2cfe62ad1194335026a8 Mar 08 21:10:09 crc kubenswrapper[4885]: I0308 21:10:09.091253 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-5jmft-config-8xgln"] Mar 08 21:10:09 crc kubenswrapper[4885]: I0308 21:10:09.353398 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jmft-config-8xgln" event={"ID":"ae101502-72b5-4462-b83f-cb263bcda010","Type":"ContainerStarted","Data":"6f3fa9fbf41c8e4a15a75a35fbe898f60f2edc6332bf2cfe62ad1194335026a8"} Mar 08 21:10:09 crc kubenswrapper[4885]: I0308 21:10:09.754008 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550070-gjrwt" Mar 08 21:10:09 crc kubenswrapper[4885]: I0308 21:10:09.812272 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s94lp\" (UniqueName: \"kubernetes.io/projected/4497ae4b-d188-4afa-9546-11fbe209a9a7-kube-api-access-s94lp\") pod \"4497ae4b-d188-4afa-9546-11fbe209a9a7\" (UID: \"4497ae4b-d188-4afa-9546-11fbe209a9a7\") " Mar 08 21:10:09 crc kubenswrapper[4885]: I0308 21:10:09.822041 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4497ae4b-d188-4afa-9546-11fbe209a9a7-kube-api-access-s94lp" (OuterVolumeSpecName: "kube-api-access-s94lp") pod "4497ae4b-d188-4afa-9546-11fbe209a9a7" (UID: "4497ae4b-d188-4afa-9546-11fbe209a9a7"). InnerVolumeSpecName "kube-api-access-s94lp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:10:09 crc kubenswrapper[4885]: I0308 21:10:09.914491 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s94lp\" (UniqueName: \"kubernetes.io/projected/4497ae4b-d188-4afa-9546-11fbe209a9a7-kube-api-access-s94lp\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:10 crc kubenswrapper[4885]: I0308 21:10:10.364459 4885 generic.go:334] "Generic (PLEG): container finished" podID="ae101502-72b5-4462-b83f-cb263bcda010" containerID="b17a0512c9878aeaac8d1e7a329d963d016e56adb51b5c377e47629e2282f0c5" exitCode=0 Mar 08 21:10:10 crc kubenswrapper[4885]: I0308 21:10:10.364752 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jmft-config-8xgln" event={"ID":"ae101502-72b5-4462-b83f-cb263bcda010","Type":"ContainerDied","Data":"b17a0512c9878aeaac8d1e7a329d963d016e56adb51b5c377e47629e2282f0c5"} Mar 08 21:10:10 crc kubenswrapper[4885]: I0308 21:10:10.366396 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550070-gjrwt" event={"ID":"4497ae4b-d188-4afa-9546-11fbe209a9a7","Type":"ContainerDied","Data":"25dbb246034629d7ee3627972c12042fdd0fd5f9f8ec1f9db01e012fed97a898"} Mar 08 21:10:10 crc kubenswrapper[4885]: I0308 21:10:10.366460 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25dbb246034629d7ee3627972c12042fdd0fd5f9f8ec1f9db01e012fed97a898" Mar 08 21:10:10 crc kubenswrapper[4885]: I0308 21:10:10.366497 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550070-gjrwt" Mar 08 21:10:10 crc kubenswrapper[4885]: I0308 21:10:10.848504 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550064-54cxw"] Mar 08 21:10:10 crc kubenswrapper[4885]: I0308 21:10:10.860145 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550064-54cxw"] Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.403856 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b5a791-d720-4a5c-9138-abe584a56755" path="/var/lib/kubelet/pods/94b5a791-d720-4a5c-9138-abe584a56755/volumes" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.757110 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.853765 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-log-ovn\") pod \"ae101502-72b5-4462-b83f-cb263bcda010\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.853872 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-scripts\") pod \"ae101502-72b5-4462-b83f-cb263bcda010\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.853937 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run-ovn\") pod \"ae101502-72b5-4462-b83f-cb263bcda010\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.854069 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67s7m\" (UniqueName: \"kubernetes.io/projected/ae101502-72b5-4462-b83f-cb263bcda010-kube-api-access-67s7m\") pod \"ae101502-72b5-4462-b83f-cb263bcda010\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.854168 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-additional-scripts\") pod \"ae101502-72b5-4462-b83f-cb263bcda010\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.854203 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run\") pod \"ae101502-72b5-4462-b83f-cb263bcda010\" (UID: \"ae101502-72b5-4462-b83f-cb263bcda010\") " Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.854724 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run" (OuterVolumeSpecName: "var-run") pod "ae101502-72b5-4462-b83f-cb263bcda010" (UID: "ae101502-72b5-4462-b83f-cb263bcda010"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.854755 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ae101502-72b5-4462-b83f-cb263bcda010" (UID: "ae101502-72b5-4462-b83f-cb263bcda010"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.856550 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-scripts" (OuterVolumeSpecName: "scripts") pod "ae101502-72b5-4462-b83f-cb263bcda010" (UID: "ae101502-72b5-4462-b83f-cb263bcda010"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.857005 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ae101502-72b5-4462-b83f-cb263bcda010" (UID: "ae101502-72b5-4462-b83f-cb263bcda010"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.857356 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ae101502-72b5-4462-b83f-cb263bcda010" (UID: "ae101502-72b5-4462-b83f-cb263bcda010"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.866141 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae101502-72b5-4462-b83f-cb263bcda010-kube-api-access-67s7m" (OuterVolumeSpecName: "kube-api-access-67s7m") pod "ae101502-72b5-4462-b83f-cb263bcda010" (UID: "ae101502-72b5-4462-b83f-cb263bcda010"). InnerVolumeSpecName "kube-api-access-67s7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.959072 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67s7m\" (UniqueName: \"kubernetes.io/projected/ae101502-72b5-4462-b83f-cb263bcda010-kube-api-access-67s7m\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.959114 4885 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.959123 4885 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.959131 4885 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.959141 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ae101502-72b5-4462-b83f-cb263bcda010-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:11 crc kubenswrapper[4885]: I0308 21:10:11.959150 4885 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ae101502-72b5-4462-b83f-cb263bcda010-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.281427 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-b8ndv"] Mar 08 21:10:12 crc kubenswrapper[4885]: E0308 21:10:12.281982 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae101502-72b5-4462-b83f-cb263bcda010" containerName="ovn-config" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.282014 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae101502-72b5-4462-b83f-cb263bcda010" containerName="ovn-config" Mar 08 21:10:12 
crc kubenswrapper[4885]: E0308 21:10:12.282034 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4497ae4b-d188-4afa-9546-11fbe209a9a7" containerName="oc" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.282043 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4497ae4b-d188-4afa-9546-11fbe209a9a7" containerName="oc" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.282257 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4497ae4b-d188-4afa-9546-11fbe209a9a7" containerName="oc" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.282276 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae101502-72b5-4462-b83f-cb263bcda010" containerName="ovn-config" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.283684 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.287202 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.287423 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.287591 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.296316 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-b8ndv"] Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.400515 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-5jmft-config-8xgln" event={"ID":"ae101502-72b5-4462-b83f-cb263bcda010","Type":"ContainerDied","Data":"6f3fa9fbf41c8e4a15a75a35fbe898f60f2edc6332bf2cfe62ad1194335026a8"} Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.400552 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-5jmft-config-8xgln" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.400567 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f3fa9fbf41c8e4a15a75a35fbe898f60f2edc6332bf2cfe62ad1194335026a8" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.465723 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/956e4845-c662-402d-adb6-b05143af6570-hm-ports\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.465874 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/956e4845-c662-402d-adb6-b05143af6570-config-data\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.466196 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/956e4845-c662-402d-adb6-b05143af6570-scripts\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.466225 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/956e4845-c662-402d-adb6-b05143af6570-config-data-merged\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.568161 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/956e4845-c662-402d-adb6-b05143af6570-hm-ports\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.568272 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/956e4845-c662-402d-adb6-b05143af6570-config-data\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.568440 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/956e4845-c662-402d-adb6-b05143af6570-scripts\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.568492 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/956e4845-c662-402d-adb6-b05143af6570-config-data-merged\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.568939 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/956e4845-c662-402d-adb6-b05143af6570-config-data-merged\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.569236 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/956e4845-c662-402d-adb6-b05143af6570-hm-ports\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.575062 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/956e4845-c662-402d-adb6-b05143af6570-scripts\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.575501 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/956e4845-c662-402d-adb6-b05143af6570-config-data\") pod \"octavia-rsyslog-b8ndv\" (UID: \"956e4845-c662-402d-adb6-b05143af6570\") " pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.609839 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.885080 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-5jmft-config-8xgln"] Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.957138 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-5jmft-config-8xgln"] Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.976338 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-gv4p5"] Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.978131 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.981154 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Mar 08 21:10:12 crc kubenswrapper[4885]: I0308 21:10:12.986276 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-gv4p5"] Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.012056 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-5jmft" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.077803 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/302e26ba-0f77-4f06-a12e-74888dfc7821-httpd-config\") pod \"octavia-image-upload-6f5964dbc9-gv4p5\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.077873 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/302e26ba-0f77-4f06-a12e-74888dfc7821-amphora-image\") pod \"octavia-image-upload-6f5964dbc9-gv4p5\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.180403 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/302e26ba-0f77-4f06-a12e-74888dfc7821-httpd-config\") pod \"octavia-image-upload-6f5964dbc9-gv4p5\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.180455 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/302e26ba-0f77-4f06-a12e-74888dfc7821-amphora-image\") pod \"octavia-image-upload-6f5964dbc9-gv4p5\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.181079 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/302e26ba-0f77-4f06-a12e-74888dfc7821-amphora-image\") pod \"octavia-image-upload-6f5964dbc9-gv4p5\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.185680 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/302e26ba-0f77-4f06-a12e-74888dfc7821-httpd-config\") pod \"octavia-image-upload-6f5964dbc9-gv4p5\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.228271 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-b8ndv"] Mar 08 21:10:13 crc kubenswrapper[4885]: W0308 21:10:13.243468 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod956e4845_c662_402d_adb6_b05143af6570.slice/crio-3d7e853f05d33451ea28aec8f3c7751140e01db295fd2d484cb1a21f18698f16 WatchSource:0}: Error finding container 
3d7e853f05d33451ea28aec8f3c7751140e01db295fd2d484cb1a21f18698f16: Status 404 returned error can't find the container with id 3d7e853f05d33451ea28aec8f3c7751140e01db295fd2d484cb1a21f18698f16 Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.314520 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.327521 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-b8ndv"] Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.380225 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae101502-72b5-4462-b83f-cb263bcda010" path="/var/lib/kubelet/pods/ae101502-72b5-4462-b83f-cb263bcda010/volumes" Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.437720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-b8ndv" event={"ID":"956e4845-c662-402d-adb6-b05143af6570","Type":"ContainerStarted","Data":"3d7e853f05d33451ea28aec8f3c7751140e01db295fd2d484cb1a21f18698f16"} Mar 08 21:10:13 crc kubenswrapper[4885]: W0308 21:10:13.754470 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod302e26ba_0f77_4f06_a12e_74888dfc7821.slice/crio-93038e652442b1279a170e696ea3cb5fbf35b66d2f9f1b908f62cf65c5297a72 WatchSource:0}: Error finding container 93038e652442b1279a170e696ea3cb5fbf35b66d2f9f1b908f62cf65c5297a72: Status 404 returned error can't find the container with id 93038e652442b1279a170e696ea3cb5fbf35b66d2f9f1b908f62cf65c5297a72 Mar 08 21:10:13 crc kubenswrapper[4885]: I0308 21:10:13.760167 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-gv4p5"] Mar 08 21:10:14 crc kubenswrapper[4885]: I0308 21:10:14.480982 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" event={"ID":"302e26ba-0f77-4f06-a12e-74888dfc7821","Type":"ContainerStarted","Data":"93038e652442b1279a170e696ea3cb5fbf35b66d2f9f1b908f62cf65c5297a72"} Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.515263 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-b8ndv" event={"ID":"956e4845-c662-402d-adb6-b05143af6570","Type":"ContainerStarted","Data":"c7fe7eab8c650abf7e77d2475a720f1ad85befa41cb8706c15a3ecb4c7232e6c"} Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.875989 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-bvj5k"] Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.878275 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.881159 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.893348 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-combined-ca-bundle\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.893459 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data-merged\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.893511 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-scripts\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.893612 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.895509 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-bvj5k"] Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.995602 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-combined-ca-bundle\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.995685 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data-merged\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.995718 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-scripts\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.995739 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:17 crc kubenswrapper[4885]: I0308 21:10:17.996396 4885 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data-merged\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:18 crc kubenswrapper[4885]: I0308 21:10:18.006683 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-scripts\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:18 crc kubenswrapper[4885]: I0308 21:10:18.007611 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:18 crc kubenswrapper[4885]: I0308 21:10:18.009514 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-combined-ca-bundle\") pod \"octavia-db-sync-bvj5k\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:18 crc kubenswrapper[4885]: I0308 21:10:18.209360 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:18 crc kubenswrapper[4885]: I0308 21:10:18.825022 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-bvj5k"] Mar 08 21:10:18 crc kubenswrapper[4885]: W0308 21:10:18.842481 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod020ad790_8a8c_4e05_b3da_d6b823bb37e2.slice/crio-8f80b9a4caf87bf128bde1ac48ff0fea061835911ef4bb376737482f72040c76 WatchSource:0}: Error finding container 8f80b9a4caf87bf128bde1ac48ff0fea061835911ef4bb376737482f72040c76: Status 404 returned error can't find the container with id 8f80b9a4caf87bf128bde1ac48ff0fea061835911ef4bb376737482f72040c76 Mar 08 21:10:19 crc kubenswrapper[4885]: I0308 21:10:19.537194 4885 generic.go:334] "Generic (PLEG): container finished" podID="956e4845-c662-402d-adb6-b05143af6570" containerID="c7fe7eab8c650abf7e77d2475a720f1ad85befa41cb8706c15a3ecb4c7232e6c" exitCode=0 Mar 08 21:10:19 crc kubenswrapper[4885]: I0308 21:10:19.537281 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-b8ndv" event={"ID":"956e4845-c662-402d-adb6-b05143af6570","Type":"ContainerDied","Data":"c7fe7eab8c650abf7e77d2475a720f1ad85befa41cb8706c15a3ecb4c7232e6c"} Mar 08 21:10:19 crc kubenswrapper[4885]: I0308 21:10:19.540452 4885 generic.go:334] "Generic (PLEG): container finished" podID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" containerID="952b1007a07234fed2a2f1ecec5204600c8240958b93368c30ef4f62fcb4517a" exitCode=0 Mar 08 21:10:19 crc kubenswrapper[4885]: I0308 21:10:19.540495 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bvj5k" event={"ID":"020ad790-8a8c-4e05-b3da-d6b823bb37e2","Type":"ContainerDied","Data":"952b1007a07234fed2a2f1ecec5204600c8240958b93368c30ef4f62fcb4517a"} Mar 08 21:10:19 crc kubenswrapper[4885]: I0308 21:10:19.540536 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bvj5k" 
event={"ID":"020ad790-8a8c-4e05-b3da-d6b823bb37e2","Type":"ContainerStarted","Data":"8f80b9a4caf87bf128bde1ac48ff0fea061835911ef4bb376737482f72040c76"} Mar 08 21:10:20 crc kubenswrapper[4885]: I0308 21:10:20.552641 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bvj5k" event={"ID":"020ad790-8a8c-4e05-b3da-d6b823bb37e2","Type":"ContainerStarted","Data":"3c54259ed2f9cad06d6b82cb394e43cf58f48cf3c6a30620f67aa6eb4637dc84"} Mar 08 21:10:20 crc kubenswrapper[4885]: I0308 21:10:20.579211 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-bvj5k" podStartSLOduration=3.579187778 podStartE2EDuration="3.579187778s" podCreationTimestamp="2026-03-08 21:10:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:10:20.575252193 +0000 UTC m=+5921.971306216" watchObservedRunningTime="2026-03-08 21:10:20.579187778 +0000 UTC m=+5921.975241831" Mar 08 21:10:22 crc kubenswrapper[4885]: I0308 21:10:22.574168 4885 generic.go:334] "Generic (PLEG): container finished" podID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" containerID="3c54259ed2f9cad06d6b82cb394e43cf58f48cf3c6a30620f67aa6eb4637dc84" exitCode=0 Mar 08 21:10:22 crc kubenswrapper[4885]: I0308 21:10:22.574454 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bvj5k" event={"ID":"020ad790-8a8c-4e05-b3da-d6b823bb37e2","Type":"ContainerDied","Data":"3c54259ed2f9cad06d6b82cb394e43cf58f48cf3c6a30620f67aa6eb4637dc84"} Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.053737 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.137914 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data\") pod \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.138087 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data-merged\") pod \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.138246 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-scripts\") pod \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.138295 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-combined-ca-bundle\") pod \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\" (UID: \"020ad790-8a8c-4e05-b3da-d6b823bb37e2\") " Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.143622 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data" (OuterVolumeSpecName: "config-data") pod "020ad790-8a8c-4e05-b3da-d6b823bb37e2" (UID: "020ad790-8a8c-4e05-b3da-d6b823bb37e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.145514 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-scripts" (OuterVolumeSpecName: "scripts") pod "020ad790-8a8c-4e05-b3da-d6b823bb37e2" (UID: "020ad790-8a8c-4e05-b3da-d6b823bb37e2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.165214 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "020ad790-8a8c-4e05-b3da-d6b823bb37e2" (UID: "020ad790-8a8c-4e05-b3da-d6b823bb37e2"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.166066 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "020ad790-8a8c-4e05-b3da-d6b823bb37e2" (UID: "020ad790-8a8c-4e05-b3da-d6b823bb37e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.240806 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.240863 4885 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/020ad790-8a8c-4e05-b3da-d6b823bb37e2-config-data-merged\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.240882 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.240895 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020ad790-8a8c-4e05-b3da-d6b823bb37e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.592846 4885 generic.go:334] "Generic (PLEG): container finished" podID="302e26ba-0f77-4f06-a12e-74888dfc7821" containerID="83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40" exitCode=0 Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.592958 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" event={"ID":"302e26ba-0f77-4f06-a12e-74888dfc7821","Type":"ContainerDied","Data":"83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40"} Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.596972 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-bvj5k" event={"ID":"020ad790-8a8c-4e05-b3da-d6b823bb37e2","Type":"ContainerDied","Data":"8f80b9a4caf87bf128bde1ac48ff0fea061835911ef4bb376737482f72040c76"} Mar 08 21:10:24 crc kubenswrapper[4885]: I0308 21:10:24.597031 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f80b9a4caf87bf128bde1ac48ff0fea061835911ef4bb376737482f72040c76" Mar 08 21:10:24 crc 
kubenswrapper[4885]: I0308 21:10:24.597114 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-sync-bvj5k" Mar 08 21:10:25 crc kubenswrapper[4885]: I0308 21:10:25.650974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" event={"ID":"302e26ba-0f77-4f06-a12e-74888dfc7821","Type":"ContainerStarted","Data":"187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338"} Mar 08 21:10:25 crc kubenswrapper[4885]: I0308 21:10:25.659467 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-b8ndv" event={"ID":"956e4845-c662-402d-adb6-b05143af6570","Type":"ContainerStarted","Data":"c03d4c72c49695c50f6e72d74250532258d64ecd15f5de57f003ea28da91d6df"} Mar 08 21:10:25 crc kubenswrapper[4885]: I0308 21:10:25.660364 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:25 crc kubenswrapper[4885]: I0308 21:10:25.687381 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" podStartSLOduration=3.788861104 podStartE2EDuration="13.687360099s" podCreationTimestamp="2026-03-08 21:10:12 +0000 UTC" firstStartedPulling="2026-03-08 21:10:13.762119545 +0000 UTC m=+5915.158173578" lastFinishedPulling="2026-03-08 21:10:23.66061855 +0000 UTC m=+5925.056672573" observedRunningTime="2026-03-08 21:10:25.675502523 +0000 UTC m=+5927.071556546" watchObservedRunningTime="2026-03-08 21:10:25.687360099 +0000 UTC m=+5927.083414142" Mar 08 21:10:25 crc kubenswrapper[4885]: I0308 21:10:25.725901 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-rsyslog-b8ndv" podStartSLOduration=2.3416002320000002 podStartE2EDuration="13.725879357s" podCreationTimestamp="2026-03-08 21:10:12 +0000 UTC" firstStartedPulling="2026-03-08 21:10:13.245886565 +0000 UTC m=+5914.641940588" lastFinishedPulling="2026-03-08 21:10:24.63016568 +0000 UTC m=+5926.026219713" observedRunningTime="2026-03-08 21:10:25.714839912 +0000 UTC m=+5927.110893945" watchObservedRunningTime="2026-03-08 21:10:25.725879357 +0000 UTC m=+5927.121933380" Mar 08 21:10:29 crc kubenswrapper[4885]: I0308 21:10:29.626556 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:10:29 crc kubenswrapper[4885]: I0308 21:10:29.858968 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-754cf98f97-rw6hg" Mar 08 21:10:42 crc kubenswrapper[4885]: I0308 21:10:42.640804 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-b8ndv" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.188057 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-gv4p5"] Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.188995 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" podUID="302e26ba-0f77-4f06-a12e-74888dfc7821" containerName="octavia-amphora-httpd" containerID="cri-o://187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338" gracePeriod=30 Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.747471 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.855664 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/302e26ba-0f77-4f06-a12e-74888dfc7821-amphora-image\") pod \"302e26ba-0f77-4f06-a12e-74888dfc7821\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.855794 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/302e26ba-0f77-4f06-a12e-74888dfc7821-httpd-config\") pod \"302e26ba-0f77-4f06-a12e-74888dfc7821\" (UID: \"302e26ba-0f77-4f06-a12e-74888dfc7821\") " Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.883260 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/302e26ba-0f77-4f06-a12e-74888dfc7821-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "302e26ba-0f77-4f06-a12e-74888dfc7821" (UID: "302e26ba-0f77-4f06-a12e-74888dfc7821"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.926772 4885 generic.go:334] "Generic (PLEG): container finished" podID="302e26ba-0f77-4f06-a12e-74888dfc7821" containerID="187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338" exitCode=0 Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.927062 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" event={"ID":"302e26ba-0f77-4f06-a12e-74888dfc7821","Type":"ContainerDied","Data":"187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338"} Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.927191 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" event={"ID":"302e26ba-0f77-4f06-a12e-74888dfc7821","Type":"ContainerDied","Data":"93038e652442b1279a170e696ea3cb5fbf35b66d2f9f1b908f62cf65c5297a72"} Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.927281 4885 scope.go:117] "RemoveContainer" containerID="187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.927483 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-gv4p5" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.947487 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/302e26ba-0f77-4f06-a12e-74888dfc7821-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "302e26ba-0f77-4f06-a12e-74888dfc7821" (UID: "302e26ba-0f77-4f06-a12e-74888dfc7821"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.957817 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/302e26ba-0f77-4f06-a12e-74888dfc7821-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.957847 4885 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/302e26ba-0f77-4f06-a12e-74888dfc7821-amphora-image\") on node \"crc\" DevicePath \"\"" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.970205 4885 scope.go:117] "RemoveContainer" containerID="83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.991304 4885 scope.go:117] "RemoveContainer" containerID="187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338" Mar 08 21:10:50 crc kubenswrapper[4885]: E0308 21:10:50.991807 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338\": container with ID starting with 187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338 not found: ID does not exist" containerID="187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.991870 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338"} err="failed to get container status \"187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338\": rpc error: code = NotFound desc = could not find container \"187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338\": container with ID starting with 187ac590810e16797e28dda8ef03733fb5bcfb71f6ad90adffbbdb9d3f736338 not found: ID does not exist" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.991894 4885 scope.go:117] "RemoveContainer" containerID="83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40" Mar 08 21:10:50 crc kubenswrapper[4885]: E0308 21:10:50.992424 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40\": container with ID starting with 83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40 not found: ID does not exist" containerID="83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40" Mar 08 21:10:50 crc kubenswrapper[4885]: I0308 21:10:50.992477 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40"} err="failed to get container status \"83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40\": rpc error: code = NotFound desc = could not find container \"83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40\": container with ID starting with 83ac9b2c351f6071efd7cf78102902bd331870e9dbd3486dcadaada6741c2d40 not found: ID does not exist" Mar 08 21:10:51 crc kubenswrapper[4885]: I0308 21:10:51.278483 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-gv4p5"] Mar 08 21:10:51 crc kubenswrapper[4885]: I0308 21:10:51.295188 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/octavia-image-upload-6f5964dbc9-gv4p5"] Mar 08 21:10:51 crc kubenswrapper[4885]: I0308 21:10:51.390052 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="302e26ba-0f77-4f06-a12e-74888dfc7821" path="/var/lib/kubelet/pods/302e26ba-0f77-4f06-a12e-74888dfc7821/volumes" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.465744 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-msqj4"] Mar 08 21:10:54 crc kubenswrapper[4885]: E0308 21:10:54.466869 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" containerName="init" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.466890 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" containerName="init" Mar 08 21:10:54 crc kubenswrapper[4885]: E0308 21:10:54.466958 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302e26ba-0f77-4f06-a12e-74888dfc7821" containerName="init" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.466971 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="302e26ba-0f77-4f06-a12e-74888dfc7821" containerName="init" Mar 08 21:10:54 crc kubenswrapper[4885]: E0308 21:10:54.467000 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="302e26ba-0f77-4f06-a12e-74888dfc7821" containerName="octavia-amphora-httpd" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.467014 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="302e26ba-0f77-4f06-a12e-74888dfc7821" containerName="octavia-amphora-httpd" Mar 08 21:10:54 crc kubenswrapper[4885]: E0308 21:10:54.467038 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" containerName="octavia-db-sync" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.467051 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" containerName="octavia-db-sync" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.467373 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" containerName="octavia-db-sync" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.467398 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="302e26ba-0f77-4f06-a12e-74888dfc7821" containerName="octavia-amphora-httpd" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.469092 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.472118 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.490762 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-msqj4"] Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.534874 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6b1c35f0-ed5f-411a-a0ec-1270fd04e266-httpd-config\") pod \"octavia-image-upload-6f5964dbc9-msqj4\" (UID: \"6b1c35f0-ed5f-411a-a0ec-1270fd04e266\") " pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.534948 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/6b1c35f0-ed5f-411a-a0ec-1270fd04e266-amphora-image\") pod \"octavia-image-upload-6f5964dbc9-msqj4\" (UID: \"6b1c35f0-ed5f-411a-a0ec-1270fd04e266\") " pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.636824 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6b1c35f0-ed5f-411a-a0ec-1270fd04e266-httpd-config\") pod \"octavia-image-upload-6f5964dbc9-msqj4\" (UID: \"6b1c35f0-ed5f-411a-a0ec-1270fd04e266\") " pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.636882 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/6b1c35f0-ed5f-411a-a0ec-1270fd04e266-amphora-image\") pod \"octavia-image-upload-6f5964dbc9-msqj4\" (UID: \"6b1c35f0-ed5f-411a-a0ec-1270fd04e266\") " pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.637479 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/6b1c35f0-ed5f-411a-a0ec-1270fd04e266-amphora-image\") pod \"octavia-image-upload-6f5964dbc9-msqj4\" (UID: \"6b1c35f0-ed5f-411a-a0ec-1270fd04e266\") " pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.647042 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6b1c35f0-ed5f-411a-a0ec-1270fd04e266-httpd-config\") pod \"octavia-image-upload-6f5964dbc9-msqj4\" (UID: \"6b1c35f0-ed5f-411a-a0ec-1270fd04e266\") " pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:54 crc kubenswrapper[4885]: I0308 21:10:54.803513 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" Mar 08 21:10:55 crc kubenswrapper[4885]: I0308 21:10:55.263494 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-6f5964dbc9-msqj4"] Mar 08 21:10:55 crc kubenswrapper[4885]: I0308 21:10:55.999039 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" event={"ID":"6b1c35f0-ed5f-411a-a0ec-1270fd04e266","Type":"ContainerStarted","Data":"565a88213cdf37df5cce6da306934159b3d147d6d31342f60943171f8b47a5f0"} Mar 08 21:10:55 crc kubenswrapper[4885]: I0308 21:10:55.999429 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" event={"ID":"6b1c35f0-ed5f-411a-a0ec-1270fd04e266","Type":"ContainerStarted","Data":"13b44627660c6d5828e1ffce08afc4d5b640263a51ff23dacea929e4384d65bb"} Mar 08 21:10:57 crc kubenswrapper[4885]: I0308 21:10:57.021278 4885 generic.go:334] "Generic (PLEG): container finished" podID="6b1c35f0-ed5f-411a-a0ec-1270fd04e266" containerID="565a88213cdf37df5cce6da306934159b3d147d6d31342f60943171f8b47a5f0" exitCode=0 Mar 08 21:10:57 crc kubenswrapper[4885]: I0308 21:10:57.021882 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" event={"ID":"6b1c35f0-ed5f-411a-a0ec-1270fd04e266","Type":"ContainerDied","Data":"565a88213cdf37df5cce6da306934159b3d147d6d31342f60943171f8b47a5f0"} Mar 08 21:10:58 crc kubenswrapper[4885]: I0308 21:10:58.036360 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" event={"ID":"6b1c35f0-ed5f-411a-a0ec-1270fd04e266","Type":"ContainerStarted","Data":"74683b96c32b01537600c9e68d79b2282c47487f6c8eae71ca83b94efdde9d7f"} Mar 08 21:10:58 crc kubenswrapper[4885]: I0308 21:10:58.059562 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-6f5964dbc9-msqj4" podStartSLOduration=3.611811667 podStartE2EDuration="4.05954289s" podCreationTimestamp="2026-03-08 21:10:54 +0000 UTC" firstStartedPulling="2026-03-08 21:10:55.269733446 +0000 UTC m=+5956.665787479" lastFinishedPulling="2026-03-08 21:10:55.717464649 +0000 UTC m=+5957.113518702" observedRunningTime="2026-03-08 21:10:58.056748845 +0000 UTC m=+5959.452802868" watchObservedRunningTime="2026-03-08 21:10:58.05954289 +0000 UTC m=+5959.455596913" Mar 08 21:11:06 crc kubenswrapper[4885]: I0308 21:11:06.555669 4885 scope.go:117] "RemoveContainer" containerID="d40c8b02d2c6b1b5fefbc9a10d09bda45776bb36be850d751409c013d3a63ca6" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.529197 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-8t2fl"] Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.531493 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.535085 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.535420 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.542071 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.550108 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-8t2fl"] Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.602677 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-scripts\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.602757 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-combined-ca-bundle\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.602840 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-config-data\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.602896 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-hm-ports\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.602962 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-amphora-certs\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.603017 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-config-data-merged\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.705802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-config-data\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc 
kubenswrapper[4885]: I0308 21:11:16.705986 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-hm-ports\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.706103 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-amphora-certs\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.706233 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-config-data-merged\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.706318 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-scripts\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.706421 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-combined-ca-bundle\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.710818 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-hm-ports\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.712160 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-config-data-merged\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.715093 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-combined-ca-bundle\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.717582 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-amphora-certs\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.717637 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-config-data\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.719990 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9d4d983f-9ee9-4341-bf69-0c2fc610a2d6-scripts\") pod \"octavia-healthmanager-8t2fl\" (UID: \"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6\") " pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:16 crc kubenswrapper[4885]: I0308 21:11:16.855641 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:17 crc kubenswrapper[4885]: I0308 21:11:17.696414 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-8t2fl"] Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.194930 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-pchrs"] Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.198013 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.208934 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-pchrs"] Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.214339 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.215498 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.239603 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-hm-ports\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.239644 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-config-data\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.239702 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-config-data-merged\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.239743 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-amphora-certs\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.239775 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-scripts\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.239803 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-combined-ca-bundle\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.255435 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-8t2fl" event={"ID":"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6","Type":"ContainerStarted","Data":"c5956a80a01d11ccb179dc4eae96a4ea0e3a4f92806db99072d0278c874aeb7b"} Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.341301 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-hm-ports\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.341347 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-config-data\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.341390 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-config-data-merged\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.341416 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-amphora-certs\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.341446 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-scripts\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.341474 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-combined-ca-bundle\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.342699 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-config-data-merged\") pod \"octavia-housekeeping-pchrs\" (UID: 
\"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.344157 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-hm-ports\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.346889 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-combined-ca-bundle\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.347065 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-amphora-certs\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.354501 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-config-data\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.354761 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8-scripts\") pod \"octavia-housekeeping-pchrs\" (UID: \"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8\") " pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:18 crc kubenswrapper[4885]: I0308 21:11:18.527224 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.106703 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-pchrs"] Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.265957 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-8t2fl" event={"ID":"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6","Type":"ContainerStarted","Data":"14f9c425ffdeac5dd5e3ac325d2da1e0f647a77b481194d0dbb2a2f44b7efda0"} Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.267341 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-pchrs" event={"ID":"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8","Type":"ContainerStarted","Data":"fd030b5bbf532bf3f09a68982b303a7feff3c04c76910c861b175be15ef9b193"} Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.769971 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-8847z"] Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.772675 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.787381 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.787578 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.787691 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-8847z"] Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.873382 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-amphora-certs\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.873804 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-combined-ca-bundle\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.874053 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-scripts\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.874239 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-hm-ports\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.874374 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-config-data\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.874575 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-config-data-merged\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.976603 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-combined-ca-bundle\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.976664 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-scripts\") pod \"octavia-worker-8847z\" (UID: 
\"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.976696 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-hm-ports\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.976715 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-config-data\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.976750 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-config-data-merged\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.976804 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-amphora-certs\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.979040 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-hm-ports\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.986895 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-combined-ca-bundle\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.987140 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-config-data-merged\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.988113 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-scripts\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.991186 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-amphora-certs\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:19 crc kubenswrapper[4885]: I0308 21:11:19.992175 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3c0bddea-5630-4e74-8bc9-ec81fc3eba56-config-data\") pod \"octavia-worker-8847z\" (UID: \"3c0bddea-5630-4e74-8bc9-ec81fc3eba56\") " pod="openstack/octavia-worker-8847z" Mar 08 21:11:20 crc kubenswrapper[4885]: I0308 21:11:20.127693 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-worker-8847z" Mar 08 21:11:20 crc kubenswrapper[4885]: I0308 21:11:20.312194 4885 generic.go:334] "Generic (PLEG): container finished" podID="9d4d983f-9ee9-4341-bf69-0c2fc610a2d6" containerID="14f9c425ffdeac5dd5e3ac325d2da1e0f647a77b481194d0dbb2a2f44b7efda0" exitCode=0 Mar 08 21:11:20 crc kubenswrapper[4885]: I0308 21:11:20.312250 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-8t2fl" event={"ID":"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6","Type":"ContainerDied","Data":"14f9c425ffdeac5dd5e3ac325d2da1e0f647a77b481194d0dbb2a2f44b7efda0"} Mar 08 21:11:20 crc kubenswrapper[4885]: I0308 21:11:20.800508 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-8847z"] Mar 08 21:11:21 crc kubenswrapper[4885]: I0308 21:11:21.444780 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:21 crc kubenswrapper[4885]: I0308 21:11:21.445321 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-8t2fl" event={"ID":"9d4d983f-9ee9-4341-bf69-0c2fc610a2d6","Type":"ContainerStarted","Data":"1b4bf725669d42dcefe21ca4568a95968ca89f01d4de0ce4713d48355980145e"} Mar 08 21:11:21 crc kubenswrapper[4885]: I0308 21:11:21.446018 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-8847z" event={"ID":"3c0bddea-5630-4e74-8bc9-ec81fc3eba56","Type":"ContainerStarted","Data":"892302d5823444d7c1ba78dc7ffe8568afc3a073b1628b7312df055cd88abc0b"} Mar 08 21:11:21 crc kubenswrapper[4885]: I0308 21:11:21.486947 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-healthmanager-8t2fl" podStartSLOduration=5.486916562 podStartE2EDuration="5.486916562s" podCreationTimestamp="2026-03-08 21:11:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:11:21.46211211 +0000 UTC m=+5982.858166153" watchObservedRunningTime="2026-03-08 21:11:21.486916562 +0000 UTC m=+5982.882970585" Mar 08 21:11:21 crc kubenswrapper[4885]: I0308 21:11:21.563396 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-8t2fl"] Mar 08 21:11:22 crc kubenswrapper[4885]: I0308 21:11:22.461105 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-pchrs" event={"ID":"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8","Type":"ContainerStarted","Data":"20d00b8688c1c08fbabb5e9207d2b56871e50bcfe9f23bb1ce969cd8c9fdc903"} Mar 08 21:11:23 crc kubenswrapper[4885]: I0308 21:11:23.480226 4885 generic.go:334] "Generic (PLEG): container finished" podID="c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8" containerID="20d00b8688c1c08fbabb5e9207d2b56871e50bcfe9f23bb1ce969cd8c9fdc903" exitCode=0 Mar 08 21:11:23 crc kubenswrapper[4885]: I0308 21:11:23.480324 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-pchrs" event={"ID":"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8","Type":"ContainerDied","Data":"20d00b8688c1c08fbabb5e9207d2b56871e50bcfe9f23bb1ce969cd8c9fdc903"} Mar 08 21:11:24 crc kubenswrapper[4885]: I0308 
21:11:24.489346 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-8847z" event={"ID":"3c0bddea-5630-4e74-8bc9-ec81fc3eba56","Type":"ContainerStarted","Data":"5e9e8c96253911a7ae51b04fedb535672ce6656d583c7b23da5f45fde8b758e0"} Mar 08 21:11:24 crc kubenswrapper[4885]: I0308 21:11:24.496175 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-pchrs" event={"ID":"c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8","Type":"ContainerStarted","Data":"710a8f5cb5fdacb2d1629e83c91473191cd887af35339771a85b6a6fc89819fb"} Mar 08 21:11:24 crc kubenswrapper[4885]: I0308 21:11:24.496399 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:24 crc kubenswrapper[4885]: I0308 21:11:24.532158 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-pchrs" podStartSLOduration=4.068948046 podStartE2EDuration="6.532142808s" podCreationTimestamp="2026-03-08 21:11:18 +0000 UTC" firstStartedPulling="2026-03-08 21:11:19.106289862 +0000 UTC m=+5980.502343885" lastFinishedPulling="2026-03-08 21:11:21.569484624 +0000 UTC m=+5982.965538647" observedRunningTime="2026-03-08 21:11:24.528273444 +0000 UTC m=+5985.924327467" watchObservedRunningTime="2026-03-08 21:11:24.532142808 +0000 UTC m=+5985.928196831" Mar 08 21:11:25 crc kubenswrapper[4885]: I0308 21:11:25.509644 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c0bddea-5630-4e74-8bc9-ec81fc3eba56" containerID="5e9e8c96253911a7ae51b04fedb535672ce6656d583c7b23da5f45fde8b758e0" exitCode=0 Mar 08 21:11:25 crc kubenswrapper[4885]: I0308 21:11:25.509703 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-8847z" event={"ID":"3c0bddea-5630-4e74-8bc9-ec81fc3eba56","Type":"ContainerDied","Data":"5e9e8c96253911a7ae51b04fedb535672ce6656d583c7b23da5f45fde8b758e0"} Mar 08 21:11:26 crc kubenswrapper[4885]: I0308 21:11:26.524700 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-8847z" event={"ID":"3c0bddea-5630-4e74-8bc9-ec81fc3eba56","Type":"ContainerStarted","Data":"1b6dfb0be525ff4b81354acb4f385146a928027992b7a317dbe0994268d40bb9"} Mar 08 21:11:26 crc kubenswrapper[4885]: I0308 21:11:26.525018 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-8847z" Mar 08 21:11:26 crc kubenswrapper[4885]: I0308 21:11:26.552911 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-8847z" podStartSLOduration=4.84498176 podStartE2EDuration="7.552888148s" podCreationTimestamp="2026-03-08 21:11:19 +0000 UTC" firstStartedPulling="2026-03-08 21:11:20.813156441 +0000 UTC m=+5982.209210464" lastFinishedPulling="2026-03-08 21:11:23.521062809 +0000 UTC m=+5984.917116852" observedRunningTime="2026-03-08 21:11:26.5451354 +0000 UTC m=+5987.941189473" watchObservedRunningTime="2026-03-08 21:11:26.552888148 +0000 UTC m=+5987.948942171" Mar 08 21:11:31 crc kubenswrapper[4885]: I0308 21:11:31.888682 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-8t2fl" Mar 08 21:11:33 crc kubenswrapper[4885]: I0308 21:11:33.564715 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-pchrs" Mar 08 21:11:35 crc kubenswrapper[4885]: I0308 21:11:35.202379 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/octavia-worker-8847z" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.009589 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-86f4656c87-zrnj4"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.011665 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.036570 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.036999 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.037506 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-dwvls" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.043836 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.076432 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-86f4656c87-zrnj4"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.088624 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.092092 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-log" containerID="cri-o://393e441f3ad9404ef23527c1976a928e080636334f6d4fef814d8251c19b8033" gracePeriod=30 Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.092439 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-httpd" containerID="cri-o://45cd60c6ca50a5d19396518626fd9ead690756368037eda8deb040859bd438c4" gracePeriod=30 Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.101296 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmnvn\" (UniqueName: \"kubernetes.io/projected/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-kube-api-access-xmnvn\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.101363 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-logs\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.101430 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-config-data\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.101461 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-scripts\") pod 
\"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.101515 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-horizon-secret-key\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.137943 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.138386 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-log" containerID="cri-o://6b230c929340da8fad3a15c45277ba0e659ca5d1578d43a5536de68f31fcf158" gracePeriod=30 Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.138871 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-httpd" containerID="cri-o://77cc037af22a97f866052e3343ac5fbb7bf64bc7562100db2c700da1dbaae719" gracePeriod=30 Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.159676 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-cfbd9754f-492lw"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.161886 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.179613 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-cfbd9754f-492lw"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.203038 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-logs\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.203476 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-config-data\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.203568 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-scripts\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.203634 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-logs\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.203758 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-horizon-secret-key\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.203900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmnvn\" (UniqueName: \"kubernetes.io/projected/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-kube-api-access-xmnvn\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.204431 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-scripts\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.205262 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-config-data\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.212453 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-horizon-secret-key\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.222961 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmnvn\" (UniqueName: \"kubernetes.io/projected/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-kube-api-access-xmnvn\") pod \"horizon-86f4656c87-zrnj4\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.305467 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-config-data\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.305518 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-scripts\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.305557 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm7hl\" (UniqueName: \"kubernetes.io/projected/89778e39-b609-494b-b2b2-aebf98447dd0-kube-api-access-pm7hl\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.305803 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/89778e39-b609-494b-b2b2-aebf98447dd0-horizon-secret-key\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.306069 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89778e39-b609-494b-b2b2-aebf98447dd0-logs\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.377414 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.410326 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm7hl\" (UniqueName: \"kubernetes.io/projected/89778e39-b609-494b-b2b2-aebf98447dd0-kube-api-access-pm7hl\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.410638 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89778e39-b609-494b-b2b2-aebf98447dd0-horizon-secret-key\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.410824 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89778e39-b609-494b-b2b2-aebf98447dd0-logs\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.411007 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-config-data\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.411134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-scripts\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.411318 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89778e39-b609-494b-b2b2-aebf98447dd0-logs\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.412393 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-scripts\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.412403 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-config-data\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.414815 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89778e39-b609-494b-b2b2-aebf98447dd0-horizon-secret-key\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.434485 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm7hl\" (UniqueName: \"kubernetes.io/projected/89778e39-b609-494b-b2b2-aebf98447dd0-kube-api-access-pm7hl\") pod \"horizon-cfbd9754f-492lw\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.479120 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.682688 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86f4656c87-zrnj4"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.709866 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-dbdd8c5b9-56mvx"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.711512 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.739726 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dbdd8c5b9-56mvx"] Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.784671 4885 generic.go:334] "Generic (PLEG): container finished" podID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerID="6b230c929340da8fad3a15c45277ba0e659ca5d1578d43a5536de68f31fcf158" exitCode=143 Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.785020 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8","Type":"ContainerDied","Data":"6b230c929340da8fad3a15c45277ba0e659ca5d1578d43a5536de68f31fcf158"} Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.786639 4885 generic.go:334] "Generic (PLEG): container finished" podID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerID="393e441f3ad9404ef23527c1976a928e080636334f6d4fef814d8251c19b8033" exitCode=143 Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.786661 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e","Type":"ContainerDied","Data":"393e441f3ad9404ef23527c1976a928e080636334f6d4fef814d8251c19b8033"} Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.817873 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-config-data\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.817993 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-horizon-secret-key\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.818030 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-scripts\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.818185 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-logs\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.818595 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2bq7\" (UniqueName: \"kubernetes.io/projected/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-kube-api-access-x2bq7\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.923621 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-logs\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.923903 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2bq7\" (UniqueName: \"kubernetes.io/projected/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-kube-api-access-x2bq7\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.924042 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-config-data\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.924107 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-horizon-secret-key\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.924272 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-scripts\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.925243 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-scripts\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: 
\"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.925543 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-logs\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.927105 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-config-data\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.930176 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86f4656c87-zrnj4"] Mar 08 21:11:45 crc kubenswrapper[4885]: W0308 21:11:45.933169 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc58da53b_fda0_486e_b7d0_d6b50bfc1b62.slice/crio-3192252c7ddb8920e2f562a80b90e4b4271792e7b8e099a5857dac94813f27a8 WatchSource:0}: Error finding container 3192252c7ddb8920e2f562a80b90e4b4271792e7b8e099a5857dac94813f27a8: Status 404 returned error can't find the container with id 3192252c7ddb8920e2f562a80b90e4b4271792e7b8e099a5857dac94813f27a8 Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.936498 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-horizon-secret-key\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:45 crc kubenswrapper[4885]: I0308 21:11:45.944320 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2bq7\" (UniqueName: \"kubernetes.io/projected/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-kube-api-access-x2bq7\") pod \"horizon-dbdd8c5b9-56mvx\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:46 crc kubenswrapper[4885]: I0308 21:11:46.035856 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:46 crc kubenswrapper[4885]: I0308 21:11:46.043869 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-cfbd9754f-492lw"] Mar 08 21:11:46 crc kubenswrapper[4885]: I0308 21:11:46.535052 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-dbdd8c5b9-56mvx"] Mar 08 21:11:46 crc kubenswrapper[4885]: I0308 21:11:46.799871 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86f4656c87-zrnj4" event={"ID":"c58da53b-fda0-486e-b7d0-d6b50bfc1b62","Type":"ContainerStarted","Data":"3192252c7ddb8920e2f562a80b90e4b4271792e7b8e099a5857dac94813f27a8"} Mar 08 21:11:46 crc kubenswrapper[4885]: I0308 21:11:46.802188 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cfbd9754f-492lw" event={"ID":"89778e39-b609-494b-b2b2-aebf98447dd0","Type":"ContainerStarted","Data":"e4a35d78251bc7cb6befb4354f6fb7487a2a564eecb0e58296c981c935357145"} Mar 08 21:11:46 crc kubenswrapper[4885]: I0308 21:11:46.804171 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dbdd8c5b9-56mvx" event={"ID":"4d4063e1-1725-43e5-bf87-422d8e4a0e5b","Type":"ContainerStarted","Data":"e892f970f210beda78d54563132ede0defb9bfa40842e20b8ca03cfb2cdffe13"} Mar 08 21:11:48 crc kubenswrapper[4885]: I0308 21:11:48.853247 4885 generic.go:334] "Generic (PLEG): container finished" podID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerID="45cd60c6ca50a5d19396518626fd9ead690756368037eda8deb040859bd438c4" exitCode=0 Mar 08 21:11:48 crc kubenswrapper[4885]: I0308 21:11:48.853338 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e","Type":"ContainerDied","Data":"45cd60c6ca50a5d19396518626fd9ead690756368037eda8deb040859bd438c4"} Mar 08 21:11:48 crc kubenswrapper[4885]: I0308 21:11:48.856107 4885 generic.go:334] "Generic (PLEG): container finished" podID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerID="77cc037af22a97f866052e3343ac5fbb7bf64bc7562100db2c700da1dbaae719" exitCode=0 Mar 08 21:11:48 crc kubenswrapper[4885]: I0308 21:11:48.856146 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8","Type":"ContainerDied","Data":"77cc037af22a97f866052e3343ac5fbb7bf64bc7562100db2c700da1dbaae719"} Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.488333 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.496674 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.519096 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhs2t\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-kube-api-access-xhs2t\") pod \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.528424 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-kube-api-access-xhs2t" (OuterVolumeSpecName: "kube-api-access-xhs2t") pod "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" (UID: "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8"). InnerVolumeSpecName "kube-api-access-xhs2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628266 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-logs\") pod \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628337 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-scripts\") pod \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628360 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-combined-ca-bundle\") pod \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628403 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-ceph\") pod \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628510 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-scripts\") pod \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628561 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-combined-ca-bundle\") pod \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628637 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-ceph\") pod \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628688 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-config-data\") pod 
\"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628728 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-httpd-run\") pod \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628768 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-httpd-run\") pod \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\" (UID: \"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628791 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-logs\") pod \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628807 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qvkv\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-kube-api-access-6qvkv\") pod \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.628827 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-config-data\") pod \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\" (UID: \"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e\") " Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.629254 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhs2t\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-kube-api-access-xhs2t\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.631912 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-logs" (OuterVolumeSpecName: "logs") pod "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" (UID: "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.632663 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" (UID: "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.633060 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-logs" (OuterVolumeSpecName: "logs") pod "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" (UID: "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.633080 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" (UID: "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.638817 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-ceph" (OuterVolumeSpecName: "ceph") pod "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" (UID: "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.639044 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-ceph" (OuterVolumeSpecName: "ceph") pod "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" (UID: "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.640592 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-scripts" (OuterVolumeSpecName: "scripts") pod "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" (UID: "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.648376 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-kube-api-access-6qvkv" (OuterVolumeSpecName: "kube-api-access-6qvkv") pod "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" (UID: "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e"). InnerVolumeSpecName "kube-api-access-6qvkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.648625 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-scripts" (OuterVolumeSpecName: "scripts") pod "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" (UID: "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.674294 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" (UID: "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.706228 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" (UID: "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731378 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731404 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731413 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731421 4885 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731429 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731438 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qvkv\" (UniqueName: \"kubernetes.io/projected/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-kube-api-access-6qvkv\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731448 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731455 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731463 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731472 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.731481 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.754964 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-config-data" (OuterVolumeSpecName: "config-data") pod "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" (UID: "6c8b8de1-aa1f-41bf-b8b4-64216af62a1e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.758967 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-config-data" (OuterVolumeSpecName: "config-data") pod "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" (UID: "f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.833262 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.833299 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.927110 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.927106 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6c8b8de1-aa1f-41bf-b8b4-64216af62a1e","Type":"ContainerDied","Data":"8a73a30c989c3b2ba3367e2ccdac633bed1d9b688e4b2e65328b8a2f07a6fe3b"} Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.927486 4885 scope.go:117] "RemoveContainer" containerID="45cd60c6ca50a5d19396518626fd9ead690756368037eda8deb040859bd438c4" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.931615 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cfbd9754f-492lw" event={"ID":"89778e39-b609-494b-b2b2-aebf98447dd0","Type":"ContainerStarted","Data":"71f6b4756a73f0398ecd6d1de73ef9818474e86bd61fdf4176acdd4f948e0e51"} Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.936283 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dbdd8c5b9-56mvx" event={"ID":"4d4063e1-1725-43e5-bf87-422d8e4a0e5b","Type":"ContainerStarted","Data":"321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097"} Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.951471 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8","Type":"ContainerDied","Data":"c1d3ebe64d38a65fd438521c81fee0953dda62071a7f9b6b92b7e2810c5ba230"} Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.951505 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.954582 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86f4656c87-zrnj4" event={"ID":"c58da53b-fda0-486e-b7d0-d6b50bfc1b62","Type":"ContainerStarted","Data":"3931ee1c99d16a879ccfc12229692a2bae1b407bce8d0e34d3d46a2ffcee39dc"} Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.966372 4885 scope.go:117] "RemoveContainer" containerID="393e441f3ad9404ef23527c1976a928e080636334f6d4fef814d8251c19b8033" Mar 08 21:11:54 crc kubenswrapper[4885]: I0308 21:11:54.997805 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.031570 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.054230 4885 scope.go:117] "RemoveContainer" containerID="77cc037af22a97f866052e3343ac5fbb7bf64bc7562100db2c700da1dbaae719" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.059787 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.069526 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.078631 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: E0308 21:11:55.079111 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-httpd" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.079130 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-httpd" Mar 08 21:11:55 crc kubenswrapper[4885]: E0308 21:11:55.079149 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-log" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.079157 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-log" Mar 08 21:11:55 crc kubenswrapper[4885]: E0308 21:11:55.079182 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-log" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.079191 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-log" Mar 08 21:11:55 crc kubenswrapper[4885]: E0308 21:11:55.079202 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-httpd" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.079208 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-httpd" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.079409 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-log" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.079427 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-httpd" Mar 08 21:11:55 crc 
kubenswrapper[4885]: I0308 21:11:55.079438 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-log" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.079447 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-httpd" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.080460 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.083642 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.083759 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-zrqqr" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.083784 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.085740 4885 scope.go:117] "RemoveContainer" containerID="6b230c929340da8fad3a15c45277ba0e659ca5d1578d43a5536de68f31fcf158" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.089393 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.091129 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.092860 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.108620 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.130975 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.268918 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c1efb870-06f3-40b8-baca-e418a034eaed-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269017 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1efb870-06f3-40b8-baca-e418a034eaed-logs\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269056 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269080 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpjfn\" (UniqueName: 
\"kubernetes.io/projected/c1efb870-06f3-40b8-baca-e418a034eaed-kube-api-access-qpjfn\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269128 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxrwl\" (UniqueName: \"kubernetes.io/projected/cd58de31-5f82-4acb-8713-397027fbae4f-kube-api-access-kxrwl\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269150 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd58de31-5f82-4acb-8713-397027fbae4f-logs\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269216 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd58de31-5f82-4acb-8713-397027fbae4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269242 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269276 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c1efb870-06f3-40b8-baca-e418a034eaed-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269309 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269353 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd58de31-5f82-4acb-8713-397027fbae4f-ceph\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269381 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269410 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.269463 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371024 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1efb870-06f3-40b8-baca-e418a034eaed-logs\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371068 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371089 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpjfn\" (UniqueName: \"kubernetes.io/projected/c1efb870-06f3-40b8-baca-e418a034eaed-kube-api-access-qpjfn\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371122 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxrwl\" (UniqueName: \"kubernetes.io/projected/cd58de31-5f82-4acb-8713-397027fbae4f-kube-api-access-kxrwl\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371142 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd58de31-5f82-4acb-8713-397027fbae4f-logs\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371185 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd58de31-5f82-4acb-8713-397027fbae4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371203 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371225 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/projected/c1efb870-06f3-40b8-baca-e418a034eaed-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371246 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371273 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd58de31-5f82-4acb-8713-397027fbae4f-ceph\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371294 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371311 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371342 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371385 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c1efb870-06f3-40b8-baca-e418a034eaed-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371560 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c1efb870-06f3-40b8-baca-e418a034eaed-logs\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.371715 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c1efb870-06f3-40b8-baca-e418a034eaed-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.373563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd58de31-5f82-4acb-8713-397027fbae4f-logs\") pod \"glance-default-external-api-0\" (UID: 
\"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.374278 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cd58de31-5f82-4acb-8713-397027fbae4f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.376757 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c1efb870-06f3-40b8-baca-e418a034eaed-ceph\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.376991 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.377656 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-config-data\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.378305 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cd58de31-5f82-4acb-8713-397027fbae4f-ceph\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.379303 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.381632 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd58de31-5f82-4acb-8713-397027fbae4f-scripts\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.382437 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.384272 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" path="/var/lib/kubelet/pods/6c8b8de1-aa1f-41bf-b8b4-64216af62a1e/volumes" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.385308 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" 
path="/var/lib/kubelet/pods/f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8/volumes" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.390297 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1efb870-06f3-40b8-baca-e418a034eaed-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.391153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpjfn\" (UniqueName: \"kubernetes.io/projected/c1efb870-06f3-40b8-baca-e418a034eaed-kube-api-access-qpjfn\") pod \"glance-default-internal-api-0\" (UID: \"c1efb870-06f3-40b8-baca-e418a034eaed\") " pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.392599 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxrwl\" (UniqueName: \"kubernetes.io/projected/cd58de31-5f82-4acb-8713-397027fbae4f-kube-api-access-kxrwl\") pod \"glance-default-external-api-0\" (UID: \"cd58de31-5f82-4acb-8713-397027fbae4f\") " pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.408430 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.416408 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.972343 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86f4656c87-zrnj4" event={"ID":"c58da53b-fda0-486e-b7d0-d6b50bfc1b62","Type":"ContainerStarted","Data":"b0909c7e110caec9a5e1944f3751a491442563718d6ad89214ea7bdc75d423aa"} Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.973081 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-86f4656c87-zrnj4" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon-log" containerID="cri-o://3931ee1c99d16a879ccfc12229692a2bae1b407bce8d0e34d3d46a2ffcee39dc" gracePeriod=30 Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.973611 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-86f4656c87-zrnj4" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon" containerID="cri-o://b0909c7e110caec9a5e1944f3751a491442563718d6ad89214ea7bdc75d423aa" gracePeriod=30 Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.983170 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cfbd9754f-492lw" event={"ID":"89778e39-b609-494b-b2b2-aebf98447dd0","Type":"ContainerStarted","Data":"e6bcfb134bfcc69718a960dce95ed650e8b3cfcf3c24bcd14d2d103ff348f959"} Mar 08 21:11:55 crc kubenswrapper[4885]: I0308 21:11:55.989451 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dbdd8c5b9-56mvx" event={"ID":"4d4063e1-1725-43e5-bf87-422d8e4a0e5b","Type":"ContainerStarted","Data":"1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0"} Mar 08 21:11:56 crc kubenswrapper[4885]: I0308 21:11:56.009369 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-86f4656c87-zrnj4" podStartSLOduration=3.50976765 podStartE2EDuration="12.009348599s" podCreationTimestamp="2026-03-08 21:11:44 +0000 UTC" 
firstStartedPulling="2026-03-08 21:11:45.935161736 +0000 UTC m=+6007.331215759" lastFinishedPulling="2026-03-08 21:11:54.434742645 +0000 UTC m=+6015.830796708" observedRunningTime="2026-03-08 21:11:55.991702146 +0000 UTC m=+6017.387756159" watchObservedRunningTime="2026-03-08 21:11:56.009348599 +0000 UTC m=+6017.405402612" Mar 08 21:11:56 crc kubenswrapper[4885]: I0308 21:11:56.024162 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-cfbd9754f-492lw" podStartSLOduration=2.618126808 podStartE2EDuration="11.024141686s" podCreationTimestamp="2026-03-08 21:11:45 +0000 UTC" firstStartedPulling="2026-03-08 21:11:46.063066997 +0000 UTC m=+6007.459121020" lastFinishedPulling="2026-03-08 21:11:54.469081835 +0000 UTC m=+6015.865135898" observedRunningTime="2026-03-08 21:11:56.016657065 +0000 UTC m=+6017.412711078" watchObservedRunningTime="2026-03-08 21:11:56.024141686 +0000 UTC m=+6017.420195709" Mar 08 21:11:56 crc kubenswrapper[4885]: I0308 21:11:56.039189 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:56 crc kubenswrapper[4885]: I0308 21:11:56.039256 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:11:56 crc kubenswrapper[4885]: I0308 21:11:56.039315 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-dbdd8c5b9-56mvx" podStartSLOduration=3.141853761 podStartE2EDuration="11.039297732s" podCreationTimestamp="2026-03-08 21:11:45 +0000 UTC" firstStartedPulling="2026-03-08 21:11:46.559236032 +0000 UTC m=+6007.955290055" lastFinishedPulling="2026-03-08 21:11:54.456679963 +0000 UTC m=+6015.852734026" observedRunningTime="2026-03-08 21:11:56.039107577 +0000 UTC m=+6017.435161610" watchObservedRunningTime="2026-03-08 21:11:56.039297732 +0000 UTC m=+6017.435351755" Mar 08 21:11:56 crc kubenswrapper[4885]: W0308 21:11:56.056056 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd58de31_5f82_4acb_8713_397027fbae4f.slice/crio-95ed03836c302809eb4f2e57338885423ead13da66237ba50c0720d1bf8411c1 WatchSource:0}: Error finding container 95ed03836c302809eb4f2e57338885423ead13da66237ba50c0720d1bf8411c1: Status 404 returned error can't find the container with id 95ed03836c302809eb4f2e57338885423ead13da66237ba50c0720d1bf8411c1 Mar 08 21:11:56 crc kubenswrapper[4885]: I0308 21:11:56.076407 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 08 21:11:56 crc kubenswrapper[4885]: I0308 21:11:56.868153 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 08 21:11:56 crc kubenswrapper[4885]: W0308 21:11:56.873382 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1efb870_06f3_40b8_baca_e418a034eaed.slice/crio-1e7bc1b9d7482149b440e313e74f1ead03cb3a2f15b4b8229215c086b038effb WatchSource:0}: Error finding container 1e7bc1b9d7482149b440e313e74f1ead03cb3a2f15b4b8229215c086b038effb: Status 404 returned error can't find the container with id 1e7bc1b9d7482149b440e313e74f1ead03cb3a2f15b4b8229215c086b038effb Mar 08 21:11:57 crc kubenswrapper[4885]: I0308 21:11:57.007296 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"c1efb870-06f3-40b8-baca-e418a034eaed","Type":"ContainerStarted","Data":"1e7bc1b9d7482149b440e313e74f1ead03cb3a2f15b4b8229215c086b038effb"} Mar 08 21:11:57 crc kubenswrapper[4885]: I0308 21:11:57.009674 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cd58de31-5f82-4acb-8713-397027fbae4f","Type":"ContainerStarted","Data":"ff00fc7a32001b8b633a5ecb4d4224b2a0c15491ae5712a09f7af91239ca81bf"} Mar 08 21:11:57 crc kubenswrapper[4885]: I0308 21:11:57.009721 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cd58de31-5f82-4acb-8713-397027fbae4f","Type":"ContainerStarted","Data":"95ed03836c302809eb4f2e57338885423ead13da66237ba50c0720d1bf8411c1"} Mar 08 21:11:58 crc kubenswrapper[4885]: I0308 21:11:58.025492 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cd58de31-5f82-4acb-8713-397027fbae4f","Type":"ContainerStarted","Data":"7e19df0da4127f0cc92522bd74716ccab4a8247e1c70a6b805f7b3d0e7dd4f4d"} Mar 08 21:11:58 crc kubenswrapper[4885]: I0308 21:11:58.031871 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c1efb870-06f3-40b8-baca-e418a034eaed","Type":"ContainerStarted","Data":"30d1be3af3da185dd6d8f4b82fc8ebb16839644c9ecf7f387b6e417124e194a0"} Mar 08 21:11:58 crc kubenswrapper[4885]: I0308 21:11:58.031901 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c1efb870-06f3-40b8-baca-e418a034eaed","Type":"ContainerStarted","Data":"bca57804f3d85629f3ba71e2ac77cc012e945329d5dcd93c84975a1075cd1068"} Mar 08 21:11:58 crc kubenswrapper[4885]: I0308 21:11:58.080037 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.08001641 podStartE2EDuration="4.08001641s" podCreationTimestamp="2026-03-08 21:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:11:58.065360357 +0000 UTC m=+6019.461414390" watchObservedRunningTime="2026-03-08 21:11:58.08001641 +0000 UTC m=+6019.476070433" Mar 08 21:11:58 crc kubenswrapper[4885]: I0308 21:11:58.080895 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.080889513 podStartE2EDuration="4.080889513s" podCreationTimestamp="2026-03-08 21:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:11:58.046795419 +0000 UTC m=+6019.442849442" watchObservedRunningTime="2026-03-08 21:11:58.080889513 +0000 UTC m=+6019.476943536" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.148503 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550072-7xh8c"] Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.151342 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550072-7xh8c" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.153231 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.154318 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.156130 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.160334 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550072-7xh8c"] Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.274712 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hq2t\" (UniqueName: \"kubernetes.io/projected/4225aab0-53fa-4aa3-ac19-6827ea262916-kube-api-access-6hq2t\") pod \"auto-csr-approver-29550072-7xh8c\" (UID: \"4225aab0-53fa-4aa3-ac19-6827ea262916\") " pod="openshift-infra/auto-csr-approver-29550072-7xh8c" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.377452 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hq2t\" (UniqueName: \"kubernetes.io/projected/4225aab0-53fa-4aa3-ac19-6827ea262916-kube-api-access-6hq2t\") pod \"auto-csr-approver-29550072-7xh8c\" (UID: \"4225aab0-53fa-4aa3-ac19-6827ea262916\") " pod="openshift-infra/auto-csr-approver-29550072-7xh8c" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.411449 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hq2t\" (UniqueName: \"kubernetes.io/projected/4225aab0-53fa-4aa3-ac19-6827ea262916-kube-api-access-6hq2t\") pod \"auto-csr-approver-29550072-7xh8c\" (UID: \"4225aab0-53fa-4aa3-ac19-6827ea262916\") " pod="openshift-infra/auto-csr-approver-29550072-7xh8c" Mar 08 21:12:00 crc kubenswrapper[4885]: I0308 21:12:00.469175 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550072-7xh8c" Mar 08 21:12:01 crc kubenswrapper[4885]: W0308 21:12:01.014839 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4225aab0_53fa_4aa3_ac19_6827ea262916.slice/crio-a0985d21d3b19b49202374881f12c1ed2b2b428d566c4b4152de491be1467847 WatchSource:0}: Error finding container a0985d21d3b19b49202374881f12c1ed2b2b428d566c4b4152de491be1467847: Status 404 returned error can't find the container with id a0985d21d3b19b49202374881f12c1ed2b2b428d566c4b4152de491be1467847 Mar 08 21:12:01 crc kubenswrapper[4885]: I0308 21:12:01.014880 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550072-7xh8c"] Mar 08 21:12:01 crc kubenswrapper[4885]: I0308 21:12:01.070015 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550072-7xh8c" event={"ID":"4225aab0-53fa-4aa3-ac19-6827ea262916","Type":"ContainerStarted","Data":"a0985d21d3b19b49202374881f12c1ed2b2b428d566c4b4152de491be1467847"} Mar 08 21:12:02 crc kubenswrapper[4885]: I0308 21:12:02.818990 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:12:02 crc kubenswrapper[4885]: I0308 21:12:02.819468 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:12:03 crc kubenswrapper[4885]: I0308 21:12:03.106850 4885 generic.go:334] "Generic (PLEG): container finished" podID="4225aab0-53fa-4aa3-ac19-6827ea262916" containerID="37c658fc25b8a42ab3b33c1713dd08f3921d30fceba25de9d5cd0b6ec8c45fc8" exitCode=0 Mar 08 21:12:03 crc kubenswrapper[4885]: I0308 21:12:03.106914 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550072-7xh8c" event={"ID":"4225aab0-53fa-4aa3-ac19-6827ea262916","Type":"ContainerDied","Data":"37c658fc25b8a42ab3b33c1713dd08f3921d30fceba25de9d5cd0b6ec8c45fc8"} Mar 08 21:12:04 crc kubenswrapper[4885]: I0308 21:12:04.546722 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550072-7xh8c" Mar 08 21:12:04 crc kubenswrapper[4885]: I0308 21:12:04.681451 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hq2t\" (UniqueName: \"kubernetes.io/projected/4225aab0-53fa-4aa3-ac19-6827ea262916-kube-api-access-6hq2t\") pod \"4225aab0-53fa-4aa3-ac19-6827ea262916\" (UID: \"4225aab0-53fa-4aa3-ac19-6827ea262916\") " Mar 08 21:12:04 crc kubenswrapper[4885]: I0308 21:12:04.688060 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4225aab0-53fa-4aa3-ac19-6827ea262916-kube-api-access-6hq2t" (OuterVolumeSpecName: "kube-api-access-6hq2t") pod "4225aab0-53fa-4aa3-ac19-6827ea262916" (UID: "4225aab0-53fa-4aa3-ac19-6827ea262916"). InnerVolumeSpecName "kube-api-access-6hq2t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:12:04 crc kubenswrapper[4885]: I0308 21:12:04.785117 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hq2t\" (UniqueName: \"kubernetes.io/projected/4225aab0-53fa-4aa3-ac19-6827ea262916-kube-api-access-6hq2t\") on node \"crc\" DevicePath \"\"" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.132749 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550072-7xh8c" event={"ID":"4225aab0-53fa-4aa3-ac19-6827ea262916","Type":"ContainerDied","Data":"a0985d21d3b19b49202374881f12c1ed2b2b428d566c4b4152de491be1467847"} Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.132791 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0985d21d3b19b49202374881f12c1ed2b2b428d566c4b4152de491be1467847" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.132883 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550072-7xh8c" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.386903 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.408955 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.408988 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.418220 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.418252 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.453010 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.472147 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.473636 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.481842 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.481891 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.487117 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-cfbd9754f-492lw" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.157:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.157:8080: connect: connection refused" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.504987 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.691976 4885 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550066-6hpw5"] Mar 08 21:12:05 crc kubenswrapper[4885]: I0308 21:12:05.729103 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550066-6hpw5"] Mar 08 21:12:06 crc kubenswrapper[4885]: I0308 21:12:06.038658 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-dbdd8c5b9-56mvx" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.158:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.158:8080: connect: connection refused" Mar 08 21:12:06 crc kubenswrapper[4885]: I0308 21:12:06.146874 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 21:12:06 crc kubenswrapper[4885]: I0308 21:12:06.146959 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 21:12:06 crc kubenswrapper[4885]: I0308 21:12:06.146977 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 08 21:12:06 crc kubenswrapper[4885]: I0308 21:12:06.146988 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 08 21:12:06 crc kubenswrapper[4885]: I0308 21:12:06.680945 4885 scope.go:117] "RemoveContainer" containerID="b24462edde9f60cfa7555c270a546d933c909d490fc62394b9ed6e4a826084f2" Mar 08 21:12:07 crc kubenswrapper[4885]: I0308 21:12:07.384551 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddbd1248-e534-4251-b5a6-0505b7710e6e" path="/var/lib/kubelet/pods/ddbd1248-e534-4251-b5a6-0505b7710e6e/volumes" Mar 08 21:12:08 crc kubenswrapper[4885]: I0308 21:12:08.209102 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 21:12:08 crc kubenswrapper[4885]: I0308 21:12:08.209893 4885 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 08 21:12:08 crc kubenswrapper[4885]: I0308 21:12:08.234563 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 21:12:08 crc kubenswrapper[4885]: I0308 21:12:08.234638 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 08 21:12:08 crc kubenswrapper[4885]: I0308 21:12:08.238536 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 08 21:12:12 crc kubenswrapper[4885]: I0308 21:12:12.043174 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-2750-account-create-update-m8vl9"] Mar 08 21:12:12 crc kubenswrapper[4885]: I0308 21:12:12.056936 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-bd9sr"] Mar 08 21:12:12 crc kubenswrapper[4885]: I0308 21:12:12.065530 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-2750-account-create-update-m8vl9"] Mar 08 21:12:12 crc kubenswrapper[4885]: I0308 21:12:12.073669 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-bd9sr"] Mar 08 21:12:13 crc kubenswrapper[4885]: I0308 21:12:13.387720 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="379d344d-9828-4fae-a4f4-5712113f506d" 
path="/var/lib/kubelet/pods/379d344d-9828-4fae-a4f4-5712113f506d/volumes" Mar 08 21:12:13 crc kubenswrapper[4885]: I0308 21:12:13.389353 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df6b9c40-d823-4cb8-aadd-4f2aee7bd899" path="/var/lib/kubelet/pods/df6b9c40-d823-4cb8-aadd-4f2aee7bd899/volumes" Mar 08 21:12:17 crc kubenswrapper[4885]: I0308 21:12:17.276509 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:12:17 crc kubenswrapper[4885]: I0308 21:12:17.756657 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:12:18 crc kubenswrapper[4885]: I0308 21:12:18.940574 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:12:19 crc kubenswrapper[4885]: I0308 21:12:19.051907 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-lkccb"] Mar 08 21:12:19 crc kubenswrapper[4885]: I0308 21:12:19.063069 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-lkccb"] Mar 08 21:12:19 crc kubenswrapper[4885]: I0308 21:12:19.317099 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:12:19 crc kubenswrapper[4885]: I0308 21:12:19.397454 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca4bc30a-842b-45d6-8eb4-c964cbdd2c46" path="/var/lib/kubelet/pods/ca4bc30a-842b-45d6-8eb4-c964cbdd2c46/volumes" Mar 08 21:12:19 crc kubenswrapper[4885]: I0308 21:12:19.409775 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-cfbd9754f-492lw"] Mar 08 21:12:19 crc kubenswrapper[4885]: I0308 21:12:19.410069 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-cfbd9754f-492lw" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon-log" containerID="cri-o://71f6b4756a73f0398ecd6d1de73ef9818474e86bd61fdf4176acdd4f948e0e51" gracePeriod=30 Mar 08 21:12:19 crc kubenswrapper[4885]: I0308 21:12:19.410209 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-cfbd9754f-492lw" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon" containerID="cri-o://e6bcfb134bfcc69718a960dce95ed650e8b3cfcf3c24bcd14d2d103ff348f959" gracePeriod=30 Mar 08 21:12:21 crc kubenswrapper[4885]: I0308 21:12:21.171819 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.86:9292/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:12:21 crc kubenswrapper[4885]: I0308 21:12:21.171909 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="6c8b8de1-aa1f-41bf-b8b4-64216af62a1e" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.86:9292/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:12:23 crc kubenswrapper[4885]: I0308 21:12:23.237653 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-log" probeResult="failure" output="Get 
\"http://10.217.1.87:9292/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:12:23 crc kubenswrapper[4885]: I0308 21:12:23.237722 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="f82d5c8f-6a14-4b1b-9143-eb52cf7e67e8" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.87:9292/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:12:23 crc kubenswrapper[4885]: I0308 21:12:23.378882 4885 generic.go:334] "Generic (PLEG): container finished" podID="89778e39-b609-494b-b2b2-aebf98447dd0" containerID="e6bcfb134bfcc69718a960dce95ed650e8b3cfcf3c24bcd14d2d103ff348f959" exitCode=0 Mar 08 21:12:23 crc kubenswrapper[4885]: I0308 21:12:23.410587 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cfbd9754f-492lw" event={"ID":"89778e39-b609-494b-b2b2-aebf98447dd0","Type":"ContainerDied","Data":"e6bcfb134bfcc69718a960dce95ed650e8b3cfcf3c24bcd14d2d103ff348f959"} Mar 08 21:12:25 crc kubenswrapper[4885]: I0308 21:12:25.480600 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-cfbd9754f-492lw" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.157:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.157:8080: connect: connection refused" Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.442411 4885 generic.go:334] "Generic (PLEG): container finished" podID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerID="b0909c7e110caec9a5e1944f3751a491442563718d6ad89214ea7bdc75d423aa" exitCode=137 Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.442775 4885 generic.go:334] "Generic (PLEG): container finished" podID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerID="3931ee1c99d16a879ccfc12229692a2bae1b407bce8d0e34d3d46a2ffcee39dc" exitCode=137 Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.442499 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86f4656c87-zrnj4" event={"ID":"c58da53b-fda0-486e-b7d0-d6b50bfc1b62","Type":"ContainerDied","Data":"b0909c7e110caec9a5e1944f3751a491442563718d6ad89214ea7bdc75d423aa"} Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.442825 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86f4656c87-zrnj4" event={"ID":"c58da53b-fda0-486e-b7d0-d6b50bfc1b62","Type":"ContainerDied","Data":"3931ee1c99d16a879ccfc12229692a2bae1b407bce8d0e34d3d46a2ffcee39dc"} Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.442838 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-86f4656c87-zrnj4" event={"ID":"c58da53b-fda0-486e-b7d0-d6b50bfc1b62","Type":"ContainerDied","Data":"3192252c7ddb8920e2f562a80b90e4b4271792e7b8e099a5857dac94813f27a8"} Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.442848 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3192252c7ddb8920e2f562a80b90e4b4271792e7b8e099a5857dac94813f27a8" Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.506859 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.632005 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-config-data\") pod \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.632212 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-horizon-secret-key\") pod \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.632291 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-logs\") pod \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.632329 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-scripts\") pod \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.632347 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmnvn\" (UniqueName: \"kubernetes.io/projected/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-kube-api-access-xmnvn\") pod \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\" (UID: \"c58da53b-fda0-486e-b7d0-d6b50bfc1b62\") " Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.635514 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-logs" (OuterVolumeSpecName: "logs") pod "c58da53b-fda0-486e-b7d0-d6b50bfc1b62" (UID: "c58da53b-fda0-486e-b7d0-d6b50bfc1b62"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.640218 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c58da53b-fda0-486e-b7d0-d6b50bfc1b62" (UID: "c58da53b-fda0-486e-b7d0-d6b50bfc1b62"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.652169 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-kube-api-access-xmnvn" (OuterVolumeSpecName: "kube-api-access-xmnvn") pod "c58da53b-fda0-486e-b7d0-d6b50bfc1b62" (UID: "c58da53b-fda0-486e-b7d0-d6b50bfc1b62"). InnerVolumeSpecName "kube-api-access-xmnvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.662149 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-config-data" (OuterVolumeSpecName: "config-data") pod "c58da53b-fda0-486e-b7d0-d6b50bfc1b62" (UID: "c58da53b-fda0-486e-b7d0-d6b50bfc1b62"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.664287 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-scripts" (OuterVolumeSpecName: "scripts") pod "c58da53b-fda0-486e-b7d0-d6b50bfc1b62" (UID: "c58da53b-fda0-486e-b7d0-d6b50bfc1b62"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.734025 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.734060 4885 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.734073 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.734081 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:12:26 crc kubenswrapper[4885]: I0308 21:12:26.734090 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmnvn\" (UniqueName: \"kubernetes.io/projected/c58da53b-fda0-486e-b7d0-d6b50bfc1b62-kube-api-access-xmnvn\") on node \"crc\" DevicePath \"\"" Mar 08 21:12:27 crc kubenswrapper[4885]: I0308 21:12:27.455588 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-86f4656c87-zrnj4" Mar 08 21:12:27 crc kubenswrapper[4885]: I0308 21:12:27.496878 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-86f4656c87-zrnj4"] Mar 08 21:12:27 crc kubenswrapper[4885]: I0308 21:12:27.509834 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-86f4656c87-zrnj4"] Mar 08 21:12:29 crc kubenswrapper[4885]: I0308 21:12:29.387361 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" path="/var/lib/kubelet/pods/c58da53b-fda0-486e-b7d0-d6b50bfc1b62/volumes" Mar 08 21:12:32 crc kubenswrapper[4885]: I0308 21:12:32.818392 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:12:32 crc kubenswrapper[4885]: I0308 21:12:32.819098 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:12:35 crc kubenswrapper[4885]: I0308 21:12:35.480198 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-cfbd9754f-492lw" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.157:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.157:8080: connect: connection refused" Mar 08 21:12:45 crc kubenswrapper[4885]: I0308 21:12:45.480519 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-cfbd9754f-492lw" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.157:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.157:8080: connect: connection refused" Mar 08 21:12:45 crc kubenswrapper[4885]: I0308 21:12:45.481513 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:12:47 crc kubenswrapper[4885]: I0308 21:12:47.061889 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-6ln6s"] Mar 08 21:12:47 crc kubenswrapper[4885]: I0308 21:12:47.078227 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b046-account-create-update-7xtx9"] Mar 08 21:12:47 crc kubenswrapper[4885]: I0308 21:12:47.087305 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-6ln6s"] Mar 08 21:12:47 crc kubenswrapper[4885]: I0308 21:12:47.096299 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b046-account-create-update-7xtx9"] Mar 08 21:12:47 crc kubenswrapper[4885]: I0308 21:12:47.381374 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e0914c1-cac8-4c2d-bbe4-615218170f10" path="/var/lib/kubelet/pods/0e0914c1-cac8-4c2d-bbe4-615218170f10/volumes" Mar 08 21:12:47 crc kubenswrapper[4885]: I0308 21:12:47.382433 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6ea4544-00f0-4646-a598-1efa92af4e49" path="/var/lib/kubelet/pods/d6ea4544-00f0-4646-a598-1efa92af4e49/volumes" Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 
21:12:49.719139 4885 generic.go:334] "Generic (PLEG): container finished" podID="89778e39-b609-494b-b2b2-aebf98447dd0" containerID="71f6b4756a73f0398ecd6d1de73ef9818474e86bd61fdf4176acdd4f948e0e51" exitCode=137 Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.719181 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cfbd9754f-492lw" event={"ID":"89778e39-b609-494b-b2b2-aebf98447dd0","Type":"ContainerDied","Data":"71f6b4756a73f0398ecd6d1de73ef9818474e86bd61fdf4176acdd4f948e0e51"} Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.880456 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.990544 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm7hl\" (UniqueName: \"kubernetes.io/projected/89778e39-b609-494b-b2b2-aebf98447dd0-kube-api-access-pm7hl\") pod \"89778e39-b609-494b-b2b2-aebf98447dd0\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.990652 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-config-data\") pod \"89778e39-b609-494b-b2b2-aebf98447dd0\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.990708 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-scripts\") pod \"89778e39-b609-494b-b2b2-aebf98447dd0\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.990733 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89778e39-b609-494b-b2b2-aebf98447dd0-horizon-secret-key\") pod \"89778e39-b609-494b-b2b2-aebf98447dd0\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.990798 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89778e39-b609-494b-b2b2-aebf98447dd0-logs\") pod \"89778e39-b609-494b-b2b2-aebf98447dd0\" (UID: \"89778e39-b609-494b-b2b2-aebf98447dd0\") " Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.991471 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89778e39-b609-494b-b2b2-aebf98447dd0-logs" (OuterVolumeSpecName: "logs") pod "89778e39-b609-494b-b2b2-aebf98447dd0" (UID: "89778e39-b609-494b-b2b2-aebf98447dd0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:12:49 crc kubenswrapper[4885]: I0308 21:12:49.997263 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89778e39-b609-494b-b2b2-aebf98447dd0-kube-api-access-pm7hl" (OuterVolumeSpecName: "kube-api-access-pm7hl") pod "89778e39-b609-494b-b2b2-aebf98447dd0" (UID: "89778e39-b609-494b-b2b2-aebf98447dd0"). InnerVolumeSpecName "kube-api-access-pm7hl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.014744 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-scripts" (OuterVolumeSpecName: "scripts") pod "89778e39-b609-494b-b2b2-aebf98447dd0" (UID: "89778e39-b609-494b-b2b2-aebf98447dd0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.022674 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89778e39-b609-494b-b2b2-aebf98447dd0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "89778e39-b609-494b-b2b2-aebf98447dd0" (UID: "89778e39-b609-494b-b2b2-aebf98447dd0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.023886 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-config-data" (OuterVolumeSpecName: "config-data") pod "89778e39-b609-494b-b2b2-aebf98447dd0" (UID: "89778e39-b609-494b-b2b2-aebf98447dd0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.092450 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm7hl\" (UniqueName: \"kubernetes.io/projected/89778e39-b609-494b-b2b2-aebf98447dd0-kube-api-access-pm7hl\") on node \"crc\" DevicePath \"\"" Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.092484 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.092499 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/89778e39-b609-494b-b2b2-aebf98447dd0-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.092512 4885 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/89778e39-b609-494b-b2b2-aebf98447dd0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.092523 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89778e39-b609-494b-b2b2-aebf98447dd0-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.733962 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-cfbd9754f-492lw" event={"ID":"89778e39-b609-494b-b2b2-aebf98447dd0","Type":"ContainerDied","Data":"e4a35d78251bc7cb6befb4354f6fb7487a2a564eecb0e58296c981c935357145"} Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.734024 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-cfbd9754f-492lw" Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.734048 4885 scope.go:117] "RemoveContainer" containerID="e6bcfb134bfcc69718a960dce95ed650e8b3cfcf3c24bcd14d2d103ff348f959" Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.783795 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-cfbd9754f-492lw"] Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.792664 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-cfbd9754f-492lw"] Mar 08 21:12:50 crc kubenswrapper[4885]: E0308 21:12:50.920545 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89778e39_b609_494b_b2b2_aebf98447dd0.slice/crio-e4a35d78251bc7cb6befb4354f6fb7487a2a564eecb0e58296c981c935357145\": RecentStats: unable to find data in memory cache]" Mar 08 21:12:50 crc kubenswrapper[4885]: I0308 21:12:50.981028 4885 scope.go:117] "RemoveContainer" containerID="71f6b4756a73f0398ecd6d1de73ef9818474e86bd61fdf4176acdd4f948e0e51" Mar 08 21:12:51 crc kubenswrapper[4885]: I0308 21:12:51.386305 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" path="/var/lib/kubelet/pods/89778e39-b609-494b-b2b2-aebf98447dd0/volumes" Mar 08 21:12:56 crc kubenswrapper[4885]: I0308 21:12:56.049798 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-th6xc"] Mar 08 21:12:56 crc kubenswrapper[4885]: I0308 21:12:56.071094 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-th6xc"] Mar 08 21:12:57 crc kubenswrapper[4885]: I0308 21:12:57.386006 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faeab210-5195-4d9a-a17e-5aed2f14dc68" path="/var/lib/kubelet/pods/faeab210-5195-4d9a-a17e-5aed2f14dc68/volumes" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.521824 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5b7cfb69fc-bhpx4"] Mar 08 21:13:02 crc kubenswrapper[4885]: E0308 21:13:02.524615 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.524746 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon" Mar 08 21:13:02 crc kubenswrapper[4885]: E0308 21:13:02.524881 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4225aab0-53fa-4aa3-ac19-6827ea262916" containerName="oc" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.525018 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4225aab0-53fa-4aa3-ac19-6827ea262916" containerName="oc" Mar 08 21:13:02 crc kubenswrapper[4885]: E0308 21:13:02.525142 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon-log" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.525257 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon-log" Mar 08 21:13:02 crc kubenswrapper[4885]: E0308 21:13:02.525373 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.525467 4885 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon" Mar 08 21:13:02 crc kubenswrapper[4885]: E0308 21:13:02.525581 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon-log" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.525680 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon-log" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.526400 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4225aab0-53fa-4aa3-ac19-6827ea262916" containerName="oc" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.526540 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon-log" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.526685 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon-log" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.526805 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58da53b-fda0-486e-b7d0-d6b50bfc1b62" containerName="horizon" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.526909 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="89778e39-b609-494b-b2b2-aebf98447dd0" containerName="horizon" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.528779 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.548115 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b7cfb69fc-bhpx4"] Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.596069 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jlxh\" (UniqueName: \"kubernetes.io/projected/f24559d3-3f44-434a-b790-32c52475d532-kube-api-access-7jlxh\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.596171 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f24559d3-3f44-434a-b790-32c52475d532-scripts\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.596219 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f24559d3-3f44-434a-b790-32c52475d532-config-data\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.596304 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f24559d3-3f44-434a-b790-32c52475d532-horizon-secret-key\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.596455 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24559d3-3f44-434a-b790-32c52475d532-logs\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.697934 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jlxh\" (UniqueName: \"kubernetes.io/projected/f24559d3-3f44-434a-b790-32c52475d532-kube-api-access-7jlxh\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.698008 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f24559d3-3f44-434a-b790-32c52475d532-scripts\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.698040 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f24559d3-3f44-434a-b790-32c52475d532-config-data\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.698090 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f24559d3-3f44-434a-b790-32c52475d532-horizon-secret-key\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.698162 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24559d3-3f44-434a-b790-32c52475d532-logs\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.698856 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f24559d3-3f44-434a-b790-32c52475d532-logs\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.699028 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f24559d3-3f44-434a-b790-32c52475d532-scripts\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.699534 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f24559d3-3f44-434a-b790-32c52475d532-config-data\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.704484 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f24559d3-3f44-434a-b790-32c52475d532-horizon-secret-key\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " 
pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.730729 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jlxh\" (UniqueName: \"kubernetes.io/projected/f24559d3-3f44-434a-b790-32c52475d532-kube-api-access-7jlxh\") pod \"horizon-5b7cfb69fc-bhpx4\" (UID: \"f24559d3-3f44-434a-b790-32c52475d532\") " pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.839971 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.840027 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.840084 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.840882 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.840961 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" gracePeriod=600 Mar 08 21:13:02 crc kubenswrapper[4885]: I0308 21:13:02.851449 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:03 crc kubenswrapper[4885]: E0308 21:13:03.001911 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.336652 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5b7cfb69fc-bhpx4"] Mar 08 21:13:03 crc kubenswrapper[4885]: W0308 21:13:03.338263 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf24559d3_3f44_434a_b790_32c52475d532.slice/crio-1165389af88d1d8eab2ddbc795533660ee3ebbae487f8979b787bcd281da821d WatchSource:0}: Error finding container 1165389af88d1d8eab2ddbc795533660ee3ebbae487f8979b787bcd281da821d: Status 404 returned error can't find the container with id 1165389af88d1d8eab2ddbc795533660ee3ebbae487f8979b787bcd281da821d Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.811231 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-rstwd"] Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.814006 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-rstwd" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.819172 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-rstwd"] Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.819760 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b956841b-a9a1-4c38-99e9-05c6e5f9f363-operator-scripts\") pod \"heat-db-create-rstwd\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " pod="openstack/heat-db-create-rstwd" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.819843 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-668g2\" (UniqueName: \"kubernetes.io/projected/b956841b-a9a1-4c38-99e9-05c6e5f9f363-kube-api-access-668g2\") pod \"heat-db-create-rstwd\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " pod="openstack/heat-db-create-rstwd" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.896664 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-6865-account-create-update-5p2d8"] Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.898049 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.901873 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.905002 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7cfb69fc-bhpx4" event={"ID":"f24559d3-3f44-434a-b790-32c52475d532","Type":"ContainerStarted","Data":"7de957ed968b504a9b250ecaaa70845cef18ee78b3fab7c8c4348d16ba4ab56c"} Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.905030 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7cfb69fc-bhpx4" event={"ID":"f24559d3-3f44-434a-b790-32c52475d532","Type":"ContainerStarted","Data":"2825c1cab2f6626bb6e17a99f19c671d0f0d4d3ae3037cdf4c42e846c7933c14"} Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.905041 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5b7cfb69fc-bhpx4" event={"ID":"f24559d3-3f44-434a-b790-32c52475d532","Type":"ContainerStarted","Data":"1165389af88d1d8eab2ddbc795533660ee3ebbae487f8979b787bcd281da821d"} Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.906146 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-6865-account-create-update-5p2d8"] Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.910006 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" exitCode=0 Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.910062 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9"} Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.910101 4885 scope.go:117] "RemoveContainer" containerID="f2cd11e6d776229da9098efb4d94ce67906e2c52e2199ae80ec12db171f7eadf" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.911022 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:13:03 crc kubenswrapper[4885]: E0308 21:13:03.911315 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.920994 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b956841b-a9a1-4c38-99e9-05c6e5f9f363-operator-scripts\") pod \"heat-db-create-rstwd\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " pod="openstack/heat-db-create-rstwd" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.921075 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj2zw\" (UniqueName: \"kubernetes.io/projected/49449c65-7a7c-437f-b4d9-23b2a219485f-kube-api-access-jj2zw\") pod \"heat-6865-account-create-update-5p2d8\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " 
pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.921137 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-668g2\" (UniqueName: \"kubernetes.io/projected/b956841b-a9a1-4c38-99e9-05c6e5f9f363-kube-api-access-668g2\") pod \"heat-db-create-rstwd\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " pod="openstack/heat-db-create-rstwd" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.921202 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49449c65-7a7c-437f-b4d9-23b2a219485f-operator-scripts\") pod \"heat-6865-account-create-update-5p2d8\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.922818 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b956841b-a9a1-4c38-99e9-05c6e5f9f363-operator-scripts\") pod \"heat-db-create-rstwd\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " pod="openstack/heat-db-create-rstwd" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.936380 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5b7cfb69fc-bhpx4" podStartSLOduration=1.93636371 podStartE2EDuration="1.93636371s" podCreationTimestamp="2026-03-08 21:13:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:13:03.93261554 +0000 UTC m=+6085.328669563" watchObservedRunningTime="2026-03-08 21:13:03.93636371 +0000 UTC m=+6085.332417733" Mar 08 21:13:03 crc kubenswrapper[4885]: I0308 21:13:03.948418 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-668g2\" (UniqueName: \"kubernetes.io/projected/b956841b-a9a1-4c38-99e9-05c6e5f9f363-kube-api-access-668g2\") pod \"heat-db-create-rstwd\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " pod="openstack/heat-db-create-rstwd" Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.022639 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49449c65-7a7c-437f-b4d9-23b2a219485f-operator-scripts\") pod \"heat-6865-account-create-update-5p2d8\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.022766 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj2zw\" (UniqueName: \"kubernetes.io/projected/49449c65-7a7c-437f-b4d9-23b2a219485f-kube-api-access-jj2zw\") pod \"heat-6865-account-create-update-5p2d8\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.023650 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49449c65-7a7c-437f-b4d9-23b2a219485f-operator-scripts\") pod \"heat-6865-account-create-update-5p2d8\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.046262 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jj2zw\" (UniqueName: \"kubernetes.io/projected/49449c65-7a7c-437f-b4d9-23b2a219485f-kube-api-access-jj2zw\") pod \"heat-6865-account-create-update-5p2d8\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.143775 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-rstwd" Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.307389 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.509454 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-rstwd"] Mar 08 21:13:04 crc kubenswrapper[4885]: W0308 21:13:04.834944 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49449c65_7a7c_437f_b4d9_23b2a219485f.slice/crio-2cc4775ee08eed7d39f8be982b9ca6f7f48f1e4e6afe3854394f9b6b73790fd2 WatchSource:0}: Error finding container 2cc4775ee08eed7d39f8be982b9ca6f7f48f1e4e6afe3854394f9b6b73790fd2: Status 404 returned error can't find the container with id 2cc4775ee08eed7d39f8be982b9ca6f7f48f1e4e6afe3854394f9b6b73790fd2 Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.844894 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-6865-account-create-update-5p2d8"] Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.922125 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-6865-account-create-update-5p2d8" event={"ID":"49449c65-7a7c-437f-b4d9-23b2a219485f","Type":"ContainerStarted","Data":"2cc4775ee08eed7d39f8be982b9ca6f7f48f1e4e6afe3854394f9b6b73790fd2"} Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.924703 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-rstwd" event={"ID":"b956841b-a9a1-4c38-99e9-05c6e5f9f363","Type":"ContainerStarted","Data":"9a171ca46b7a7190a2333a95503ee285e088cd83dd56cbc13cb8f2021b946782"} Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.924749 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-rstwd" event={"ID":"b956841b-a9a1-4c38-99e9-05c6e5f9f363","Type":"ContainerStarted","Data":"0b52c9b69df82d97be5873b32fca34b5609792286e11cbfa2932f9a0a474109a"} Mar 08 21:13:04 crc kubenswrapper[4885]: I0308 21:13:04.943061 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-create-rstwd" podStartSLOduration=1.943046512 podStartE2EDuration="1.943046512s" podCreationTimestamp="2026-03-08 21:13:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:13:04.939528778 +0000 UTC m=+6086.335582801" watchObservedRunningTime="2026-03-08 21:13:04.943046512 +0000 UTC m=+6086.339100535" Mar 08 21:13:05 crc kubenswrapper[4885]: I0308 21:13:05.961008 4885 generic.go:334] "Generic (PLEG): container finished" podID="49449c65-7a7c-437f-b4d9-23b2a219485f" containerID="d70a935397b593663bdf26afd9e76f5f57ebfae75c4ef218c57dc585c1689e21" exitCode=0 Mar 08 21:13:05 crc kubenswrapper[4885]: I0308 21:13:05.961140 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-6865-account-create-update-5p2d8" 
event={"ID":"49449c65-7a7c-437f-b4d9-23b2a219485f","Type":"ContainerDied","Data":"d70a935397b593663bdf26afd9e76f5f57ebfae75c4ef218c57dc585c1689e21"} Mar 08 21:13:05 crc kubenswrapper[4885]: I0308 21:13:05.968070 4885 generic.go:334] "Generic (PLEG): container finished" podID="b956841b-a9a1-4c38-99e9-05c6e5f9f363" containerID="9a171ca46b7a7190a2333a95503ee285e088cd83dd56cbc13cb8f2021b946782" exitCode=0 Mar 08 21:13:05 crc kubenswrapper[4885]: I0308 21:13:05.968117 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-rstwd" event={"ID":"b956841b-a9a1-4c38-99e9-05c6e5f9f363","Type":"ContainerDied","Data":"9a171ca46b7a7190a2333a95503ee285e088cd83dd56cbc13cb8f2021b946782"} Mar 08 21:13:06 crc kubenswrapper[4885]: I0308 21:13:06.876606 4885 scope.go:117] "RemoveContainer" containerID="e6faf1d3d8e8aff220f85b274a14bd4ce7db4420e55b8b8af43285742e4b286e" Mar 08 21:13:06 crc kubenswrapper[4885]: I0308 21:13:06.914536 4885 scope.go:117] "RemoveContainer" containerID="005e15265fa043b9b659e044fe35d74117bca8b49d4e6e5ad4ce0be3aeda6fee" Mar 08 21:13:06 crc kubenswrapper[4885]: I0308 21:13:06.979692 4885 scope.go:117] "RemoveContainer" containerID="f4d740c9938b3b085cc1665a4c48f0e8e5909dace559f7eedf545ed929b6ffde" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.081693 4885 scope.go:117] "RemoveContainer" containerID="bbb04e31c22cb5e6b28c152a2e21dcf4858fe3b83e03435366a3a4cecdd397ef" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.149717 4885 scope.go:117] "RemoveContainer" containerID="5238776febd95a86282109260190e0f71b38f87e63e8af8a383a946420238586" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.181282 4885 scope.go:117] "RemoveContainer" containerID="0976abd329fd9ab97b11b664eb364407380933db3aade963696b316b7306d1fa" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.235951 4885 scope.go:117] "RemoveContainer" containerID="71bb049c2d9773b9c9e48cbd2812e843fb5d6ca86b0d975e407dfe49238257fc" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.349966 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-rstwd" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.426457 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.506690 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-668g2\" (UniqueName: \"kubernetes.io/projected/b956841b-a9a1-4c38-99e9-05c6e5f9f363-kube-api-access-668g2\") pod \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.507650 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b956841b-a9a1-4c38-99e9-05c6e5f9f363-operator-scripts\") pod \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\" (UID: \"b956841b-a9a1-4c38-99e9-05c6e5f9f363\") " Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.508233 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b956841b-a9a1-4c38-99e9-05c6e5f9f363-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b956841b-a9a1-4c38-99e9-05c6e5f9f363" (UID: "b956841b-a9a1-4c38-99e9-05c6e5f9f363"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.508734 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b956841b-a9a1-4c38-99e9-05c6e5f9f363-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.511635 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b956841b-a9a1-4c38-99e9-05c6e5f9f363-kube-api-access-668g2" (OuterVolumeSpecName: "kube-api-access-668g2") pod "b956841b-a9a1-4c38-99e9-05c6e5f9f363" (UID: "b956841b-a9a1-4c38-99e9-05c6e5f9f363"). InnerVolumeSpecName "kube-api-access-668g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.610557 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj2zw\" (UniqueName: \"kubernetes.io/projected/49449c65-7a7c-437f-b4d9-23b2a219485f-kube-api-access-jj2zw\") pod \"49449c65-7a7c-437f-b4d9-23b2a219485f\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.610714 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49449c65-7a7c-437f-b4d9-23b2a219485f-operator-scripts\") pod \"49449c65-7a7c-437f-b4d9-23b2a219485f\" (UID: \"49449c65-7a7c-437f-b4d9-23b2a219485f\") " Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.611399 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49449c65-7a7c-437f-b4d9-23b2a219485f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "49449c65-7a7c-437f-b4d9-23b2a219485f" (UID: "49449c65-7a7c-437f-b4d9-23b2a219485f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.614235 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49449c65-7a7c-437f-b4d9-23b2a219485f-kube-api-access-jj2zw" (OuterVolumeSpecName: "kube-api-access-jj2zw") pod "49449c65-7a7c-437f-b4d9-23b2a219485f" (UID: "49449c65-7a7c-437f-b4d9-23b2a219485f"). InnerVolumeSpecName "kube-api-access-jj2zw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.620160 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-668g2\" (UniqueName: \"kubernetes.io/projected/b956841b-a9a1-4c38-99e9-05c6e5f9f363-kube-api-access-668g2\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.722427 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/49449c65-7a7c-437f-b4d9-23b2a219485f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:07 crc kubenswrapper[4885]: I0308 21:13:07.722478 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj2zw\" (UniqueName: \"kubernetes.io/projected/49449c65-7a7c-437f-b4d9-23b2a219485f-kube-api-access-jj2zw\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:08 crc kubenswrapper[4885]: I0308 21:13:08.060393 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-rstwd" event={"ID":"b956841b-a9a1-4c38-99e9-05c6e5f9f363","Type":"ContainerDied","Data":"0b52c9b69df82d97be5873b32fca34b5609792286e11cbfa2932f9a0a474109a"} Mar 08 21:13:08 crc kubenswrapper[4885]: I0308 21:13:08.060649 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b52c9b69df82d97be5873b32fca34b5609792286e11cbfa2932f9a0a474109a" Mar 08 21:13:08 crc kubenswrapper[4885]: I0308 21:13:08.060751 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-rstwd" Mar 08 21:13:08 crc kubenswrapper[4885]: I0308 21:13:08.106247 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-6865-account-create-update-5p2d8" event={"ID":"49449c65-7a7c-437f-b4d9-23b2a219485f","Type":"ContainerDied","Data":"2cc4775ee08eed7d39f8be982b9ca6f7f48f1e4e6afe3854394f9b6b73790fd2"} Mar 08 21:13:08 crc kubenswrapper[4885]: I0308 21:13:08.106303 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cc4775ee08eed7d39f8be982b9ca6f7f48f1e4e6afe3854394f9b6b73790fd2" Mar 08 21:13:08 crc kubenswrapper[4885]: I0308 21:13:08.106317 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-6865-account-create-update-5p2d8" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.033165 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-fs9dx"] Mar 08 21:13:09 crc kubenswrapper[4885]: E0308 21:13:09.033706 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b956841b-a9a1-4c38-99e9-05c6e5f9f363" containerName="mariadb-database-create" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.033734 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b956841b-a9a1-4c38-99e9-05c6e5f9f363" containerName="mariadb-database-create" Mar 08 21:13:09 crc kubenswrapper[4885]: E0308 21:13:09.033772 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49449c65-7a7c-437f-b4d9-23b2a219485f" containerName="mariadb-account-create-update" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.033784 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="49449c65-7a7c-437f-b4d9-23b2a219485f" containerName="mariadb-account-create-update" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.034146 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="49449c65-7a7c-437f-b4d9-23b2a219485f" containerName="mariadb-account-create-update" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.034193 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b956841b-a9a1-4c38-99e9-05c6e5f9f363" containerName="mariadb-database-create" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.035237 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.038208 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.038720 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-2n2s7" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.050769 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-fs9dx"] Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.163432 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-combined-ca-bundle\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.163511 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7n26\" (UniqueName: \"kubernetes.io/projected/495d39cf-6a4d-4ca0-90b6-9a22323d1568-kube-api-access-v7n26\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.163575 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-config-data\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.265542 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-combined-ca-bundle\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.265621 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7n26\" (UniqueName: \"kubernetes.io/projected/495d39cf-6a4d-4ca0-90b6-9a22323d1568-kube-api-access-v7n26\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.265688 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-config-data\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.272823 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-combined-ca-bundle\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.272990 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-config-data\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.283514 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7n26\" (UniqueName: \"kubernetes.io/projected/495d39cf-6a4d-4ca0-90b6-9a22323d1568-kube-api-access-v7n26\") pod \"heat-db-sync-fs9dx\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.367006 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:09 crc kubenswrapper[4885]: I0308 21:13:09.892417 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-fs9dx"] Mar 08 21:13:09 crc kubenswrapper[4885]: W0308 21:13:09.903477 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod495d39cf_6a4d_4ca0_90b6_9a22323d1568.slice/crio-6aac3196b5f7ae47a15b2efcb4dde28177fa458b581fbd72f2cbb909d4173d5a WatchSource:0}: Error finding container 6aac3196b5f7ae47a15b2efcb4dde28177fa458b581fbd72f2cbb909d4173d5a: Status 404 returned error can't find the container with id 6aac3196b5f7ae47a15b2efcb4dde28177fa458b581fbd72f2cbb909d4173d5a Mar 08 21:13:10 crc kubenswrapper[4885]: I0308 21:13:10.137595 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fs9dx" event={"ID":"495d39cf-6a4d-4ca0-90b6-9a22323d1568","Type":"ContainerStarted","Data":"6aac3196b5f7ae47a15b2efcb4dde28177fa458b581fbd72f2cbb909d4173d5a"} Mar 08 21:13:12 crc kubenswrapper[4885]: I0308 21:13:12.852161 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:12 crc kubenswrapper[4885]: I0308 21:13:12.852894 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:15 crc kubenswrapper[4885]: I0308 21:13:15.370605 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:13:15 crc kubenswrapper[4885]: E0308 21:13:15.371513 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:13:19 crc kubenswrapper[4885]: I0308 21:13:19.271163 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fs9dx" event={"ID":"495d39cf-6a4d-4ca0-90b6-9a22323d1568","Type":"ContainerStarted","Data":"27c7130d460aa9b10cdac4b0fbc1bf5fbafc6534f511b76384c6bd5cf7ea008a"} Mar 08 21:13:19 crc kubenswrapper[4885]: I0308 21:13:19.313105 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-fs9dx" podStartSLOduration=2.581579234 podStartE2EDuration="11.313078524s" podCreationTimestamp="2026-03-08 21:13:08 +0000 UTC" firstStartedPulling="2026-03-08 21:13:09.907408653 +0000 UTC m=+6091.303462686" lastFinishedPulling="2026-03-08 21:13:18.638907943 +0000 UTC m=+6100.034961976" observedRunningTime="2026-03-08 21:13:19.294734772 +0000 UTC m=+6100.690788815" watchObservedRunningTime="2026-03-08 21:13:19.313078524 +0000 UTC m=+6100.709132577" Mar 08 21:13:21 crc kubenswrapper[4885]: I0308 21:13:21.303856 4885 generic.go:334] "Generic (PLEG): container finished" podID="495d39cf-6a4d-4ca0-90b6-9a22323d1568" containerID="27c7130d460aa9b10cdac4b0fbc1bf5fbafc6534f511b76384c6bd5cf7ea008a" exitCode=0 Mar 08 21:13:21 crc kubenswrapper[4885]: I0308 21:13:21.303974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fs9dx" 
event={"ID":"495d39cf-6a4d-4ca0-90b6-9a22323d1568","Type":"ContainerDied","Data":"27c7130d460aa9b10cdac4b0fbc1bf5fbafc6534f511b76384c6bd5cf7ea008a"} Mar 08 21:13:22 crc kubenswrapper[4885]: I0308 21:13:22.816064 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:22 crc kubenswrapper[4885]: I0308 21:13:22.931488 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-combined-ca-bundle\") pod \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " Mar 08 21:13:22 crc kubenswrapper[4885]: I0308 21:13:22.931618 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-config-data\") pod \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " Mar 08 21:13:22 crc kubenswrapper[4885]: I0308 21:13:22.931673 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7n26\" (UniqueName: \"kubernetes.io/projected/495d39cf-6a4d-4ca0-90b6-9a22323d1568-kube-api-access-v7n26\") pod \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\" (UID: \"495d39cf-6a4d-4ca0-90b6-9a22323d1568\") " Mar 08 21:13:22 crc kubenswrapper[4885]: I0308 21:13:22.939404 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/495d39cf-6a4d-4ca0-90b6-9a22323d1568-kube-api-access-v7n26" (OuterVolumeSpecName: "kube-api-access-v7n26") pod "495d39cf-6a4d-4ca0-90b6-9a22323d1568" (UID: "495d39cf-6a4d-4ca0-90b6-9a22323d1568"). InnerVolumeSpecName "kube-api-access-v7n26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:13:22 crc kubenswrapper[4885]: I0308 21:13:22.959542 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "495d39cf-6a4d-4ca0-90b6-9a22323d1568" (UID: "495d39cf-6a4d-4ca0-90b6-9a22323d1568"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:13:23 crc kubenswrapper[4885]: I0308 21:13:23.034745 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:23 crc kubenswrapper[4885]: I0308 21:13:23.034783 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7n26\" (UniqueName: \"kubernetes.io/projected/495d39cf-6a4d-4ca0-90b6-9a22323d1568-kube-api-access-v7n26\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:23 crc kubenswrapper[4885]: I0308 21:13:23.050351 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-config-data" (OuterVolumeSpecName: "config-data") pod "495d39cf-6a4d-4ca0-90b6-9a22323d1568" (UID: "495d39cf-6a4d-4ca0-90b6-9a22323d1568"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:13:23 crc kubenswrapper[4885]: I0308 21:13:23.136781 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/495d39cf-6a4d-4ca0-90b6-9a22323d1568-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:23 crc kubenswrapper[4885]: I0308 21:13:23.332985 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-fs9dx" event={"ID":"495d39cf-6a4d-4ca0-90b6-9a22323d1568","Type":"ContainerDied","Data":"6aac3196b5f7ae47a15b2efcb4dde28177fa458b581fbd72f2cbb909d4173d5a"} Mar 08 21:13:23 crc kubenswrapper[4885]: I0308 21:13:23.333044 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aac3196b5f7ae47a15b2efcb4dde28177fa458b581fbd72f2cbb909d4173d5a" Mar 08 21:13:23 crc kubenswrapper[4885]: I0308 21:13:23.333116 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-fs9dx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.577730 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-595686fb49-hx4rx"] Mar 08 21:13:24 crc kubenswrapper[4885]: E0308 21:13:24.580384 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="495d39cf-6a4d-4ca0-90b6-9a22323d1568" containerName="heat-db-sync" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.580408 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="495d39cf-6a4d-4ca0-90b6-9a22323d1568" containerName="heat-db-sync" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.580613 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="495d39cf-6a4d-4ca0-90b6-9a22323d1568" containerName="heat-db-sync" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.581307 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.586073 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.586294 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-2n2s7" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.586437 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.588528 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-595686fb49-hx4rx"] Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.675072 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-config-data\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.675138 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-config-data-custom\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.675182 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc98j\" (UniqueName: \"kubernetes.io/projected/9c41cdd1-29dd-4252-b988-1efaeed01573-kube-api-access-kc98j\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.675270 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-combined-ca-bundle\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.776526 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc98j\" (UniqueName: \"kubernetes.io/projected/9c41cdd1-29dd-4252-b988-1efaeed01573-kube-api-access-kc98j\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.776667 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-combined-ca-bundle\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.776743 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-config-data\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " 
pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.776778 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-config-data-custom\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.784186 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-796f99d566-r2p9d"] Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.795036 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-config-data-custom\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.798665 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-combined-ca-bundle\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.806400 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.815986 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-796f99d566-r2p9d"] Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.826292 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.827029 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c41cdd1-29dd-4252-b988-1efaeed01573-config-data\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.839557 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc98j\" (UniqueName: \"kubernetes.io/projected/9c41cdd1-29dd-4252-b988-1efaeed01573-kube-api-access-kc98j\") pod \"heat-engine-595686fb49-hx4rx\" (UID: \"9c41cdd1-29dd-4252-b988-1efaeed01573\") " pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.850276 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-fffd5d5b8-82pm2"] Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.851604 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.859832 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.866942 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-fffd5d5b8-82pm2"] Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.878327 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-combined-ca-bundle\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.878437 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-config-data-custom\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.878481 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p49zs\" (UniqueName: \"kubernetes.io/projected/78788c18-3ce2-4e27-841d-e7d380fbab71-kube-api-access-p49zs\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.878519 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-config-data\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.966903 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981158 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-config-data-custom\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981221 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p49zs\" (UniqueName: \"kubernetes.io/projected/78788c18-3ce2-4e27-841d-e7d380fbab71-kube-api-access-p49zs\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981268 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-config-data\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981324 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-config-data\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981401 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-combined-ca-bundle\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981488 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvkv2\" (UniqueName: \"kubernetes.io/projected/979b34eb-586a-4d86-8e2d-7937614c714a-kube-api-access-jvkv2\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981559 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-combined-ca-bundle\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.981599 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-config-data-custom\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:24 crc kubenswrapper[4885]: I0308 21:13:24.996743 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-config-data-custom\") pod \"heat-api-796f99d566-r2p9d\" (UID: 
\"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.001552 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-combined-ca-bundle\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.002337 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/78788c18-3ce2-4e27-841d-e7d380fbab71-config-data\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.004121 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p49zs\" (UniqueName: \"kubernetes.io/projected/78788c18-3ce2-4e27-841d-e7d380fbab71-kube-api-access-p49zs\") pod \"heat-api-796f99d566-r2p9d\" (UID: \"78788c18-3ce2-4e27-841d-e7d380fbab71\") " pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.050975 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.083623 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-config-data\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.083695 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-combined-ca-bundle\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.083746 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvkv2\" (UniqueName: \"kubernetes.io/projected/979b34eb-586a-4d86-8e2d-7937614c714a-kube-api-access-jvkv2\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.083802 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-config-data-custom\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.087109 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-config-data-custom\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.088307 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-config-data\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.095678 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/979b34eb-586a-4d86-8e2d-7937614c714a-combined-ca-bundle\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.104323 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvkv2\" (UniqueName: \"kubernetes.io/projected/979b34eb-586a-4d86-8e2d-7937614c714a-kube-api-access-jvkv2\") pod \"heat-cfnapi-fffd5d5b8-82pm2\" (UID: \"979b34eb-586a-4d86-8e2d-7937614c714a\") " pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.229759 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.236573 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:25 crc kubenswrapper[4885]: W0308 21:13:25.529581 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c41cdd1_29dd_4252_b988_1efaeed01573.slice/crio-c21e7c516389f65587a579c829742a5469afc9fa2598a4abc47d8616960692bf WatchSource:0}: Error finding container c21e7c516389f65587a579c829742a5469afc9fa2598a4abc47d8616960692bf: Status 404 returned error can't find the container with id c21e7c516389f65587a579c829742a5469afc9fa2598a4abc47d8616960692bf Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.536953 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-595686fb49-hx4rx"] Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.794632 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-fffd5d5b8-82pm2"] Mar 08 21:13:25 crc kubenswrapper[4885]: W0308 21:13:25.795286 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod979b34eb_586a_4d86_8e2d_7937614c714a.slice/crio-91c785b931fb62b2da3608f0aa73d5e6baefb65455cf6c98670fd099e47dadea WatchSource:0}: Error finding container 91c785b931fb62b2da3608f0aa73d5e6baefb65455cf6c98670fd099e47dadea: Status 404 returned error can't find the container with id 91c785b931fb62b2da3608f0aa73d5e6baefb65455cf6c98670fd099e47dadea Mar 08 21:13:25 crc kubenswrapper[4885]: I0308 21:13:25.816830 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-796f99d566-r2p9d"] Mar 08 21:13:25 crc kubenswrapper[4885]: W0308 21:13:25.817019 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78788c18_3ce2_4e27_841d_e7d380fbab71.slice/crio-f754ac183e677874f43efb9bd022b528556a8293133acc15fea8884b253153af WatchSource:0}: Error finding container f754ac183e677874f43efb9bd022b528556a8293133acc15fea8884b253153af: Status 404 returned error can't find the container with id f754ac183e677874f43efb9bd022b528556a8293133acc15fea8884b253153af Mar 08 21:13:26 crc kubenswrapper[4885]: I0308 21:13:26.369023 4885 scope.go:117] 
"RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:13:26 crc kubenswrapper[4885]: E0308 21:13:26.369628 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:13:26 crc kubenswrapper[4885]: I0308 21:13:26.370972 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-796f99d566-r2p9d" event={"ID":"78788c18-3ce2-4e27-841d-e7d380fbab71","Type":"ContainerStarted","Data":"f754ac183e677874f43efb9bd022b528556a8293133acc15fea8884b253153af"} Mar 08 21:13:26 crc kubenswrapper[4885]: I0308 21:13:26.373078 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" event={"ID":"979b34eb-586a-4d86-8e2d-7937614c714a","Type":"ContainerStarted","Data":"91c785b931fb62b2da3608f0aa73d5e6baefb65455cf6c98670fd099e47dadea"} Mar 08 21:13:26 crc kubenswrapper[4885]: I0308 21:13:26.374695 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-595686fb49-hx4rx" event={"ID":"9c41cdd1-29dd-4252-b988-1efaeed01573","Type":"ContainerStarted","Data":"7f507f8806f3620662ec6c97dfb37b2207335089aa04c38748f1bedd1a21fb14"} Mar 08 21:13:26 crc kubenswrapper[4885]: I0308 21:13:26.374732 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-595686fb49-hx4rx" event={"ID":"9c41cdd1-29dd-4252-b988-1efaeed01573","Type":"ContainerStarted","Data":"c21e7c516389f65587a579c829742a5469afc9fa2598a4abc47d8616960692bf"} Mar 08 21:13:26 crc kubenswrapper[4885]: I0308 21:13:26.376039 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:26 crc kubenswrapper[4885]: I0308 21:13:26.398884 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-595686fb49-hx4rx" podStartSLOduration=2.398865514 podStartE2EDuration="2.398865514s" podCreationTimestamp="2026-03-08 21:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:13:26.395287988 +0000 UTC m=+6107.791342021" watchObservedRunningTime="2026-03-08 21:13:26.398865514 +0000 UTC m=+6107.794919537" Mar 08 21:13:27 crc kubenswrapper[4885]: I0308 21:13:27.016233 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5b7cfb69fc-bhpx4" Mar 08 21:13:27 crc kubenswrapper[4885]: I0308 21:13:27.085818 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-dbdd8c5b9-56mvx"] Mar 08 21:13:27 crc kubenswrapper[4885]: I0308 21:13:27.086077 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-dbdd8c5b9-56mvx" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon-log" containerID="cri-o://321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097" gracePeriod=30 Mar 08 21:13:27 crc kubenswrapper[4885]: I0308 21:13:27.086457 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-dbdd8c5b9-56mvx" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon" 
containerID="cri-o://1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0" gracePeriod=30 Mar 08 21:13:29 crc kubenswrapper[4885]: I0308 21:13:29.403053 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-796f99d566-r2p9d" event={"ID":"78788c18-3ce2-4e27-841d-e7d380fbab71","Type":"ContainerStarted","Data":"c2da5e55a35b2266f9517072b83648e3551522b93ed4406e1f6e2fcc4d9fad09"} Mar 08 21:13:29 crc kubenswrapper[4885]: I0308 21:13:29.403556 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:29 crc kubenswrapper[4885]: I0308 21:13:29.414894 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" event={"ID":"979b34eb-586a-4d86-8e2d-7937614c714a","Type":"ContainerStarted","Data":"2a21c27e79ecf343e59c62664e37c488787c555bf3cd743c9d60f9b0f163d52b"} Mar 08 21:13:29 crc kubenswrapper[4885]: I0308 21:13:29.417431 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:29 crc kubenswrapper[4885]: I0308 21:13:29.442863 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-796f99d566-r2p9d" podStartSLOduration=2.2876640630000002 podStartE2EDuration="5.442839272s" podCreationTimestamp="2026-03-08 21:13:24 +0000 UTC" firstStartedPulling="2026-03-08 21:13:25.824800697 +0000 UTC m=+6107.220854720" lastFinishedPulling="2026-03-08 21:13:28.979975896 +0000 UTC m=+6110.376029929" observedRunningTime="2026-03-08 21:13:29.430168252 +0000 UTC m=+6110.826222305" watchObservedRunningTime="2026-03-08 21:13:29.442839272 +0000 UTC m=+6110.838893295" Mar 08 21:13:29 crc kubenswrapper[4885]: I0308 21:13:29.456836 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" podStartSLOduration=2.2790091710000002 podStartE2EDuration="5.456809676s" podCreationTimestamp="2026-03-08 21:13:24 +0000 UTC" firstStartedPulling="2026-03-08 21:13:25.797996839 +0000 UTC m=+6107.194050862" lastFinishedPulling="2026-03-08 21:13:28.975797344 +0000 UTC m=+6110.371851367" observedRunningTime="2026-03-08 21:13:29.447859356 +0000 UTC m=+6110.843913379" watchObservedRunningTime="2026-03-08 21:13:29.456809676 +0000 UTC m=+6110.852863699" Mar 08 21:13:30 crc kubenswrapper[4885]: I0308 21:13:30.430270 4885 generic.go:334] "Generic (PLEG): container finished" podID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerID="1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0" exitCode=0 Mar 08 21:13:30 crc kubenswrapper[4885]: I0308 21:13:30.430328 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dbdd8c5b9-56mvx" event={"ID":"4d4063e1-1725-43e5-bf87-422d8e4a0e5b","Type":"ContainerDied","Data":"1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0"} Mar 08 21:13:36 crc kubenswrapper[4885]: I0308 21:13:36.037700 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-dbdd8c5b9-56mvx" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.158:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.158:8080: connect: connection refused" Mar 08 21:13:36 crc kubenswrapper[4885]: I0308 21:13:36.508707 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-fffd5d5b8-82pm2" Mar 08 21:13:36 crc kubenswrapper[4885]: I0308 21:13:36.677621 4885 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-796f99d566-r2p9d" Mar 08 21:13:38 crc kubenswrapper[4885]: I0308 21:13:38.058665 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-plxkf"] Mar 08 21:13:38 crc kubenswrapper[4885]: I0308 21:13:38.073635 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-75f7-account-create-update-7q9gr"] Mar 08 21:13:38 crc kubenswrapper[4885]: I0308 21:13:38.084889 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-75f7-account-create-update-7q9gr"] Mar 08 21:13:38 crc kubenswrapper[4885]: I0308 21:13:38.098271 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-plxkf"] Mar 08 21:13:38 crc kubenswrapper[4885]: I0308 21:13:38.368652 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:13:38 crc kubenswrapper[4885]: E0308 21:13:38.369075 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:13:39 crc kubenswrapper[4885]: I0308 21:13:39.388049 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d" path="/var/lib/kubelet/pods/af1dfb4f-f4a2-4d9a-bf1d-0b267d61592d/volumes" Mar 08 21:13:39 crc kubenswrapper[4885]: I0308 21:13:39.392090 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7ace920-d540-4598-82db-315caa467acb" path="/var/lib/kubelet/pods/e7ace920-d540-4598-82db-315caa467acb/volumes" Mar 08 21:13:45 crc kubenswrapper[4885]: I0308 21:13:45.007077 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-595686fb49-hx4rx" Mar 08 21:13:46 crc kubenswrapper[4885]: I0308 21:13:46.039640 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-dbdd8c5b9-56mvx" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.158:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.158:8080: connect: connection refused" Mar 08 21:13:46 crc kubenswrapper[4885]: I0308 21:13:46.042115 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-m792h"] Mar 08 21:13:46 crc kubenswrapper[4885]: I0308 21:13:46.057194 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-m792h"] Mar 08 21:13:47 crc kubenswrapper[4885]: I0308 21:13:47.391958 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07fa5fd1-f6b9-4206-809d-c1f04533cab4" path="/var/lib/kubelet/pods/07fa5fd1-f6b9-4206-809d-c1f04533cab4/volumes" Mar 08 21:13:52 crc kubenswrapper[4885]: I0308 21:13:52.369058 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:13:52 crc kubenswrapper[4885]: E0308 21:13:52.370046 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:13:56 crc kubenswrapper[4885]: I0308 21:13:56.036881 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-dbdd8c5b9-56mvx" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.158:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.158:8080: connect: connection refused" Mar 08 21:13:56 crc kubenswrapper[4885]: I0308 21:13:56.037585 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.531131 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.637418 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-config-data\") pod \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.637767 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2bq7\" (UniqueName: \"kubernetes.io/projected/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-kube-api-access-x2bq7\") pod \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.637873 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-horizon-secret-key\") pod \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.638000 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-scripts\") pod \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.638053 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-logs\") pod \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\" (UID: \"4d4063e1-1725-43e5-bf87-422d8e4a0e5b\") " Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.638364 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-logs" (OuterVolumeSpecName: "logs") pod "4d4063e1-1725-43e5-bf87-422d8e4a0e5b" (UID: "4d4063e1-1725-43e5-bf87-422d8e4a0e5b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.639388 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.642848 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4d4063e1-1725-43e5-bf87-422d8e4a0e5b" (UID: "4d4063e1-1725-43e5-bf87-422d8e4a0e5b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.644035 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-kube-api-access-x2bq7" (OuterVolumeSpecName: "kube-api-access-x2bq7") pod "4d4063e1-1725-43e5-bf87-422d8e4a0e5b" (UID: "4d4063e1-1725-43e5-bf87-422d8e4a0e5b"). InnerVolumeSpecName "kube-api-access-x2bq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.666477 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-config-data" (OuterVolumeSpecName: "config-data") pod "4d4063e1-1725-43e5-bf87-422d8e4a0e5b" (UID: "4d4063e1-1725-43e5-bf87-422d8e4a0e5b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.671284 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-scripts" (OuterVolumeSpecName: "scripts") pod "4d4063e1-1725-43e5-bf87-422d8e4a0e5b" (UID: "4d4063e1-1725-43e5-bf87-422d8e4a0e5b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.741671 4885 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.741702 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.741711 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.741721 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2bq7\" (UniqueName: \"kubernetes.io/projected/4d4063e1-1725-43e5-bf87-422d8e4a0e5b-kube-api-access-x2bq7\") on node \"crc\" DevicePath \"\"" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.748501 4885 generic.go:334] "Generic (PLEG): container finished" podID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerID="321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097" exitCode=137 Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.748541 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dbdd8c5b9-56mvx" event={"ID":"4d4063e1-1725-43e5-bf87-422d8e4a0e5b","Type":"ContainerDied","Data":"321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097"} Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.748570 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-dbdd8c5b9-56mvx" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.748591 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-dbdd8c5b9-56mvx" event={"ID":"4d4063e1-1725-43e5-bf87-422d8e4a0e5b","Type":"ContainerDied","Data":"e892f970f210beda78d54563132ede0defb9bfa40842e20b8ca03cfb2cdffe13"} Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.748611 4885 scope.go:117] "RemoveContainer" containerID="1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.781296 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-dbdd8c5b9-56mvx"] Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.788692 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-dbdd8c5b9-56mvx"] Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.951165 4885 scope.go:117] "RemoveContainer" containerID="321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.977968 4885 scope.go:117] "RemoveContainer" containerID="1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0" Mar 08 21:13:57 crc kubenswrapper[4885]: E0308 21:13:57.978454 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0\": container with ID starting with 1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0 not found: ID does not exist" containerID="1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.978519 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0"} err="failed to get container status \"1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0\": rpc error: code = NotFound desc = could not find container \"1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0\": container with ID starting with 1636fed38ae16db0eab9ae478bf1af7b7ab7c7a6b567927b6ba4f853148781f0 not found: ID does not exist" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.978560 4885 scope.go:117] "RemoveContainer" containerID="321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097" Mar 08 21:13:57 crc kubenswrapper[4885]: E0308 21:13:57.978992 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097\": container with ID starting with 321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097 not found: ID does not exist" containerID="321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097" Mar 08 21:13:57 crc kubenswrapper[4885]: I0308 21:13:57.979039 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097"} err="failed to get container status \"321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097\": rpc error: code = NotFound desc = could not find container \"321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097\": container with ID starting with 321421e3bcaedf27ec0611444555d685a3e11715ead584656ecc613e34f2d097 not found: ID does not exist" Mar 08 21:13:59 crc 
kubenswrapper[4885]: I0308 21:13:59.396726 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" path="/var/lib/kubelet/pods/4d4063e1-1725-43e5-bf87-422d8e4a0e5b/volumes" Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.142316 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550074-lqgkm"] Mar 08 21:14:00 crc kubenswrapper[4885]: E0308 21:14:00.143094 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon" Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.143127 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon" Mar 08 21:14:00 crc kubenswrapper[4885]: E0308 21:14:00.143165 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon-log" Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.143178 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon-log" Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.143529 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon" Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.143585 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d4063e1-1725-43e5-bf87-422d8e4a0e5b" containerName="horizon-log" Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.144707 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550074-lqgkm" Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.147010 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.147263 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.150630 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550074-lqgkm"] Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.151316 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.193464 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7mwg\" (UniqueName: \"kubernetes.io/projected/3a0dd2e1-2283-49bf-b5d1-deb889245d93-kube-api-access-n7mwg\") pod \"auto-csr-approver-29550074-lqgkm\" (UID: \"3a0dd2e1-2283-49bf-b5d1-deb889245d93\") " pod="openshift-infra/auto-csr-approver-29550074-lqgkm" Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.296287 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7mwg\" (UniqueName: \"kubernetes.io/projected/3a0dd2e1-2283-49bf-b5d1-deb889245d93-kube-api-access-n7mwg\") pod \"auto-csr-approver-29550074-lqgkm\" (UID: \"3a0dd2e1-2283-49bf-b5d1-deb889245d93\") " pod="openshift-infra/auto-csr-approver-29550074-lqgkm" Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.315247 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7mwg\" (UniqueName: 
\"kubernetes.io/projected/3a0dd2e1-2283-49bf-b5d1-deb889245d93-kube-api-access-n7mwg\") pod \"auto-csr-approver-29550074-lqgkm\" (UID: \"3a0dd2e1-2283-49bf-b5d1-deb889245d93\") " pod="openshift-infra/auto-csr-approver-29550074-lqgkm" Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.468481 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550074-lqgkm" Mar 08 21:14:00 crc kubenswrapper[4885]: I0308 21:14:00.994770 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550074-lqgkm"] Mar 08 21:14:01 crc kubenswrapper[4885]: W0308 21:14:01.005646 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a0dd2e1_2283_49bf_b5d1_deb889245d93.slice/crio-13e865863c7bb43548ed0cd82ed64fcd90420c4348da5ab20b5fad90e38abcb5 WatchSource:0}: Error finding container 13e865863c7bb43548ed0cd82ed64fcd90420c4348da5ab20b5fad90e38abcb5: Status 404 returned error can't find the container with id 13e865863c7bb43548ed0cd82ed64fcd90420c4348da5ab20b5fad90e38abcb5 Mar 08 21:14:01 crc kubenswrapper[4885]: I0308 21:14:01.789226 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550074-lqgkm" event={"ID":"3a0dd2e1-2283-49bf-b5d1-deb889245d93","Type":"ContainerStarted","Data":"13e865863c7bb43548ed0cd82ed64fcd90420c4348da5ab20b5fad90e38abcb5"} Mar 08 21:14:02 crc kubenswrapper[4885]: I0308 21:14:02.804662 4885 generic.go:334] "Generic (PLEG): container finished" podID="3a0dd2e1-2283-49bf-b5d1-deb889245d93" containerID="49b008c658a6440bfe62e05cf707f768838f42546076db4ab0906a7cc3f15598" exitCode=0 Mar 08 21:14:02 crc kubenswrapper[4885]: I0308 21:14:02.804735 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550074-lqgkm" event={"ID":"3a0dd2e1-2283-49bf-b5d1-deb889245d93","Type":"ContainerDied","Data":"49b008c658a6440bfe62e05cf707f768838f42546076db4ab0906a7cc3f15598"} Mar 08 21:14:04 crc kubenswrapper[4885]: I0308 21:14:04.240973 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550074-lqgkm" Mar 08 21:14:04 crc kubenswrapper[4885]: I0308 21:14:04.401750 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7mwg\" (UniqueName: \"kubernetes.io/projected/3a0dd2e1-2283-49bf-b5d1-deb889245d93-kube-api-access-n7mwg\") pod \"3a0dd2e1-2283-49bf-b5d1-deb889245d93\" (UID: \"3a0dd2e1-2283-49bf-b5d1-deb889245d93\") " Mar 08 21:14:04 crc kubenswrapper[4885]: I0308 21:14:04.409783 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a0dd2e1-2283-49bf-b5d1-deb889245d93-kube-api-access-n7mwg" (OuterVolumeSpecName: "kube-api-access-n7mwg") pod "3a0dd2e1-2283-49bf-b5d1-deb889245d93" (UID: "3a0dd2e1-2283-49bf-b5d1-deb889245d93"). InnerVolumeSpecName "kube-api-access-n7mwg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:14:04 crc kubenswrapper[4885]: I0308 21:14:04.504871 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7mwg\" (UniqueName: \"kubernetes.io/projected/3a0dd2e1-2283-49bf-b5d1-deb889245d93-kube-api-access-n7mwg\") on node \"crc\" DevicePath \"\"" Mar 08 21:14:04 crc kubenswrapper[4885]: I0308 21:14:04.830269 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550074-lqgkm" event={"ID":"3a0dd2e1-2283-49bf-b5d1-deb889245d93","Type":"ContainerDied","Data":"13e865863c7bb43548ed0cd82ed64fcd90420c4348da5ab20b5fad90e38abcb5"} Mar 08 21:14:04 crc kubenswrapper[4885]: I0308 21:14:04.830784 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13e865863c7bb43548ed0cd82ed64fcd90420c4348da5ab20b5fad90e38abcb5" Mar 08 21:14:04 crc kubenswrapper[4885]: I0308 21:14:04.830381 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550074-lqgkm" Mar 08 21:14:05 crc kubenswrapper[4885]: I0308 21:14:05.370878 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:14:05 crc kubenswrapper[4885]: E0308 21:14:05.371167 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:14:05 crc kubenswrapper[4885]: I0308 21:14:05.403718 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550068-8fldh"] Mar 08 21:14:05 crc kubenswrapper[4885]: I0308 21:14:05.421064 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550068-8fldh"] Mar 08 21:14:07 crc kubenswrapper[4885]: I0308 21:14:07.389555 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a1f5d5d-f061-4187-a9ed-720b291774e5" path="/var/lib/kubelet/pods/2a1f5d5d-f061-4187-a9ed-720b291774e5/volumes" Mar 08 21:14:07 crc kubenswrapper[4885]: I0308 21:14:07.490145 4885 scope.go:117] "RemoveContainer" containerID="dff338c2a2d2f522bdec5e9f4d11ce93afde12127a49bc5918d50f6e48f1aa67" Mar 08 21:14:07 crc kubenswrapper[4885]: I0308 21:14:07.521114 4885 scope.go:117] "RemoveContainer" containerID="238463b8258e15f4cb33c673abe4bc3d05f4cb5a4961563b7dbb833ea2602b95" Mar 08 21:14:07 crc kubenswrapper[4885]: I0308 21:14:07.589711 4885 scope.go:117] "RemoveContainer" containerID="dcaa0b7048fe6fee9e8064fda1b6f6cbab5d7f0172b9d8b64c22f15e682b913a" Mar 08 21:14:07 crc kubenswrapper[4885]: I0308 21:14:07.661203 4885 scope.go:117] "RemoveContainer" containerID="a8dfd7b1b0ea895893398ba92cb9f076303594f9c38eca7bff04c272e28927af" Mar 08 21:14:14 crc kubenswrapper[4885]: I0308 21:14:14.893154 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"] Mar 08 21:14:14 crc kubenswrapper[4885]: E0308 21:14:14.894088 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a0dd2e1-2283-49bf-b5d1-deb889245d93" containerName="oc" Mar 08 21:14:14 crc kubenswrapper[4885]: I0308 21:14:14.894104 4885 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="3a0dd2e1-2283-49bf-b5d1-deb889245d93" containerName="oc" Mar 08 21:14:14 crc kubenswrapper[4885]: I0308 21:14:14.894297 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a0dd2e1-2283-49bf-b5d1-deb889245d93" containerName="oc" Mar 08 21:14:14 crc kubenswrapper[4885]: I0308 21:14:14.895736 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" Mar 08 21:14:14 crc kubenswrapper[4885]: I0308 21:14:14.898080 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 08 21:14:14 crc kubenswrapper[4885]: I0308 21:14:14.925337 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"] Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.063415 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.064011 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmkf2\" (UniqueName: \"kubernetes.io/projected/99cf706b-d380-4027-ad93-af7f1e5f8a36-kube-api-access-hmkf2\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.064125 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.165900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmkf2\" (UniqueName: \"kubernetes.io/projected/99cf706b-d380-4027-ad93-af7f1e5f8a36-kube-api-access-hmkf2\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.165967 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.166019 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-bundle\") pod 
\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.166575 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.166795 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.185566 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmkf2\" (UniqueName: \"kubernetes.io/projected/99cf706b-d380-4027-ad93-af7f1e5f8a36-kube-api-access-hmkf2\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.227668 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.550281 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm"] Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.974682 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" event={"ID":"99cf706b-d380-4027-ad93-af7f1e5f8a36","Type":"ContainerStarted","Data":"64e7ae94816bf73cc80344e8edd25febc79b249fff3bbc0c09e9e009ae87967d"} Mar 08 21:14:15 crc kubenswrapper[4885]: I0308 21:14:15.975414 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" event={"ID":"99cf706b-d380-4027-ad93-af7f1e5f8a36","Type":"ContainerStarted","Data":"0bf11f7445c675cdc8e89c3998b067d0c8ebc1c2a6a45d885ffa84b05adcb9e9"} Mar 08 21:14:17 crc kubenswrapper[4885]: I0308 21:14:17.006796 4885 generic.go:334] "Generic (PLEG): container finished" podID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerID="64e7ae94816bf73cc80344e8edd25febc79b249fff3bbc0c09e9e009ae87967d" exitCode=0 Mar 08 21:14:17 crc kubenswrapper[4885]: I0308 21:14:17.007466 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" event={"ID":"99cf706b-d380-4027-ad93-af7f1e5f8a36","Type":"ContainerDied","Data":"64e7ae94816bf73cc80344e8edd25febc79b249fff3bbc0c09e9e009ae87967d"} Mar 08 21:14:18 crc kubenswrapper[4885]: I0308 21:14:18.054368 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-txw9w"] Mar 08 21:14:18 crc 
kubenswrapper[4885]: I0308 21:14:18.067579 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5511-account-create-update-fvjhm"] Mar 08 21:14:18 crc kubenswrapper[4885]: I0308 21:14:18.081385 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-txw9w"] Mar 08 21:14:18 crc kubenswrapper[4885]: I0308 21:14:18.089876 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5511-account-create-update-fvjhm"] Mar 08 21:14:19 crc kubenswrapper[4885]: I0308 21:14:19.067759 4885 generic.go:334] "Generic (PLEG): container finished" podID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerID="1c3cdaf8a46d6162f9d363ef6cea18ba7753346787d097ff49ef105b81ca0a44" exitCode=0 Mar 08 21:14:19 crc kubenswrapper[4885]: I0308 21:14:19.068244 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" event={"ID":"99cf706b-d380-4027-ad93-af7f1e5f8a36","Type":"ContainerDied","Data":"1c3cdaf8a46d6162f9d363ef6cea18ba7753346787d097ff49ef105b81ca0a44"} Mar 08 21:14:19 crc kubenswrapper[4885]: I0308 21:14:19.380804 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14dd4829-951f-4e19-885f-f466dcbf9d1b" path="/var/lib/kubelet/pods/14dd4829-951f-4e19-885f-f466dcbf9d1b/volumes" Mar 08 21:14:19 crc kubenswrapper[4885]: I0308 21:14:19.382286 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8664af8f-0cf2-4ef8-a701-adbaba058240" path="/var/lib/kubelet/pods/8664af8f-0cf2-4ef8-a701-adbaba058240/volumes" Mar 08 21:14:20 crc kubenswrapper[4885]: I0308 21:14:20.085685 4885 generic.go:334] "Generic (PLEG): container finished" podID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerID="04dca4eea9d10aed627c2799e958e63d43e3b8b9874d48f81f61e7ac77836dac" exitCode=0 Mar 08 21:14:20 crc kubenswrapper[4885]: I0308 21:14:20.085993 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" event={"ID":"99cf706b-d380-4027-ad93-af7f1e5f8a36","Type":"ContainerDied","Data":"04dca4eea9d10aed627c2799e958e63d43e3b8b9874d48f81f61e7ac77836dac"} Mar 08 21:14:20 crc kubenswrapper[4885]: I0308 21:14:20.368129 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:14:20 crc kubenswrapper[4885]: E0308 21:14:20.368379 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.537051 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.617054 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-util\") pod \"99cf706b-d380-4027-ad93-af7f1e5f8a36\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.617254 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-bundle\") pod \"99cf706b-d380-4027-ad93-af7f1e5f8a36\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.617304 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmkf2\" (UniqueName: \"kubernetes.io/projected/99cf706b-d380-4027-ad93-af7f1e5f8a36-kube-api-access-hmkf2\") pod \"99cf706b-d380-4027-ad93-af7f1e5f8a36\" (UID: \"99cf706b-d380-4027-ad93-af7f1e5f8a36\") " Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.618687 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-bundle" (OuterVolumeSpecName: "bundle") pod "99cf706b-d380-4027-ad93-af7f1e5f8a36" (UID: "99cf706b-d380-4027-ad93-af7f1e5f8a36"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.631864 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-util" (OuterVolumeSpecName: "util") pod "99cf706b-d380-4027-ad93-af7f1e5f8a36" (UID: "99cf706b-d380-4027-ad93-af7f1e5f8a36"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.706085 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99cf706b-d380-4027-ad93-af7f1e5f8a36-kube-api-access-hmkf2" (OuterVolumeSpecName: "kube-api-access-hmkf2") pod "99cf706b-d380-4027-ad93-af7f1e5f8a36" (UID: "99cf706b-d380-4027-ad93-af7f1e5f8a36"). InnerVolumeSpecName "kube-api-access-hmkf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.720452 4885 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-util\") on node \"crc\" DevicePath \"\"" Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.720485 4885 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/99cf706b-d380-4027-ad93-af7f1e5f8a36-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:14:21 crc kubenswrapper[4885]: I0308 21:14:21.720498 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmkf2\" (UniqueName: \"kubernetes.io/projected/99cf706b-d380-4027-ad93-af7f1e5f8a36-kube-api-access-hmkf2\") on node \"crc\" DevicePath \"\"" Mar 08 21:14:22 crc kubenswrapper[4885]: I0308 21:14:22.117832 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" event={"ID":"99cf706b-d380-4027-ad93-af7f1e5f8a36","Type":"ContainerDied","Data":"0bf11f7445c675cdc8e89c3998b067d0c8ebc1c2a6a45d885ffa84b05adcb9e9"} Mar 08 21:14:22 crc kubenswrapper[4885]: I0308 21:14:22.117905 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0bf11f7445c675cdc8e89c3998b067d0c8ebc1c2a6a45d885ffa84b05adcb9e9" Mar 08 21:14:22 crc kubenswrapper[4885]: I0308 21:14:22.118058 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm" Mar 08 21:14:24 crc kubenswrapper[4885]: I0308 21:14:24.051623 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-q8g6n"] Mar 08 21:14:24 crc kubenswrapper[4885]: I0308 21:14:24.064902 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-q8g6n"] Mar 08 21:14:25 crc kubenswrapper[4885]: I0308 21:14:25.387432 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aad146d-597d-436f-ba72-59a57f223ad0" path="/var/lib/kubelet/pods/1aad146d-597d-436f-ba72-59a57f223ad0/volumes" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.740646 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z"] Mar 08 21:14:32 crc kubenswrapper[4885]: E0308 21:14:32.741420 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerName="extract" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.741432 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerName="extract" Mar 08 21:14:32 crc kubenswrapper[4885]: E0308 21:14:32.741450 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerName="util" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.741457 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerName="util" Mar 08 21:14:32 crc kubenswrapper[4885]: E0308 21:14:32.741479 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerName="pull" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.741485 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerName="pull" Mar 08 21:14:32 crc 
kubenswrapper[4885]: I0308 21:14:32.741679 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="99cf706b-d380-4027-ad93-af7f1e5f8a36" containerName="extract" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.742320 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.744047 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.745112 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.745283 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-x4rm8" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.755717 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z"] Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.854598 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfp2k\" (UniqueName: \"kubernetes.io/projected/c9864aac-5821-4f9b-bcc8-f07752f987b7-kube-api-access-lfp2k\") pod \"obo-prometheus-operator-68bc856cb9-brf5z\" (UID: \"c9864aac-5821-4f9b-bcc8-f07752f987b7\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.859618 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7"] Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.860782 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.863283 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-4xjdl" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.878635 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.883066 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429"] Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.884236 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.892982 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7"] Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.940019 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429"] Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.958309 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fe4d43f-e037-431e-98e3-d50194963def-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7\" (UID: \"0fe4d43f-e037-431e-98e3-d50194963def\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.958570 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/65ea3078-ccec-4913-9ce0-873ad93efd0e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429\" (UID: \"65ea3078-ccec-4913-9ce0-873ad93efd0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.958682 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/65ea3078-ccec-4913-9ce0-873ad93efd0e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429\" (UID: \"65ea3078-ccec-4913-9ce0-873ad93efd0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.958738 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfp2k\" (UniqueName: \"kubernetes.io/projected/c9864aac-5821-4f9b-bcc8-f07752f987b7-kube-api-access-lfp2k\") pod \"obo-prometheus-operator-68bc856cb9-brf5z\" (UID: \"c9864aac-5821-4f9b-bcc8-f07752f987b7\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" Mar 08 21:14:32 crc kubenswrapper[4885]: I0308 21:14:32.958896 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fe4d43f-e037-431e-98e3-d50194963def-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7\" (UID: \"0fe4d43f-e037-431e-98e3-d50194963def\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.019849 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfp2k\" (UniqueName: \"kubernetes.io/projected/c9864aac-5821-4f9b-bcc8-f07752f987b7-kube-api-access-lfp2k\") pod \"obo-prometheus-operator-68bc856cb9-brf5z\" (UID: \"c9864aac-5821-4f9b-bcc8-f07752f987b7\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.034837 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-qfwg5"] Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.037099 4885 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.040537 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.040738 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-22zwk" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.047632 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-qfwg5"] Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.062309 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fe4d43f-e037-431e-98e3-d50194963def-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7\" (UID: \"0fe4d43f-e037-431e-98e3-d50194963def\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.062394 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fe4d43f-e037-431e-98e3-d50194963def-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7\" (UID: \"0fe4d43f-e037-431e-98e3-d50194963def\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.062470 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/65ea3078-ccec-4913-9ce0-873ad93efd0e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429\" (UID: \"65ea3078-ccec-4913-9ce0-873ad93efd0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.062518 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/65ea3078-ccec-4913-9ce0-873ad93efd0e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429\" (UID: \"65ea3078-ccec-4913-9ce0-873ad93efd0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.066425 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/65ea3078-ccec-4913-9ce0-873ad93efd0e-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429\" (UID: \"65ea3078-ccec-4913-9ce0-873ad93efd0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.067859 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/65ea3078-ccec-4913-9ce0-873ad93efd0e-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429\" (UID: \"65ea3078-ccec-4913-9ce0-873ad93efd0e\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.069446 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.077439 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fe4d43f-e037-431e-98e3-d50194963def-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7\" (UID: \"0fe4d43f-e037-431e-98e3-d50194963def\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.086548 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fe4d43f-e037-431e-98e3-d50194963def-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7\" (UID: \"0fe4d43f-e037-431e-98e3-d50194963def\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.101297 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-m8k65"] Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.102564 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.106748 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-6lrsm" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.140870 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-m8k65"] Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.168202 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ks99\" (UniqueName: \"kubernetes.io/projected/482d7874-16e6-4043-95b1-59222dab9edc-kube-api-access-6ks99\") pod \"observability-operator-59bdc8b94-qfwg5\" (UID: \"482d7874-16e6-4043-95b1-59222dab9edc\") " pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.168254 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/482d7874-16e6-4043-95b1-59222dab9edc-observability-operator-tls\") pod \"observability-operator-59bdc8b94-qfwg5\" (UID: \"482d7874-16e6-4043-95b1-59222dab9edc\") " pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.176857 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.214118 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.273112 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/482d7874-16e6-4043-95b1-59222dab9edc-observability-operator-tls\") pod \"observability-operator-59bdc8b94-qfwg5\" (UID: \"482d7874-16e6-4043-95b1-59222dab9edc\") " pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.273166 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8cqt\" (UniqueName: \"kubernetes.io/projected/062a5ba6-b2c8-4b0c-95e1-d51c1196f367-kube-api-access-l8cqt\") pod \"perses-operator-5bf474d74f-m8k65\" (UID: \"062a5ba6-b2c8-4b0c-95e1-d51c1196f367\") " pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.273205 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/062a5ba6-b2c8-4b0c-95e1-d51c1196f367-openshift-service-ca\") pod \"perses-operator-5bf474d74f-m8k65\" (UID: \"062a5ba6-b2c8-4b0c-95e1-d51c1196f367\") " pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.273347 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ks99\" (UniqueName: \"kubernetes.io/projected/482d7874-16e6-4043-95b1-59222dab9edc-kube-api-access-6ks99\") pod \"observability-operator-59bdc8b94-qfwg5\" (UID: \"482d7874-16e6-4043-95b1-59222dab9edc\") " pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.277861 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/482d7874-16e6-4043-95b1-59222dab9edc-observability-operator-tls\") pod \"observability-operator-59bdc8b94-qfwg5\" (UID: \"482d7874-16e6-4043-95b1-59222dab9edc\") " pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.302525 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ks99\" (UniqueName: \"kubernetes.io/projected/482d7874-16e6-4043-95b1-59222dab9edc-kube-api-access-6ks99\") pod \"observability-operator-59bdc8b94-qfwg5\" (UID: \"482d7874-16e6-4043-95b1-59222dab9edc\") " pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.375136 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8cqt\" (UniqueName: \"kubernetes.io/projected/062a5ba6-b2c8-4b0c-95e1-d51c1196f367-kube-api-access-l8cqt\") pod \"perses-operator-5bf474d74f-m8k65\" (UID: \"062a5ba6-b2c8-4b0c-95e1-d51c1196f367\") " pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.375575 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/062a5ba6-b2c8-4b0c-95e1-d51c1196f367-openshift-service-ca\") pod \"perses-operator-5bf474d74f-m8k65\" (UID: \"062a5ba6-b2c8-4b0c-95e1-d51c1196f367\") " 
pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.376479 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/062a5ba6-b2c8-4b0c-95e1-d51c1196f367-openshift-service-ca\") pod \"perses-operator-5bf474d74f-m8k65\" (UID: \"062a5ba6-b2c8-4b0c-95e1-d51c1196f367\") " pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.397553 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8cqt\" (UniqueName: \"kubernetes.io/projected/062a5ba6-b2c8-4b0c-95e1-d51c1196f367-kube-api-access-l8cqt\") pod \"perses-operator-5bf474d74f-m8k65\" (UID: \"062a5ba6-b2c8-4b0c-95e1-d51c1196f367\") " pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.498571 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.560123 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:33 crc kubenswrapper[4885]: W0308 21:14:33.686826 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9864aac_5821_4f9b_bcc8_f07752f987b7.slice/crio-f3781dd6d9b1c8f094ce04de153a2bdbb35c0c024a0ebd5851e2b45dc2ab9da6 WatchSource:0}: Error finding container f3781dd6d9b1c8f094ce04de153a2bdbb35c0c024a0ebd5851e2b45dc2ab9da6: Status 404 returned error can't find the container with id f3781dd6d9b1c8f094ce04de153a2bdbb35c0c024a0ebd5851e2b45dc2ab9da6 Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.691144 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z"] Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.692697 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.796805 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7"] Mar 08 21:14:33 crc kubenswrapper[4885]: W0308 21:14:33.827633 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fe4d43f_e037_431e_98e3_d50194963def.slice/crio-d9af8a6628efffdc52a9f0ca233345602b4641916f08f1aca9d1aa6f7526f180 WatchSource:0}: Error finding container d9af8a6628efffdc52a9f0ca233345602b4641916f08f1aca9d1aa6f7526f180: Status 404 returned error can't find the container with id d9af8a6628efffdc52a9f0ca233345602b4641916f08f1aca9d1aa6f7526f180 Mar 08 21:14:33 crc kubenswrapper[4885]: I0308 21:14:33.876172 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429"] Mar 08 21:14:34 crc kubenswrapper[4885]: I0308 21:14:34.072292 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-qfwg5"] Mar 08 21:14:34 crc kubenswrapper[4885]: W0308 21:14:34.087853 4885 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod482d7874_16e6_4043_95b1_59222dab9edc.slice/crio-eed385eec624d1dfea2d50cbc741530d73579febe0eb62e388b12aef1c6815f6 WatchSource:0}: Error finding container eed385eec624d1dfea2d50cbc741530d73579febe0eb62e388b12aef1c6815f6: Status 404 returned error can't find the container with id eed385eec624d1dfea2d50cbc741530d73579febe0eb62e388b12aef1c6815f6 Mar 08 21:14:34 crc kubenswrapper[4885]: I0308 21:14:34.212314 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-m8k65"] Mar 08 21:14:34 crc kubenswrapper[4885]: I0308 21:14:34.226699 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" event={"ID":"482d7874-16e6-4043-95b1-59222dab9edc","Type":"ContainerStarted","Data":"eed385eec624d1dfea2d50cbc741530d73579febe0eb62e388b12aef1c6815f6"} Mar 08 21:14:34 crc kubenswrapper[4885]: I0308 21:14:34.228800 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" event={"ID":"0fe4d43f-e037-431e-98e3-d50194963def","Type":"ContainerStarted","Data":"d9af8a6628efffdc52a9f0ca233345602b4641916f08f1aca9d1aa6f7526f180"} Mar 08 21:14:34 crc kubenswrapper[4885]: I0308 21:14:34.229776 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-m8k65" event={"ID":"062a5ba6-b2c8-4b0c-95e1-d51c1196f367","Type":"ContainerStarted","Data":"391898f0dd9350cc4c92a0e882e4ded906044960b83c0c12659555eb5adaf87b"} Mar 08 21:14:34 crc kubenswrapper[4885]: I0308 21:14:34.231178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" event={"ID":"65ea3078-ccec-4913-9ce0-873ad93efd0e","Type":"ContainerStarted","Data":"a9102119a5c4610006f8fd7fd0d3a6dc25ed7d1a481e874c246b07c16df5b041"} Mar 08 21:14:34 crc kubenswrapper[4885]: I0308 21:14:34.232227 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" event={"ID":"c9864aac-5821-4f9b-bcc8-f07752f987b7","Type":"ContainerStarted","Data":"f3781dd6d9b1c8f094ce04de153a2bdbb35c0c024a0ebd5851e2b45dc2ab9da6"} Mar 08 21:14:35 crc kubenswrapper[4885]: I0308 21:14:35.368101 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:14:35 crc kubenswrapper[4885]: E0308 21:14:35.368647 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.419451 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fk9ll"] Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.424552 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.438231 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fk9ll"] Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.569803 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-catalog-content\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.569905 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-utilities\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.569942 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bwkz\" (UniqueName: \"kubernetes.io/projected/613a3b7b-ebce-483d-b67d-3c9310c4604d-kube-api-access-6bwkz\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.673411 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-catalog-content\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.673550 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-utilities\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.673579 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bwkz\" (UniqueName: \"kubernetes.io/projected/613a3b7b-ebce-483d-b67d-3c9310c4604d-kube-api-access-6bwkz\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.674012 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-catalog-content\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.674262 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-utilities\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.696788 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6bwkz\" (UniqueName: \"kubernetes.io/projected/613a3b7b-ebce-483d-b67d-3c9310c4604d-kube-api-access-6bwkz\") pod \"community-operators-fk9ll\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:40 crc kubenswrapper[4885]: I0308 21:14:40.795818 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:14:46 crc kubenswrapper[4885]: I0308 21:14:46.394974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" event={"ID":"0fe4d43f-e037-431e-98e3-d50194963def","Type":"ContainerStarted","Data":"0bec2e78a1f6db95fc14e9c10e3cfa2221801ab222a50e6e6e8b4165bc3f1580"} Mar 08 21:14:46 crc kubenswrapper[4885]: I0308 21:14:46.457651 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7" podStartSLOduration=2.42315509 podStartE2EDuration="14.457633431s" podCreationTimestamp="2026-03-08 21:14:32 +0000 UTC" firstStartedPulling="2026-03-08 21:14:33.83178195 +0000 UTC m=+6175.227835973" lastFinishedPulling="2026-03-08 21:14:45.866260281 +0000 UTC m=+6187.262314314" observedRunningTime="2026-03-08 21:14:46.414960818 +0000 UTC m=+6187.811014841" watchObservedRunningTime="2026-03-08 21:14:46.457633431 +0000 UTC m=+6187.853687454" Mar 08 21:14:46 crc kubenswrapper[4885]: I0308 21:14:46.459650 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fk9ll"] Mar 08 21:14:46 crc kubenswrapper[4885]: W0308 21:14:46.487413 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod613a3b7b_ebce_483d_b67d_3c9310c4604d.slice/crio-959f3b59f5078928407d48f4de045117a3898b45a52b5bb95261aebb664945ba WatchSource:0}: Error finding container 959f3b59f5078928407d48f4de045117a3898b45a52b5bb95261aebb664945ba: Status 404 returned error can't find the container with id 959f3b59f5078928407d48f4de045117a3898b45a52b5bb95261aebb664945ba Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.413909 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" event={"ID":"482d7874-16e6-4043-95b1-59222dab9edc","Type":"ContainerStarted","Data":"dc346109ab017eea4cfc43b0424b8670bd1c301c50a94d88baf86363cd4813ce"} Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.415303 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.416980 4885 generic.go:334] "Generic (PLEG): container finished" podID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerID="6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4" exitCode=0 Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.417108 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fk9ll" event={"ID":"613a3b7b-ebce-483d-b67d-3c9310c4604d","Type":"ContainerDied","Data":"6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4"} Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.417184 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fk9ll" 
event={"ID":"613a3b7b-ebce-483d-b67d-3c9310c4604d","Type":"ContainerStarted","Data":"959f3b59f5078928407d48f4de045117a3898b45a52b5bb95261aebb664945ba"} Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.421076 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-m8k65" event={"ID":"062a5ba6-b2c8-4b0c-95e1-d51c1196f367","Type":"ContainerStarted","Data":"356632c80c5662f9b6d1f23a5fc866ba11029da9ffb54c4b6abae369029cb11c"} Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.421869 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.426490 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" event={"ID":"65ea3078-ccec-4913-9ce0-873ad93efd0e","Type":"ContainerStarted","Data":"fc0c2487f4508b7df7ce3bc93c23366165a739b59177f333ee66b4ab6654a443"} Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.428412 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" event={"ID":"c9864aac-5821-4f9b-bcc8-f07752f987b7","Type":"ContainerStarted","Data":"3056034978a15a214f614d3a7a9485587e77c48288f9e4fa8470247f9d4f48d9"} Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.445254 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" podStartSLOduration=3.597096586 podStartE2EDuration="15.445236793s" podCreationTimestamp="2026-03-08 21:14:32 +0000 UTC" firstStartedPulling="2026-03-08 21:14:34.09109461 +0000 UTC m=+6175.487148633" lastFinishedPulling="2026-03-08 21:14:45.939234817 +0000 UTC m=+6187.335288840" observedRunningTime="2026-03-08 21:14:47.438635265 +0000 UTC m=+6188.834689288" watchObservedRunningTime="2026-03-08 21:14:47.445236793 +0000 UTC m=+6188.841290816" Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.461603 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-qfwg5" Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.462804 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429" podStartSLOduration=3.459478337 podStartE2EDuration="15.462786263s" podCreationTimestamp="2026-03-08 21:14:32 +0000 UTC" firstStartedPulling="2026-03-08 21:14:33.880161056 +0000 UTC m=+6175.276215079" lastFinishedPulling="2026-03-08 21:14:45.883468982 +0000 UTC m=+6187.279523005" observedRunningTime="2026-03-08 21:14:47.456159055 +0000 UTC m=+6188.852213078" watchObservedRunningTime="2026-03-08 21:14:47.462786263 +0000 UTC m=+6188.858840286" Mar 08 21:14:47 crc kubenswrapper[4885]: I0308 21:14:47.490136 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-brf5z" podStartSLOduration=3.296816267 podStartE2EDuration="15.490115856s" podCreationTimestamp="2026-03-08 21:14:32 +0000 UTC" firstStartedPulling="2026-03-08 21:14:33.692458745 +0000 UTC m=+6175.088512768" lastFinishedPulling="2026-03-08 21:14:45.885758334 +0000 UTC m=+6187.281812357" observedRunningTime="2026-03-08 21:14:47.486350214 +0000 UTC m=+6188.882404237" watchObservedRunningTime="2026-03-08 21:14:47.490115856 +0000 UTC m=+6188.886169889" Mar 08 21:14:47 crc 
kubenswrapper[4885]: I0308 21:14:47.550232 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-m8k65" podStartSLOduration=2.879088644 podStartE2EDuration="14.550209056s" podCreationTimestamp="2026-03-08 21:14:33 +0000 UTC" firstStartedPulling="2026-03-08 21:14:34.213133871 +0000 UTC m=+6175.609187894" lastFinishedPulling="2026-03-08 21:14:45.884254263 +0000 UTC m=+6187.280308306" observedRunningTime="2026-03-08 21:14:47.544005649 +0000 UTC m=+6188.940059672" watchObservedRunningTime="2026-03-08 21:14:47.550209056 +0000 UTC m=+6188.946263089" Mar 08 21:14:48 crc kubenswrapper[4885]: I0308 21:14:48.368140 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:14:48 crc kubenswrapper[4885]: E0308 21:14:48.368695 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:14:48 crc kubenswrapper[4885]: I0308 21:14:48.439524 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fk9ll" event={"ID":"613a3b7b-ebce-483d-b67d-3c9310c4604d","Type":"ContainerStarted","Data":"cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1"} Mar 08 21:14:50 crc kubenswrapper[4885]: I0308 21:14:50.460616 4885 generic.go:334] "Generic (PLEG): container finished" podID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerID="cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1" exitCode=0 Mar 08 21:14:50 crc kubenswrapper[4885]: I0308 21:14:50.460709 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fk9ll" event={"ID":"613a3b7b-ebce-483d-b67d-3c9310c4604d","Type":"ContainerDied","Data":"cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1"} Mar 08 21:14:51 crc kubenswrapper[4885]: I0308 21:14:51.473970 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fk9ll" event={"ID":"613a3b7b-ebce-483d-b67d-3c9310c4604d","Type":"ContainerStarted","Data":"0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5"} Mar 08 21:14:51 crc kubenswrapper[4885]: I0308 21:14:51.505952 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fk9ll" podStartSLOduration=8.032965296 podStartE2EDuration="11.50590822s" podCreationTimestamp="2026-03-08 21:14:40 +0000 UTC" firstStartedPulling="2026-03-08 21:14:47.418946298 +0000 UTC m=+6188.815000321" lastFinishedPulling="2026-03-08 21:14:50.891889222 +0000 UTC m=+6192.287943245" observedRunningTime="2026-03-08 21:14:51.493386595 +0000 UTC m=+6192.889440628" watchObservedRunningTime="2026-03-08 21:14:51.50590822 +0000 UTC m=+6192.901962263" Mar 08 21:14:53 crc kubenswrapper[4885]: I0308 21:14:53.564138 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-m8k65" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.163573 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 
21:14:56.164172 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="f7e4501e-3805-4590-b759-f520d3f98787" containerName="openstackclient" containerID="cri-o://974e17a17c8c2918732ff271aeb4290a267934c8e410394a48e09833b501694e" gracePeriod=2 Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.195595 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.243860 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 08 21:14:56 crc kubenswrapper[4885]: E0308 21:14:56.244382 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e4501e-3805-4590-b759-f520d3f98787" containerName="openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.244395 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e4501e-3805-4590-b759-f520d3f98787" containerName="openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.244574 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e4501e-3805-4590-b759-f520d3f98787" containerName="openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.245308 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.257308 4885 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="f7e4501e-3805-4590-b759-f520d3f98787" podUID="beb866d8-13cb-4dd6-9ce8-a2dad0935453" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.270242 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.335049 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjc7t\" (UniqueName: \"kubernetes.io/projected/beb866d8-13cb-4dd6-9ce8-a2dad0935453-kube-api-access-sjc7t\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.335198 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/beb866d8-13cb-4dd6-9ce8-a2dad0935453-openstack-config\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.335227 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/beb866d8-13cb-4dd6-9ce8-a2dad0935453-openstack-config-secret\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.436600 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/beb866d8-13cb-4dd6-9ce8-a2dad0935453-openstack-config\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.436655 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/beb866d8-13cb-4dd6-9ce8-a2dad0935453-openstack-config-secret\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.436711 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjc7t\" (UniqueName: \"kubernetes.io/projected/beb866d8-13cb-4dd6-9ce8-a2dad0935453-kube-api-access-sjc7t\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.437497 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/beb866d8-13cb-4dd6-9ce8-a2dad0935453-openstack-config\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.444383 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/beb866d8-13cb-4dd6-9ce8-a2dad0935453-openstack-config-secret\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.466817 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjc7t\" (UniqueName: \"kubernetes.io/projected/beb866d8-13cb-4dd6-9ce8-a2dad0935453-kube-api-access-sjc7t\") pod \"openstackclient\" (UID: \"beb866d8-13cb-4dd6-9ce8-a2dad0935453\") " pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.507880 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.509066 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.520429 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-mbf6w" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.534159 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.586551 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.651389 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxz8s\" (UniqueName: \"kubernetes.io/projected/d54c8104-6382-4373-a672-8e2ac804ebba-kube-api-access-sxz8s\") pod \"kube-state-metrics-0\" (UID: \"d54c8104-6382-4373-a672-8e2ac804ebba\") " pod="openstack/kube-state-metrics-0" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.753567 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxz8s\" (UniqueName: \"kubernetes.io/projected/d54c8104-6382-4373-a672-8e2ac804ebba-kube-api-access-sxz8s\") pod \"kube-state-metrics-0\" (UID: \"d54c8104-6382-4373-a672-8e2ac804ebba\") " pod="openstack/kube-state-metrics-0" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.791844 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxz8s\" (UniqueName: \"kubernetes.io/projected/d54c8104-6382-4373-a672-8e2ac804ebba-kube-api-access-sxz8s\") pod \"kube-state-metrics-0\" (UID: \"d54c8104-6382-4373-a672-8e2ac804ebba\") " pod="openstack/kube-state-metrics-0" Mar 08 21:14:56 crc kubenswrapper[4885]: I0308 21:14:56.848638 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 08 21:14:57 crc kubenswrapper[4885]: I0308 21:14:57.454743 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 08 21:14:57 crc kubenswrapper[4885]: I0308 21:14:57.504164 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 08 21:14:57 crc kubenswrapper[4885]: I0308 21:14:57.546092 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"beb866d8-13cb-4dd6-9ce8-a2dad0935453","Type":"ContainerStarted","Data":"e6e0d10d5fcd8acc86c54d3629931fc145e0cce2b5b8a547512bea2f630148dd"} Mar 08 21:14:57 crc kubenswrapper[4885]: I0308 21:14:57.547730 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d54c8104-6382-4373-a672-8e2ac804ebba","Type":"ContainerStarted","Data":"911bb8f5c9d1ee0e418d9cedc73f66b111248dddf43de71e7743d3b3f5fef206"} Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.559113 4885 generic.go:334] "Generic (PLEG): container finished" podID="f7e4501e-3805-4590-b759-f520d3f98787" containerID="974e17a17c8c2918732ff271aeb4290a267934c8e410394a48e09833b501694e" exitCode=137 Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.562645 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d54c8104-6382-4373-a672-8e2ac804ebba","Type":"ContainerStarted","Data":"b8672606b0c2bd7251a86f9aab2de4d6445651aaddfe82a993d6a5b63cc19382"} Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.563843 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.565439 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"beb866d8-13cb-4dd6-9ce8-a2dad0935453","Type":"ContainerStarted","Data":"c790888d5b3e8f9b62707fb63c279939546f530dd9f1a42ab63cc7bd52c72d51"} Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.644758 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" 
podStartSLOduration=2.644742883 podStartE2EDuration="2.644742883s" podCreationTimestamp="2026-03-08 21:14:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:14:58.638222268 +0000 UTC m=+6200.034276291" watchObservedRunningTime="2026-03-08 21:14:58.644742883 +0000 UTC m=+6200.040796906" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.648484 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.21186234 podStartE2EDuration="2.648464162s" podCreationTimestamp="2026-03-08 21:14:56 +0000 UTC" firstStartedPulling="2026-03-08 21:14:57.479637724 +0000 UTC m=+6198.875691747" lastFinishedPulling="2026-03-08 21:14:57.916239546 +0000 UTC m=+6199.312293569" observedRunningTime="2026-03-08 21:14:58.60320885 +0000 UTC m=+6199.999262873" watchObservedRunningTime="2026-03-08 21:14:58.648464162 +0000 UTC m=+6200.044518185" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.786214 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.795283 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.797268 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.805404 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.805840 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.806768 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.807061 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-bxpws" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.807338 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.863058 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.917841 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2tgf\" (UniqueName: \"kubernetes.io/projected/f7e4501e-3805-4590-b759-f520d3f98787-kube-api-access-v2tgf\") pod \"f7e4501e-3805-4590-b759-f520d3f98787\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.920464 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config-secret\") pod \"f7e4501e-3805-4590-b759-f520d3f98787\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.921184 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config\") pod \"f7e4501e-3805-4590-b759-f520d3f98787\" (UID: \"f7e4501e-3805-4590-b759-f520d3f98787\") " Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.970850 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.970934 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2cc2\" (UniqueName: \"kubernetes.io/projected/55b083d5-789c-424a-8e11-f5e2e4bc51b0-kube-api-access-l2cc2\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.971071 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/55b083d5-789c-424a-8e11-f5e2e4bc51b0-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.971222 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.971251 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/55b083d5-789c-424a-8e11-f5e2e4bc51b0-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.971293 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/55b083d5-789c-424a-8e11-f5e2e4bc51b0-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.971344 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:58 crc kubenswrapper[4885]: I0308 21:14:58.971633 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e4501e-3805-4590-b759-f520d3f98787-kube-api-access-v2tgf" (OuterVolumeSpecName: "kube-api-access-v2tgf") pod "f7e4501e-3805-4590-b759-f520d3f98787" (UID: "f7e4501e-3805-4590-b759-f520d3f98787"). InnerVolumeSpecName "kube-api-access-v2tgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.039102 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "f7e4501e-3805-4590-b759-f520d3f98787" (UID: "f7e4501e-3805-4590-b759-f520d3f98787"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.073665 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.073724 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2cc2\" (UniqueName: \"kubernetes.io/projected/55b083d5-789c-424a-8e11-f5e2e4bc51b0-kube-api-access-l2cc2\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.073784 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/55b083d5-789c-424a-8e11-f5e2e4bc51b0-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.073848 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.073868 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/55b083d5-789c-424a-8e11-f5e2e4bc51b0-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.073892 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/55b083d5-789c-424a-8e11-f5e2e4bc51b0-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.073948 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.074021 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:14:59 crc 
kubenswrapper[4885]: I0308 21:14:59.074033 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2tgf\" (UniqueName: \"kubernetes.io/projected/f7e4501e-3805-4590-b759-f520d3f98787-kube-api-access-v2tgf\") on node \"crc\" DevicePath \"\"" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.075686 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/55b083d5-789c-424a-8e11-f5e2e4bc51b0-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.079898 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.080569 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/55b083d5-789c-424a-8e11-f5e2e4bc51b0-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.081163 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.082219 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/55b083d5-789c-424a-8e11-f5e2e4bc51b0-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.085440 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/55b083d5-789c-424a-8e11-f5e2e4bc51b0-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.095547 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "f7e4501e-3805-4590-b759-f520d3f98787" (UID: "f7e4501e-3805-4590-b759-f520d3f98787"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.096760 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2cc2\" (UniqueName: \"kubernetes.io/projected/55b083d5-789c-424a-8e11-f5e2e4bc51b0-kube-api-access-l2cc2\") pod \"alertmanager-metric-storage-0\" (UID: \"55b083d5-789c-424a-8e11-f5e2e4bc51b0\") " pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.175411 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f7e4501e-3805-4590-b759-f520d3f98787-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.273739 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.405157 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7e4501e-3805-4590-b759-f520d3f98787" path="/var/lib/kubelet/pods/f7e4501e-3805-4590-b759-f520d3f98787/volumes" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.431165 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.438459 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443305 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443451 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-v95cc" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443526 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443625 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443642 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443318 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443767 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.443820 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.454987 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.573559 4885 scope.go:117] "RemoveContainer" containerID="974e17a17c8c2918732ff271aeb4290a267934c8e410394a48e09833b501694e" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.573732 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.587726 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlpqk\" (UniqueName: \"kubernetes.io/projected/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-kube-api-access-hlpqk\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.590002 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-config\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.606264 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.606388 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.606497 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.606577 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.606666 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.608000 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.608048 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.608150 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.710374 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.710719 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.710769 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.710824 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.710874 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.710941 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.711021 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlpqk\" (UniqueName: \"kubernetes.io/projected/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-kube-api-access-hlpqk\") pod \"prometheus-metric-storage-0\" (UID: 
\"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.711104 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.711129 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-config\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.711181 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.711636 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.711710 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.711796 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.720970 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.723311 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.735755 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlpqk\" (UniqueName: 
\"kubernetes.io/projected/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-kube-api-access-hlpqk\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.737272 4885 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.737310 4885 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/66297ffd1e0f2d6f527b2446228b4ca8e7c611bc8c1afd6b5737c292872e8be2/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.748340 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.752360 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.753413 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/efd2a302-0f57-40e0-9a28-0a1cdfabfc5e-config\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.800050 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1a85d3e7-4f6e-4156-9b90-a81473fee128\") pod \"prometheus-metric-storage-0\" (UID: \"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e\") " pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.813098 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 08 21:14:59 crc kubenswrapper[4885]: I0308 21:14:59.828529 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.138740 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb"] Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.140710 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.143307 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.143517 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.152337 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb"] Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.291385 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 08 21:15:00 crc kubenswrapper[4885]: W0308 21:15:00.293191 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefd2a302_0f57_40e0_9a28_0a1cdfabfc5e.slice/crio-59495d3e6487083aec80beffdf27c36e1c672dc6fcce1b1690dea80a74678d89 WatchSource:0}: Error finding container 59495d3e6487083aec80beffdf27c36e1c672dc6fcce1b1690dea80a74678d89: Status 404 returned error can't find the container with id 59495d3e6487083aec80beffdf27c36e1c672dc6fcce1b1690dea80a74678d89 Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.324575 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ff6jr\" (UniqueName: \"kubernetes.io/projected/e7dec2e5-804e-4bc5-99cc-370c31d352e0-kube-api-access-ff6jr\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.324794 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7dec2e5-804e-4bc5-99cc-370c31d352e0-config-volume\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.324965 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7dec2e5-804e-4bc5-99cc-370c31d352e0-secret-volume\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.426244 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7dec2e5-804e-4bc5-99cc-370c31d352e0-config-volume\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.426366 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7dec2e5-804e-4bc5-99cc-370c31d352e0-secret-volume\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.426419 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff6jr\" (UniqueName: \"kubernetes.io/projected/e7dec2e5-804e-4bc5-99cc-370c31d352e0-kube-api-access-ff6jr\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.427208 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7dec2e5-804e-4bc5-99cc-370c31d352e0-config-volume\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.432905 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7dec2e5-804e-4bc5-99cc-370c31d352e0-secret-volume\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.446611 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ff6jr\" (UniqueName: \"kubernetes.io/projected/e7dec2e5-804e-4bc5-99cc-370c31d352e0-kube-api-access-ff6jr\") pod \"collect-profiles-29550075-dnttb\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.471877 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.586024 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"55b083d5-789c-424a-8e11-f5e2e4bc51b0","Type":"ContainerStarted","Data":"e2bdefa7a0d169d5f24846d48c1cd06ee1ecc3894618d5040aeab1d1cc77ffad"} Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.588005 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e","Type":"ContainerStarted","Data":"59495d3e6487083aec80beffdf27c36e1c672dc6fcce1b1690dea80a74678d89"} Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.802074 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.803238 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:15:00 crc kubenswrapper[4885]: I0308 21:15:00.866019 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:15:01 crc kubenswrapper[4885]: I0308 21:15:01.025225 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb"] Mar 08 21:15:01 crc kubenswrapper[4885]: W0308 21:15:01.027624 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7dec2e5_804e_4bc5_99cc_370c31d352e0.slice/crio-aa9704e95aa370485df58d32c005844fecb8c8ae210dd0f7778054d148eea1e7 WatchSource:0}: Error finding container aa9704e95aa370485df58d32c005844fecb8c8ae210dd0f7778054d148eea1e7: Status 404 returned error can't find the container with id aa9704e95aa370485df58d32c005844fecb8c8ae210dd0f7778054d148eea1e7 Mar 08 21:15:01 crc kubenswrapper[4885]: I0308 21:15:01.597104 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" event={"ID":"e7dec2e5-804e-4bc5-99cc-370c31d352e0","Type":"ContainerStarted","Data":"b737af02da1f4abbd613f83608c2ed474264bbee77babc5263321db11c1a06ed"} Mar 08 21:15:01 crc kubenswrapper[4885]: I0308 21:15:01.597397 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" event={"ID":"e7dec2e5-804e-4bc5-99cc-370c31d352e0","Type":"ContainerStarted","Data":"aa9704e95aa370485df58d32c005844fecb8c8ae210dd0f7778054d148eea1e7"} Mar 08 21:15:01 crc kubenswrapper[4885]: I0308 21:15:01.632812 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" podStartSLOduration=1.6327943120000001 podStartE2EDuration="1.632794312s" podCreationTimestamp="2026-03-08 21:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:15:01.621150429 +0000 UTC m=+6203.017204442" watchObservedRunningTime="2026-03-08 21:15:01.632794312 +0000 UTC m=+6203.028848335" Mar 08 21:15:01 crc kubenswrapper[4885]: I0308 21:15:01.683288 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:15:02 crc 
kubenswrapper[4885]: I0308 21:15:02.215517 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fk9ll"] Mar 08 21:15:02 crc kubenswrapper[4885]: I0308 21:15:02.368644 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:15:02 crc kubenswrapper[4885]: E0308 21:15:02.369212 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:15:02 crc kubenswrapper[4885]: I0308 21:15:02.620538 4885 generic.go:334] "Generic (PLEG): container finished" podID="e7dec2e5-804e-4bc5-99cc-370c31d352e0" containerID="b737af02da1f4abbd613f83608c2ed474264bbee77babc5263321db11c1a06ed" exitCode=0 Mar 08 21:15:02 crc kubenswrapper[4885]: I0308 21:15:02.621913 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" event={"ID":"e7dec2e5-804e-4bc5-99cc-370c31d352e0","Type":"ContainerDied","Data":"b737af02da1f4abbd613f83608c2ed474264bbee77babc5263321db11c1a06ed"} Mar 08 21:15:03 crc kubenswrapper[4885]: I0308 21:15:03.631294 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fk9ll" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="registry-server" containerID="cri-o://0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5" gracePeriod=2 Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.247318 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.253401 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.349415 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff6jr\" (UniqueName: \"kubernetes.io/projected/e7dec2e5-804e-4bc5-99cc-370c31d352e0-kube-api-access-ff6jr\") pod \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.349532 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7dec2e5-804e-4bc5-99cc-370c31d352e0-config-volume\") pod \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.349774 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-utilities\") pod \"613a3b7b-ebce-483d-b67d-3c9310c4604d\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.349808 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-catalog-content\") pod \"613a3b7b-ebce-483d-b67d-3c9310c4604d\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.349868 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7dec2e5-804e-4bc5-99cc-370c31d352e0-secret-volume\") pod \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\" (UID: \"e7dec2e5-804e-4bc5-99cc-370c31d352e0\") " Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.349998 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bwkz\" (UniqueName: \"kubernetes.io/projected/613a3b7b-ebce-483d-b67d-3c9310c4604d-kube-api-access-6bwkz\") pod \"613a3b7b-ebce-483d-b67d-3c9310c4604d\" (UID: \"613a3b7b-ebce-483d-b67d-3c9310c4604d\") " Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.350725 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7dec2e5-804e-4bc5-99cc-370c31d352e0-config-volume" (OuterVolumeSpecName: "config-volume") pod "e7dec2e5-804e-4bc5-99cc-370c31d352e0" (UID: "e7dec2e5-804e-4bc5-99cc-370c31d352e0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.358322 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7dec2e5-804e-4bc5-99cc-370c31d352e0-kube-api-access-ff6jr" (OuterVolumeSpecName: "kube-api-access-ff6jr") pod "e7dec2e5-804e-4bc5-99cc-370c31d352e0" (UID: "e7dec2e5-804e-4bc5-99cc-370c31d352e0"). InnerVolumeSpecName "kube-api-access-ff6jr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.358396 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/613a3b7b-ebce-483d-b67d-3c9310c4604d-kube-api-access-6bwkz" (OuterVolumeSpecName: "kube-api-access-6bwkz") pod "613a3b7b-ebce-483d-b67d-3c9310c4604d" (UID: "613a3b7b-ebce-483d-b67d-3c9310c4604d"). InnerVolumeSpecName "kube-api-access-6bwkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.358423 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7dec2e5-804e-4bc5-99cc-370c31d352e0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e7dec2e5-804e-4bc5-99cc-370c31d352e0" (UID: "e7dec2e5-804e-4bc5-99cc-370c31d352e0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.366568 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-utilities" (OuterVolumeSpecName: "utilities") pod "613a3b7b-ebce-483d-b67d-3c9310c4604d" (UID: "613a3b7b-ebce-483d-b67d-3c9310c4604d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.396005 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "613a3b7b-ebce-483d-b67d-3c9310c4604d" (UID: "613a3b7b-ebce-483d-b67d-3c9310c4604d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.453410 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bwkz\" (UniqueName: \"kubernetes.io/projected/613a3b7b-ebce-483d-b67d-3c9310c4604d-kube-api-access-6bwkz\") on node \"crc\" DevicePath \"\"" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.453444 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff6jr\" (UniqueName: \"kubernetes.io/projected/e7dec2e5-804e-4bc5-99cc-370c31d352e0-kube-api-access-ff6jr\") on node \"crc\" DevicePath \"\"" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.453458 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e7dec2e5-804e-4bc5-99cc-370c31d352e0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.453471 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.453483 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/613a3b7b-ebce-483d-b67d-3c9310c4604d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.453494 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e7dec2e5-804e-4bc5-99cc-370c31d352e0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.645440 4885 generic.go:334] "Generic (PLEG): container finished" podID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerID="0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5" exitCode=0 Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.645582 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fk9ll" 
event={"ID":"613a3b7b-ebce-483d-b67d-3c9310c4604d","Type":"ContainerDied","Data":"0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5"} Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.645623 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fk9ll" event={"ID":"613a3b7b-ebce-483d-b67d-3c9310c4604d","Type":"ContainerDied","Data":"959f3b59f5078928407d48f4de045117a3898b45a52b5bb95261aebb664945ba"} Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.645653 4885 scope.go:117] "RemoveContainer" containerID="0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.645860 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fk9ll" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.656664 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" event={"ID":"e7dec2e5-804e-4bc5-99cc-370c31d352e0","Type":"ContainerDied","Data":"aa9704e95aa370485df58d32c005844fecb8c8ae210dd0f7778054d148eea1e7"} Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.656699 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa9704e95aa370485df58d32c005844fecb8c8ae210dd0f7778054d148eea1e7" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.656805 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.691212 4885 scope.go:117] "RemoveContainer" containerID="cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.718598 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b"] Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.720395 4885 scope.go:117] "RemoveContainer" containerID="6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.731978 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fk9ll"] Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.739103 4885 scope.go:117] "RemoveContainer" containerID="0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5" Mar 08 21:15:04 crc kubenswrapper[4885]: E0308 21:15:04.739449 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5\": container with ID starting with 0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5 not found: ID does not exist" containerID="0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.739482 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5"} err="failed to get container status \"0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5\": rpc error: code = NotFound desc = could not find container \"0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5\": container with ID starting with 
0e5fd04300a1efa898b3a45a6a669014932025fb12c7584b8650be27ab6e50d5 not found: ID does not exist" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.739501 4885 scope.go:117] "RemoveContainer" containerID="cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1" Mar 08 21:15:04 crc kubenswrapper[4885]: E0308 21:15:04.740100 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1\": container with ID starting with cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1 not found: ID does not exist" containerID="cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.740158 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1"} err="failed to get container status \"cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1\": rpc error: code = NotFound desc = could not find container \"cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1\": container with ID starting with cd77f7473e7de0f8c89fde3bac666d1521b82aecbf959ecec7ea52b91da03eb1 not found: ID does not exist" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.740200 4885 scope.go:117] "RemoveContainer" containerID="6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.740294 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550030-n2t6b"] Mar 08 21:15:04 crc kubenswrapper[4885]: E0308 21:15:04.740503 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4\": container with ID starting with 6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4 not found: ID does not exist" containerID="6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.740526 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4"} err="failed to get container status \"6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4\": rpc error: code = NotFound desc = could not find container \"6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4\": container with ID starting with 6cce09fe7115095b05149105724537c9c903a19c911886dfc8cd2c35344c1ab4 not found: ID does not exist" Mar 08 21:15:04 crc kubenswrapper[4885]: I0308 21:15:04.749430 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fk9ll"] Mar 08 21:15:05 crc kubenswrapper[4885]: I0308 21:15:05.388820 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" path="/var/lib/kubelet/pods/613a3b7b-ebce-483d-b67d-3c9310c4604d/volumes" Mar 08 21:15:05 crc kubenswrapper[4885]: I0308 21:15:05.390221 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c873212d-4c8c-4d2c-ad89-be5ff96db764" path="/var/lib/kubelet/pods/c873212d-4c8c-4d2c-ad89-be5ff96db764/volumes" Mar 08 21:15:06 crc kubenswrapper[4885]: I0308 21:15:06.689560 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/alertmanager-metric-storage-0" event={"ID":"55b083d5-789c-424a-8e11-f5e2e4bc51b0","Type":"ContainerStarted","Data":"274a7144e1aac5f2ec96a6fc64db084ea1afaf1a3f51956a214dcff2746684b9"} Mar 08 21:15:06 crc kubenswrapper[4885]: I0308 21:15:06.857053 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 08 21:15:07 crc kubenswrapper[4885]: I0308 21:15:07.713790 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e","Type":"ContainerStarted","Data":"4d100beefd1aa185ac08371a351c60d0ea1b1149bf7ccddb8acd0f5e80c81fd4"} Mar 08 21:15:07 crc kubenswrapper[4885]: I0308 21:15:07.798046 4885 scope.go:117] "RemoveContainer" containerID="ce1aec0cb989ce899ff18178129c96b0d95ae41f48f578bbb62ca6f679d83d8f" Mar 08 21:15:07 crc kubenswrapper[4885]: I0308 21:15:07.849238 4885 scope.go:117] "RemoveContainer" containerID="5a8b5b45c081a377860a6fc52da869749d2af03a3b4e62e944ef9b2a484b5105" Mar 08 21:15:07 crc kubenswrapper[4885]: I0308 21:15:07.899040 4885 scope.go:117] "RemoveContainer" containerID="38c31fe4cedc7f6d10ab5073880d121f8be2e24d735953c27d9d2bfdad42cb59" Mar 08 21:15:07 crc kubenswrapper[4885]: I0308 21:15:07.936072 4885 scope.go:117] "RemoveContainer" containerID="ee3f4f74f30598ca5d1ebc5c4a12e553e2064229a545cf14384e548c26e071ad" Mar 08 21:15:14 crc kubenswrapper[4885]: I0308 21:15:14.369359 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:15:14 crc kubenswrapper[4885]: E0308 21:15:14.370736 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:15:14 crc kubenswrapper[4885]: I0308 21:15:14.800666 4885 generic.go:334] "Generic (PLEG): container finished" podID="55b083d5-789c-424a-8e11-f5e2e4bc51b0" containerID="274a7144e1aac5f2ec96a6fc64db084ea1afaf1a3f51956a214dcff2746684b9" exitCode=0 Mar 08 21:15:14 crc kubenswrapper[4885]: I0308 21:15:14.800884 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"55b083d5-789c-424a-8e11-f5e2e4bc51b0","Type":"ContainerDied","Data":"274a7144e1aac5f2ec96a6fc64db084ea1afaf1a3f51956a214dcff2746684b9"} Mar 08 21:15:14 crc kubenswrapper[4885]: I0308 21:15:14.803368 4885 generic.go:334] "Generic (PLEG): container finished" podID="efd2a302-0f57-40e0-9a28-0a1cdfabfc5e" containerID="4d100beefd1aa185ac08371a351c60d0ea1b1149bf7ccddb8acd0f5e80c81fd4" exitCode=0 Mar 08 21:15:14 crc kubenswrapper[4885]: I0308 21:15:14.803393 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e","Type":"ContainerDied","Data":"4d100beefd1aa185ac08371a351c60d0ea1b1149bf7ccddb8acd0f5e80c81fd4"} Mar 08 21:15:14 crc kubenswrapper[4885]: E0308 21:15:14.862468 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b083d5_789c_424a_8e11_f5e2e4bc51b0.slice/crio-274a7144e1aac5f2ec96a6fc64db084ea1afaf1a3f51956a214dcff2746684b9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b083d5_789c_424a_8e11_f5e2e4bc51b0.slice/crio-conmon-274a7144e1aac5f2ec96a6fc64db084ea1afaf1a3f51956a214dcff2746684b9.scope\": RecentStats: unable to find data in memory cache]" Mar 08 21:15:18 crc kubenswrapper[4885]: I0308 21:15:18.845108 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"55b083d5-789c-424a-8e11-f5e2e4bc51b0","Type":"ContainerStarted","Data":"657c612348a2f491e6f75e8e6c3ff9ef83c529e0953572ff0f884efe8b814589"} Mar 08 21:15:22 crc kubenswrapper[4885]: I0308 21:15:22.936948 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"55b083d5-789c-424a-8e11-f5e2e4bc51b0","Type":"ContainerStarted","Data":"7a6f7c64ec98c538a7b7d12657e48310811462987ecb9920e1e0dc0ec259cf52"} Mar 08 21:15:22 crc kubenswrapper[4885]: I0308 21:15:22.937708 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Mar 08 21:15:22 crc kubenswrapper[4885]: I0308 21:15:22.939291 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e","Type":"ContainerStarted","Data":"5e24498f6f6c1be3bbf0a8e4bf7e7e4c681dc4e986a6d227ba792cd00cbe832f"} Mar 08 21:15:22 crc kubenswrapper[4885]: I0308 21:15:22.943193 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Mar 08 21:15:22 crc kubenswrapper[4885]: I0308 21:15:22.965118 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=7.086727298 podStartE2EDuration="24.965085713s" podCreationTimestamp="2026-03-08 21:14:58 +0000 UTC" firstStartedPulling="2026-03-08 21:14:59.851844807 +0000 UTC m=+6201.247898830" lastFinishedPulling="2026-03-08 21:15:17.730203222 +0000 UTC m=+6219.126257245" observedRunningTime="2026-03-08 21:15:22.9586393 +0000 UTC m=+6224.354693363" watchObservedRunningTime="2026-03-08 21:15:22.965085713 +0000 UTC m=+6224.361139766" Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.071225 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-9f6d-account-create-update-jn2lc"] Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.086677 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-qgblt"] Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.098601 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-58ntc"] Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.113116 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-9f6d-account-create-update-jn2lc"] Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.121169 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-nk6qt"] Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.128578 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-58ntc"] Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.136940 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell0-eab3-account-create-update-8r4nd"] Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.147704 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-nk6qt"] Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.158289 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-eab3-account-create-update-8r4nd"] Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.166805 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-qgblt"] Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.174153 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-cd23-account-create-update-5qvdh"] Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.185891 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-cd23-account-create-update-5qvdh"] Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.383306 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="636cf333-497f-4fcf-9d2d-ebfe48c81d75" path="/var/lib/kubelet/pods/636cf333-497f-4fcf-9d2d-ebfe48c81d75/volumes" Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.383903 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64ea00b6-97bd-459b-ad43-bbfc5862cc4c" path="/var/lib/kubelet/pods/64ea00b6-97bd-459b-ad43-bbfc5862cc4c/volumes" Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.384536 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869febc8-e7d9-4723-bc87-567e08849a27" path="/var/lib/kubelet/pods/869febc8-e7d9-4723-bc87-567e08849a27/volumes" Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.385226 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a6e4793-0be0-4d9f-b96a-c8877648415e" path="/var/lib/kubelet/pods/9a6e4793-0be0-4d9f-b96a-c8877648415e/volumes" Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.386452 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b81f14-560e-4a64-88c7-164fbb0b4f8b" path="/var/lib/kubelet/pods/b7b81f14-560e-4a64-88c7-164fbb0b4f8b/volumes" Mar 08 21:15:25 crc kubenswrapper[4885]: I0308 21:15:25.387189 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bafc9a8b-2cbe-465d-8055-e6c2675b80a4" path="/var/lib/kubelet/pods/bafc9a8b-2cbe-465d-8055-e6c2675b80a4/volumes" Mar 08 21:15:26 crc kubenswrapper[4885]: I0308 21:15:26.368956 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:15:26 crc kubenswrapper[4885]: E0308 21:15:26.369746 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:15:26 crc kubenswrapper[4885]: I0308 21:15:26.992812 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e","Type":"ContainerStarted","Data":"53a623f6ce2d7149679dbc2084be8deda15fe862e5a06fd0f00bcfae38424432"} Mar 08 21:15:31 crc kubenswrapper[4885]: I0308 21:15:31.040697 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"efd2a302-0f57-40e0-9a28-0a1cdfabfc5e","Type":"ContainerStarted","Data":"e80b916b9d03c9c88aa8821bf0a20149e5a099858202885cabc9651d7f25da60"} Mar 08 21:15:31 crc kubenswrapper[4885]: I0308 21:15:31.094891 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=3.246340173 podStartE2EDuration="33.094852905s" podCreationTimestamp="2026-03-08 21:14:58 +0000 UTC" firstStartedPulling="2026-03-08 21:15:00.295266292 +0000 UTC m=+6201.691320305" lastFinishedPulling="2026-03-08 21:15:30.143779014 +0000 UTC m=+6231.539833037" observedRunningTime="2026-03-08 21:15:31.084076026 +0000 UTC m=+6232.480130089" watchObservedRunningTime="2026-03-08 21:15:31.094852905 +0000 UTC m=+6232.490906978" Mar 08 21:15:34 crc kubenswrapper[4885]: I0308 21:15:34.814403 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 08 21:15:35 crc kubenswrapper[4885]: I0308 21:15:35.064321 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7pj4r"] Mar 08 21:15:35 crc kubenswrapper[4885]: I0308 21:15:35.103849 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-7pj4r"] Mar 08 21:15:35 crc kubenswrapper[4885]: I0308 21:15:35.387998 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b06fac1b-774d-4b4d-afd9-58024d9e5903" path="/var/lib/kubelet/pods/b06fac1b-774d-4b4d-afd9-58024d9e5903/volumes" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.452662 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:15:38 crc kubenswrapper[4885]: E0308 21:15:38.453564 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="registry-server" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.453581 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="registry-server" Mar 08 21:15:38 crc kubenswrapper[4885]: E0308 21:15:38.453602 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="extract-utilities" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.453610 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="extract-utilities" Mar 08 21:15:38 crc kubenswrapper[4885]: E0308 21:15:38.453635 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="extract-content" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.453643 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="extract-content" Mar 08 21:15:38 crc kubenswrapper[4885]: E0308 21:15:38.453656 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7dec2e5-804e-4bc5-99cc-370c31d352e0" containerName="collect-profiles" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.453665 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7dec2e5-804e-4bc5-99cc-370c31d352e0" containerName="collect-profiles" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.453949 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7dec2e5-804e-4bc5-99cc-370c31d352e0" containerName="collect-profiles" Mar 08 21:15:38 crc 
kubenswrapper[4885]: I0308 21:15:38.453968 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="613a3b7b-ebce-483d-b67d-3c9310c4604d" containerName="registry-server" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.456407 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.461486 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.461686 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.462451 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.651382 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-log-httpd\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.652266 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-run-httpd\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.652414 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.652508 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-config-data\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.652616 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls9p8\" (UniqueName: \"kubernetes.io/projected/4b866cf8-8618-4c89-baa5-b47d10251b3a-kube-api-access-ls9p8\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.652698 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.652773 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-scripts\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.756146 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-config-data\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.756256 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls9p8\" (UniqueName: \"kubernetes.io/projected/4b866cf8-8618-4c89-baa5-b47d10251b3a-kube-api-access-ls9p8\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.756336 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.756415 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-scripts\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.756567 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-log-httpd\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.756632 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-run-httpd\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.756694 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.765859 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-scripts\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.766897 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.767397 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-log-httpd\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.769913 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-config-data\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.770385 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.780373 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-run-httpd\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:38 crc kubenswrapper[4885]: I0308 21:15:38.794039 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls9p8\" (UniqueName: \"kubernetes.io/projected/4b866cf8-8618-4c89-baa5-b47d10251b3a-kube-api-access-ls9p8\") pod \"ceilometer-0\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " pod="openstack/ceilometer-0" Mar 08 21:15:39 crc kubenswrapper[4885]: I0308 21:15:39.079154 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 21:15:42 crc kubenswrapper[4885]: I0308 21:15:41.369555 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:15:42 crc kubenswrapper[4885]: E0308 21:15:41.370225 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:15:43 crc kubenswrapper[4885]: I0308 21:15:43.023319 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:15:43 crc kubenswrapper[4885]: W0308 21:15:43.042875 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b866cf8_8618_4c89_baa5_b47d10251b3a.slice/crio-6e8205593d7f835249ba0a7533525a04e8d6f972a864dcef29634dfd80f33c1a WatchSource:0}: Error finding container 6e8205593d7f835249ba0a7533525a04e8d6f972a864dcef29634dfd80f33c1a: Status 404 returned error can't find the container with id 6e8205593d7f835249ba0a7533525a04e8d6f972a864dcef29634dfd80f33c1a Mar 08 21:15:43 crc kubenswrapper[4885]: I0308 21:15:43.231367 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerStarted","Data":"6e8205593d7f835249ba0a7533525a04e8d6f972a864dcef29634dfd80f33c1a"} Mar 08 21:15:44 crc kubenswrapper[4885]: I0308 21:15:44.242122 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerStarted","Data":"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c"} Mar 08 21:15:44 crc kubenswrapper[4885]: I0308 21:15:44.814611 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/prometheus-metric-storage-0" Mar 08 21:15:44 crc kubenswrapper[4885]: I0308 21:15:44.816863 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 08 21:15:45 crc kubenswrapper[4885]: I0308 21:15:45.257740 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerStarted","Data":"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559"} Mar 08 21:15:45 crc kubenswrapper[4885]: I0308 21:15:45.258408 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 08 21:15:46 crc kubenswrapper[4885]: I0308 21:15:46.268514 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerStarted","Data":"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7"} Mar 08 21:15:48 crc kubenswrapper[4885]: I0308 21:15:48.290465 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerStarted","Data":"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb"} Mar 08 21:15:48 crc kubenswrapper[4885]: I0308 21:15:48.293392 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 21:15:48 crc kubenswrapper[4885]: I0308 21:15:48.323118 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.81003254 podStartE2EDuration="10.323096694s" podCreationTimestamp="2026-03-08 21:15:38 +0000 UTC" firstStartedPulling="2026-03-08 21:15:43.050846612 +0000 UTC m=+6244.446900645" lastFinishedPulling="2026-03-08 21:15:47.563910756 +0000 UTC m=+6248.959964799" observedRunningTime="2026-03-08 21:15:48.314815492 +0000 UTC m=+6249.710869515" watchObservedRunningTime="2026-03-08 21:15:48.323096694 +0000 UTC m=+6249.719150727" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.368203 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:15:52 crc kubenswrapper[4885]: E0308 21:15:52.368983 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.480199 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-jfnt7"] Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.482327 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.496004 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-jfnt7"] Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.584025 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-148a-account-create-update-vw6hm"] Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.585475 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.592493 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.594606 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-148a-account-create-update-vw6hm"] Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.616576 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpmck\" (UniqueName: \"kubernetes.io/projected/b454a1c4-958a-40a9-8c50-9154281574fd-kube-api-access-lpmck\") pod \"aodh-db-create-jfnt7\" (UID: \"b454a1c4-958a-40a9-8c50-9154281574fd\") " pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.616719 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b454a1c4-958a-40a9-8c50-9154281574fd-operator-scripts\") pod \"aodh-db-create-jfnt7\" (UID: \"b454a1c4-958a-40a9-8c50-9154281574fd\") " pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.719102 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpmck\" (UniqueName: \"kubernetes.io/projected/b454a1c4-958a-40a9-8c50-9154281574fd-kube-api-access-lpmck\") pod \"aodh-db-create-jfnt7\" (UID: \"b454a1c4-958a-40a9-8c50-9154281574fd\") " pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.719168 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92cde86b-0d50-444d-b116-e32fbf5004f9-operator-scripts\") pod \"aodh-148a-account-create-update-vw6hm\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.719296 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b454a1c4-958a-40a9-8c50-9154281574fd-operator-scripts\") pod \"aodh-db-create-jfnt7\" (UID: \"b454a1c4-958a-40a9-8c50-9154281574fd\") " pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.719327 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x7v7\" (UniqueName: \"kubernetes.io/projected/92cde86b-0d50-444d-b116-e32fbf5004f9-kube-api-access-2x7v7\") pod \"aodh-148a-account-create-update-vw6hm\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.719975 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b454a1c4-958a-40a9-8c50-9154281574fd-operator-scripts\") pod \"aodh-db-create-jfnt7\" (UID: \"b454a1c4-958a-40a9-8c50-9154281574fd\") " pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.735877 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpmck\" (UniqueName: \"kubernetes.io/projected/b454a1c4-958a-40a9-8c50-9154281574fd-kube-api-access-lpmck\") pod \"aodh-db-create-jfnt7\" (UID: 
\"b454a1c4-958a-40a9-8c50-9154281574fd\") " pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.799975 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.821268 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x7v7\" (UniqueName: \"kubernetes.io/projected/92cde86b-0d50-444d-b116-e32fbf5004f9-kube-api-access-2x7v7\") pod \"aodh-148a-account-create-update-vw6hm\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.821529 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92cde86b-0d50-444d-b116-e32fbf5004f9-operator-scripts\") pod \"aodh-148a-account-create-update-vw6hm\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.822735 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92cde86b-0d50-444d-b116-e32fbf5004f9-operator-scripts\") pod \"aodh-148a-account-create-update-vw6hm\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.844242 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x7v7\" (UniqueName: \"kubernetes.io/projected/92cde86b-0d50-444d-b116-e32fbf5004f9-kube-api-access-2x7v7\") pod \"aodh-148a-account-create-update-vw6hm\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:52 crc kubenswrapper[4885]: I0308 21:15:52.903990 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:53 crc kubenswrapper[4885]: I0308 21:15:53.044764 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ps8dx"] Mar 08 21:15:53 crc kubenswrapper[4885]: I0308 21:15:53.058862 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-ps8dx"] Mar 08 21:15:53 crc kubenswrapper[4885]: I0308 21:15:53.378015 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eed9605f-3b77-4800-9534-6d8f2654f392" path="/var/lib/kubelet/pods/eed9605f-3b77-4800-9534-6d8f2654f392/volumes" Mar 08 21:15:53 crc kubenswrapper[4885]: I0308 21:15:53.494218 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-148a-account-create-update-vw6hm"] Mar 08 21:15:53 crc kubenswrapper[4885]: I0308 21:15:53.515846 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-jfnt7"] Mar 08 21:15:53 crc kubenswrapper[4885]: W0308 21:15:53.519807 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92cde86b_0d50_444d_b116_e32fbf5004f9.slice/crio-04a122acddde424448a916c04183cdb336cd5189d4fb00a6f8321546583b71ba WatchSource:0}: Error finding container 04a122acddde424448a916c04183cdb336cd5189d4fb00a6f8321546583b71ba: Status 404 returned error can't find the container with id 04a122acddde424448a916c04183cdb336cd5189d4fb00a6f8321546583b71ba Mar 08 21:15:53 crc kubenswrapper[4885]: W0308 21:15:53.534113 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb454a1c4_958a_40a9_8c50_9154281574fd.slice/crio-c5f55e1c4500107d146ee8b0a3f291cab0995b5b5a7af45fe8069d3b3d638104 WatchSource:0}: Error finding container c5f55e1c4500107d146ee8b0a3f291cab0995b5b5a7af45fe8069d3b3d638104: Status 404 returned error can't find the container with id c5f55e1c4500107d146ee8b0a3f291cab0995b5b5a7af45fe8069d3b3d638104 Mar 08 21:15:54 crc kubenswrapper[4885]: I0308 21:15:54.029031 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-vk668"] Mar 08 21:15:54 crc kubenswrapper[4885]: I0308 21:15:54.038832 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-vk668"] Mar 08 21:15:54 crc kubenswrapper[4885]: I0308 21:15:54.356257 4885 generic.go:334] "Generic (PLEG): container finished" podID="b454a1c4-958a-40a9-8c50-9154281574fd" containerID="a35a9b66ff3babcb2662995c86d66b4cb67b7df0bec572a2fefb5352c1e090cb" exitCode=0 Mar 08 21:15:54 crc kubenswrapper[4885]: I0308 21:15:54.356333 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jfnt7" event={"ID":"b454a1c4-958a-40a9-8c50-9154281574fd","Type":"ContainerDied","Data":"a35a9b66ff3babcb2662995c86d66b4cb67b7df0bec572a2fefb5352c1e090cb"} Mar 08 21:15:54 crc kubenswrapper[4885]: I0308 21:15:54.356384 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jfnt7" event={"ID":"b454a1c4-958a-40a9-8c50-9154281574fd","Type":"ContainerStarted","Data":"c5f55e1c4500107d146ee8b0a3f291cab0995b5b5a7af45fe8069d3b3d638104"} Mar 08 21:15:54 crc kubenswrapper[4885]: I0308 21:15:54.358036 4885 generic.go:334] "Generic (PLEG): container finished" podID="92cde86b-0d50-444d-b116-e32fbf5004f9" containerID="870ff46cb1f6250fba56c9497a2a58f99777f85302f8adb2a09cd3289b27392e" exitCode=0 Mar 08 21:15:54 crc 
kubenswrapper[4885]: I0308 21:15:54.358062 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-148a-account-create-update-vw6hm" event={"ID":"92cde86b-0d50-444d-b116-e32fbf5004f9","Type":"ContainerDied","Data":"870ff46cb1f6250fba56c9497a2a58f99777f85302f8adb2a09cd3289b27392e"} Mar 08 21:15:54 crc kubenswrapper[4885]: I0308 21:15:54.358077 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-148a-account-create-update-vw6hm" event={"ID":"92cde86b-0d50-444d-b116-e32fbf5004f9","Type":"ContainerStarted","Data":"04a122acddde424448a916c04183cdb336cd5189d4fb00a6f8321546583b71ba"} Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.391875 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9535a5b-072e-4a1f-b9e4-89942ba9e800" path="/var/lib/kubelet/pods/a9535a5b-072e-4a1f-b9e4-89942ba9e800/volumes" Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.857345 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.866250 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.990733 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x7v7\" (UniqueName: \"kubernetes.io/projected/92cde86b-0d50-444d-b116-e32fbf5004f9-kube-api-access-2x7v7\") pod \"92cde86b-0d50-444d-b116-e32fbf5004f9\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.990816 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92cde86b-0d50-444d-b116-e32fbf5004f9-operator-scripts\") pod \"92cde86b-0d50-444d-b116-e32fbf5004f9\" (UID: \"92cde86b-0d50-444d-b116-e32fbf5004f9\") " Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.990909 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b454a1c4-958a-40a9-8c50-9154281574fd-operator-scripts\") pod \"b454a1c4-958a-40a9-8c50-9154281574fd\" (UID: \"b454a1c4-958a-40a9-8c50-9154281574fd\") " Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.991085 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpmck\" (UniqueName: \"kubernetes.io/projected/b454a1c4-958a-40a9-8c50-9154281574fd-kube-api-access-lpmck\") pod \"b454a1c4-958a-40a9-8c50-9154281574fd\" (UID: \"b454a1c4-958a-40a9-8c50-9154281574fd\") " Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.991697 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b454a1c4-958a-40a9-8c50-9154281574fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b454a1c4-958a-40a9-8c50-9154281574fd" (UID: "b454a1c4-958a-40a9-8c50-9154281574fd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:15:55 crc kubenswrapper[4885]: I0308 21:15:55.991706 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92cde86b-0d50-444d-b116-e32fbf5004f9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92cde86b-0d50-444d-b116-e32fbf5004f9" (UID: "92cde86b-0d50-444d-b116-e32fbf5004f9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.003595 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92cde86b-0d50-444d-b116-e32fbf5004f9-kube-api-access-2x7v7" (OuterVolumeSpecName: "kube-api-access-2x7v7") pod "92cde86b-0d50-444d-b116-e32fbf5004f9" (UID: "92cde86b-0d50-444d-b116-e32fbf5004f9"). InnerVolumeSpecName "kube-api-access-2x7v7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.004678 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b454a1c4-958a-40a9-8c50-9154281574fd-kube-api-access-lpmck" (OuterVolumeSpecName: "kube-api-access-lpmck") pod "b454a1c4-958a-40a9-8c50-9154281574fd" (UID: "b454a1c4-958a-40a9-8c50-9154281574fd"). InnerVolumeSpecName "kube-api-access-lpmck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.093844 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpmck\" (UniqueName: \"kubernetes.io/projected/b454a1c4-958a-40a9-8c50-9154281574fd-kube-api-access-lpmck\") on node \"crc\" DevicePath \"\"" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.093879 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x7v7\" (UniqueName: \"kubernetes.io/projected/92cde86b-0d50-444d-b116-e32fbf5004f9-kube-api-access-2x7v7\") on node \"crc\" DevicePath \"\"" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.093895 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92cde86b-0d50-444d-b116-e32fbf5004f9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.093935 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b454a1c4-958a-40a9-8c50-9154281574fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.384293 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-jfnt7" event={"ID":"b454a1c4-958a-40a9-8c50-9154281574fd","Type":"ContainerDied","Data":"c5f55e1c4500107d146ee8b0a3f291cab0995b5b5a7af45fe8069d3b3d638104"} Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.384326 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-jfnt7" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.384346 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5f55e1c4500107d146ee8b0a3f291cab0995b5b5a7af45fe8069d3b3d638104" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.387649 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-148a-account-create-update-vw6hm" event={"ID":"92cde86b-0d50-444d-b116-e32fbf5004f9","Type":"ContainerDied","Data":"04a122acddde424448a916c04183cdb336cd5189d4fb00a6f8321546583b71ba"} Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.387688 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04a122acddde424448a916c04183cdb336cd5189d4fb00a6f8321546583b71ba" Mar 08 21:15:56 crc kubenswrapper[4885]: I0308 21:15:56.387764 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-148a-account-create-update-vw6hm" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.048960 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-hj6ng"] Mar 08 21:15:58 crc kubenswrapper[4885]: E0308 21:15:58.049634 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b454a1c4-958a-40a9-8c50-9154281574fd" containerName="mariadb-database-create" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.049648 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b454a1c4-958a-40a9-8c50-9154281574fd" containerName="mariadb-database-create" Mar 08 21:15:58 crc kubenswrapper[4885]: E0308 21:15:58.049673 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92cde86b-0d50-444d-b116-e32fbf5004f9" containerName="mariadb-account-create-update" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.049681 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="92cde86b-0d50-444d-b116-e32fbf5004f9" containerName="mariadb-account-create-update" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.049941 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b454a1c4-958a-40a9-8c50-9154281574fd" containerName="mariadb-database-create" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.049974 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="92cde86b-0d50-444d-b116-e32fbf5004f9" containerName="mariadb-account-create-update" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.050679 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.053448 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.053800 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.054127 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.054276 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-sjjm8" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.064906 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-hj6ng"] Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.178675 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-combined-ca-bundle\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.179018 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v2ph\" (UniqueName: \"kubernetes.io/projected/ddddf0b1-83be-4ebb-8318-9d40522a3efb-kube-api-access-8v2ph\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.179318 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-scripts\") pod 
\"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.179569 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-config-data\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.281375 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-combined-ca-bundle\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.281447 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v2ph\" (UniqueName: \"kubernetes.io/projected/ddddf0b1-83be-4ebb-8318-9d40522a3efb-kube-api-access-8v2ph\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.281512 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-scripts\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.281571 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-config-data\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.288290 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-config-data\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.297861 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-combined-ca-bundle\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.300580 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-scripts\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.301901 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v2ph\" (UniqueName: \"kubernetes.io/projected/ddddf0b1-83be-4ebb-8318-9d40522a3efb-kube-api-access-8v2ph\") pod \"aodh-db-sync-hj6ng\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.376470 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:15:58 crc kubenswrapper[4885]: I0308 21:15:58.945827 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-hj6ng"] Mar 08 21:15:59 crc kubenswrapper[4885]: I0308 21:15:59.438510 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hj6ng" event={"ID":"ddddf0b1-83be-4ebb-8318-9d40522a3efb","Type":"ContainerStarted","Data":"379eb3f97d8a6dc650da41a4a095a30908cad88e08aa021d0aaf2eb851809ee9"} Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.140252 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550076-qtxj8"] Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.143006 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550076-qtxj8" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.146387 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.146771 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.147165 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.150203 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550076-qtxj8"] Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.219518 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdd2m\" (UniqueName: \"kubernetes.io/projected/6963ac4b-0b7b-489f-a98a-7bad7270d510-kube-api-access-xdd2m\") pod \"auto-csr-approver-29550076-qtxj8\" (UID: \"6963ac4b-0b7b-489f-a98a-7bad7270d510\") " pod="openshift-infra/auto-csr-approver-29550076-qtxj8" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.322861 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdd2m\" (UniqueName: \"kubernetes.io/projected/6963ac4b-0b7b-489f-a98a-7bad7270d510-kube-api-access-xdd2m\") pod \"auto-csr-approver-29550076-qtxj8\" (UID: \"6963ac4b-0b7b-489f-a98a-7bad7270d510\") " pod="openshift-infra/auto-csr-approver-29550076-qtxj8" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.350052 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdd2m\" (UniqueName: \"kubernetes.io/projected/6963ac4b-0b7b-489f-a98a-7bad7270d510-kube-api-access-xdd2m\") pod \"auto-csr-approver-29550076-qtxj8\" (UID: \"6963ac4b-0b7b-489f-a98a-7bad7270d510\") " pod="openshift-infra/auto-csr-approver-29550076-qtxj8" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.470305 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550076-qtxj8" Mar 08 21:16:00 crc kubenswrapper[4885]: I0308 21:16:00.934992 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550076-qtxj8"] Mar 08 21:16:01 crc kubenswrapper[4885]: I0308 21:16:01.463273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550076-qtxj8" event={"ID":"6963ac4b-0b7b-489f-a98a-7bad7270d510","Type":"ContainerStarted","Data":"ba804b590126d6d6c76dc5b1f8649d5d3916e29998d69f32c2e772c4710233d2"} Mar 08 21:16:04 crc kubenswrapper[4885]: I0308 21:16:04.370171 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:16:04 crc kubenswrapper[4885]: E0308 21:16:04.371340 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:16:05 crc kubenswrapper[4885]: I0308 21:16:05.517424 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hj6ng" event={"ID":"ddddf0b1-83be-4ebb-8318-9d40522a3efb","Type":"ContainerStarted","Data":"50f56d5baf9ae0368c43d2d5d7b045c4c64547d6f51fe21432b96e232f3f2393"} Mar 08 21:16:05 crc kubenswrapper[4885]: I0308 21:16:05.521645 4885 generic.go:334] "Generic (PLEG): container finished" podID="6963ac4b-0b7b-489f-a98a-7bad7270d510" containerID="e527652c3f32f5179c847a20bd0a6dafb8df7997ca784a705d3328979c68ce90" exitCode=0 Mar 08 21:16:05 crc kubenswrapper[4885]: I0308 21:16:05.521687 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550076-qtxj8" event={"ID":"6963ac4b-0b7b-489f-a98a-7bad7270d510","Type":"ContainerDied","Data":"e527652c3f32f5179c847a20bd0a6dafb8df7997ca784a705d3328979c68ce90"} Mar 08 21:16:05 crc kubenswrapper[4885]: I0308 21:16:05.546428 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-hj6ng" podStartSLOduration=2.065984669 podStartE2EDuration="7.546405771s" podCreationTimestamp="2026-03-08 21:15:58 +0000 UTC" firstStartedPulling="2026-03-08 21:15:58.965902455 +0000 UTC m=+6260.361956478" lastFinishedPulling="2026-03-08 21:16:04.446323547 +0000 UTC m=+6265.842377580" observedRunningTime="2026-03-08 21:16:05.538654244 +0000 UTC m=+6266.934708277" watchObservedRunningTime="2026-03-08 21:16:05.546405771 +0000 UTC m=+6266.942459794" Mar 08 21:16:06 crc kubenswrapper[4885]: I0308 21:16:06.936888 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550076-qtxj8" Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.118221 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdd2m\" (UniqueName: \"kubernetes.io/projected/6963ac4b-0b7b-489f-a98a-7bad7270d510-kube-api-access-xdd2m\") pod \"6963ac4b-0b7b-489f-a98a-7bad7270d510\" (UID: \"6963ac4b-0b7b-489f-a98a-7bad7270d510\") " Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.127244 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6963ac4b-0b7b-489f-a98a-7bad7270d510-kube-api-access-xdd2m" (OuterVolumeSpecName: "kube-api-access-xdd2m") pod "6963ac4b-0b7b-489f-a98a-7bad7270d510" (UID: "6963ac4b-0b7b-489f-a98a-7bad7270d510"). InnerVolumeSpecName "kube-api-access-xdd2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.220499 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdd2m\" (UniqueName: \"kubernetes.io/projected/6963ac4b-0b7b-489f-a98a-7bad7270d510-kube-api-access-xdd2m\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.552014 4885 generic.go:334] "Generic (PLEG): container finished" podID="ddddf0b1-83be-4ebb-8318-9d40522a3efb" containerID="50f56d5baf9ae0368c43d2d5d7b045c4c64547d6f51fe21432b96e232f3f2393" exitCode=0 Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.552124 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hj6ng" event={"ID":"ddddf0b1-83be-4ebb-8318-9d40522a3efb","Type":"ContainerDied","Data":"50f56d5baf9ae0368c43d2d5d7b045c4c64547d6f51fe21432b96e232f3f2393"} Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.556506 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550076-qtxj8" event={"ID":"6963ac4b-0b7b-489f-a98a-7bad7270d510","Type":"ContainerDied","Data":"ba804b590126d6d6c76dc5b1f8649d5d3916e29998d69f32c2e772c4710233d2"} Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.556558 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550076-qtxj8" Mar 08 21:16:07 crc kubenswrapper[4885]: I0308 21:16:07.556565 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba804b590126d6d6c76dc5b1f8649d5d3916e29998d69f32c2e772c4710233d2" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.041273 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550070-gjrwt"] Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.054026 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550070-gjrwt"] Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.063811 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-8crmt"] Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.073432 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-8crmt"] Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.153262 4885 scope.go:117] "RemoveContainer" containerID="3f3b93600a59d7fdfedddb2e79ea7fb7eee2ed381b6d60e917ab50e93241509a" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.179479 4885 scope.go:117] "RemoveContainer" containerID="85ace4bbc263d67af4ff24cc59994a076cc980df82df1e2ae92a9834af20ce31" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.233410 4885 scope.go:117] "RemoveContainer" containerID="57d34301e8cc7f8e0b2d448fe0ecba13af188f594f144adc104fb3b5dabb2f60" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.267044 4885 scope.go:117] "RemoveContainer" containerID="8602697feac478750bd9bf6e693b70c9e3f1df0afea0deb7c2804af9bf248c24" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.311223 4885 scope.go:117] "RemoveContainer" containerID="293fcdf5f1f3770069df599650a0ea581f09c4e28effba9f99eb6879ddb6a2a4" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.357535 4885 scope.go:117] "RemoveContainer" containerID="9053532705caa4a801f382164c347679058c8a5255c223b315fac67e8c18e8ef" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.397324 4885 scope.go:117] "RemoveContainer" containerID="085db1d51848063091ed8cc366e74589bc9b1a67399db7aae932f752c5c7bcca" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.432865 4885 scope.go:117] "RemoveContainer" containerID="08cb7392dd836d2cf5e583b01bad8a88a737b02245c6ec9a4a8e07b52e00a8cf" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.458707 4885 scope.go:117] "RemoveContainer" containerID="da2f1a91c9bb51280241448d964496931abf663a64970bb68efa8e74c760d038" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.490655 4885 scope.go:117] "RemoveContainer" containerID="c4c2ca7970045efef5435938f6bb44bd5446c5ce53a852adfb403510ee1a79c2" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.520578 4885 scope.go:117] "RemoveContainer" containerID="2a5e8c0d61eedd0069d39190cdfa7686da395e0e45e1d4b7133ef0d8e637e513" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.542610 4885 scope.go:117] "RemoveContainer" containerID="bdd2c701bb858773f060623b06a914478bf58cb8470912a63df694c3493b2a12" Mar 08 21:16:08 crc kubenswrapper[4885]: I0308 21:16:08.940076 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.060759 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-config-data\") pod \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.061159 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8v2ph\" (UniqueName: \"kubernetes.io/projected/ddddf0b1-83be-4ebb-8318-9d40522a3efb-kube-api-access-8v2ph\") pod \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.061215 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-combined-ca-bundle\") pod \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.061397 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-scripts\") pod \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\" (UID: \"ddddf0b1-83be-4ebb-8318-9d40522a3efb\") " Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.066444 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-scripts" (OuterVolumeSpecName: "scripts") pod "ddddf0b1-83be-4ebb-8318-9d40522a3efb" (UID: "ddddf0b1-83be-4ebb-8318-9d40522a3efb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.066469 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddddf0b1-83be-4ebb-8318-9d40522a3efb-kube-api-access-8v2ph" (OuterVolumeSpecName: "kube-api-access-8v2ph") pod "ddddf0b1-83be-4ebb-8318-9d40522a3efb" (UID: "ddddf0b1-83be-4ebb-8318-9d40522a3efb"). InnerVolumeSpecName "kube-api-access-8v2ph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.089241 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.092089 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-config-data" (OuterVolumeSpecName: "config-data") pod "ddddf0b1-83be-4ebb-8318-9d40522a3efb" (UID: "ddddf0b1-83be-4ebb-8318-9d40522a3efb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.092795 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddddf0b1-83be-4ebb-8318-9d40522a3efb" (UID: "ddddf0b1-83be-4ebb-8318-9d40522a3efb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.163628 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.163664 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8v2ph\" (UniqueName: \"kubernetes.io/projected/ddddf0b1-83be-4ebb-8318-9d40522a3efb-kube-api-access-8v2ph\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.163677 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.163690 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddddf0b1-83be-4ebb-8318-9d40522a3efb-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.382383 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3405968f-173e-4ab2-a8ac-699fdaaad4d3" path="/var/lib/kubelet/pods/3405968f-173e-4ab2-a8ac-699fdaaad4d3/volumes" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.383062 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4497ae4b-d188-4afa-9546-11fbe209a9a7" path="/var/lib/kubelet/pods/4497ae4b-d188-4afa-9546-11fbe209a9a7/volumes" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.626290 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-hj6ng" event={"ID":"ddddf0b1-83be-4ebb-8318-9d40522a3efb","Type":"ContainerDied","Data":"379eb3f97d8a6dc650da41a4a095a30908cad88e08aa021d0aaf2eb851809ee9"} Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.626342 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="379eb3f97d8a6dc650da41a4a095a30908cad88e08aa021d0aaf2eb851809ee9" Mar 08 21:16:09 crc kubenswrapper[4885]: I0308 21:16:09.626423 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-hj6ng" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.248009 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 08 21:16:13 crc kubenswrapper[4885]: E0308 21:16:13.249290 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddddf0b1-83be-4ebb-8318-9d40522a3efb" containerName="aodh-db-sync" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.249313 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddddf0b1-83be-4ebb-8318-9d40522a3efb" containerName="aodh-db-sync" Mar 08 21:16:13 crc kubenswrapper[4885]: E0308 21:16:13.249353 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6963ac4b-0b7b-489f-a98a-7bad7270d510" containerName="oc" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.249364 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="6963ac4b-0b7b-489f-a98a-7bad7270d510" containerName="oc" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.249718 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddddf0b1-83be-4ebb-8318-9d40522a3efb" containerName="aodh-db-sync" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.249740 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="6963ac4b-0b7b-489f-a98a-7bad7270d510" containerName="oc" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.253002 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.261965 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.262156 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.262722 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.269892 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-sjjm8" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.356126 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-scripts\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.356198 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-combined-ca-bundle\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.356489 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-config-data\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.356596 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58ql5\" (UniqueName: \"kubernetes.io/projected/737065cc-3153-4e0c-b4ee-4ad587c8d494-kube-api-access-58ql5\") pod \"aodh-0\" (UID: 
\"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.459090 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-combined-ca-bundle\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.459316 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-config-data\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.459381 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58ql5\" (UniqueName: \"kubernetes.io/projected/737065cc-3153-4e0c-b4ee-4ad587c8d494-kube-api-access-58ql5\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.459502 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-scripts\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.465098 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-config-data\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.468279 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-scripts\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.470556 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/737065cc-3153-4e0c-b4ee-4ad587c8d494-combined-ca-bundle\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.483863 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58ql5\" (UniqueName: \"kubernetes.io/projected/737065cc-3153-4e0c-b4ee-4ad587c8d494-kube-api-access-58ql5\") pod \"aodh-0\" (UID: \"737065cc-3153-4e0c-b4ee-4ad587c8d494\") " pod="openstack/aodh-0" Mar 08 21:16:13 crc kubenswrapper[4885]: I0308 21:16:13.578084 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 08 21:16:14 crc kubenswrapper[4885]: I0308 21:16:14.059503 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 08 21:16:14 crc kubenswrapper[4885]: W0308 21:16:14.062877 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod737065cc_3153_4e0c_b4ee_4ad587c8d494.slice/crio-b5e1f8b6f4fd3af2e56f7ed61c4dc2894f51cfcbf9d5f39f093a1abad8aa5f97 WatchSource:0}: Error finding container b5e1f8b6f4fd3af2e56f7ed61c4dc2894f51cfcbf9d5f39f093a1abad8aa5f97: Status 404 returned error can't find the container with id b5e1f8b6f4fd3af2e56f7ed61c4dc2894f51cfcbf9d5f39f093a1abad8aa5f97 Mar 08 21:16:14 crc kubenswrapper[4885]: I0308 21:16:14.714658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"737065cc-3153-4e0c-b4ee-4ad587c8d494","Type":"ContainerStarted","Data":"b5e1f8b6f4fd3af2e56f7ed61c4dc2894f51cfcbf9d5f39f093a1abad8aa5f97"} Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.435943 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.436583 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-central-agent" containerID="cri-o://b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c" gracePeriod=30 Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.436661 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="sg-core" containerID="cri-o://6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7" gracePeriod=30 Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.436685 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-notification-agent" containerID="cri-o://92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559" gracePeriod=30 Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.436807 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="proxy-httpd" containerID="cri-o://a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb" gracePeriod=30 Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.726557 4885 generic.go:334] "Generic (PLEG): container finished" podID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerID="a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb" exitCode=0 Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.726588 4885 generic.go:334] "Generic (PLEG): container finished" podID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerID="6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7" exitCode=2 Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.726631 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerDied","Data":"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb"} Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.726662 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerDied","Data":"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7"} Mar 08 21:16:15 crc kubenswrapper[4885]: I0308 21:16:15.728480 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"737065cc-3153-4e0c-b4ee-4ad587c8d494","Type":"ContainerStarted","Data":"9e724998056529f62b65f6d60184fb5702abd86897e3dd367c0e895004fbb4ff"} Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.302750 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.368672 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.369052 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.432146 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-sg-core-conf-yaml\") pod \"4b866cf8-8618-4c89-baa5-b47d10251b3a\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.432187 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-run-httpd\") pod \"4b866cf8-8618-4c89-baa5-b47d10251b3a\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.432250 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-combined-ca-bundle\") pod \"4b866cf8-8618-4c89-baa5-b47d10251b3a\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.432333 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-config-data\") pod \"4b866cf8-8618-4c89-baa5-b47d10251b3a\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.432378 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-log-httpd\") pod \"4b866cf8-8618-4c89-baa5-b47d10251b3a\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.432518 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls9p8\" (UniqueName: \"kubernetes.io/projected/4b866cf8-8618-4c89-baa5-b47d10251b3a-kube-api-access-ls9p8\") pod \"4b866cf8-8618-4c89-baa5-b47d10251b3a\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.432544 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-scripts\") pod \"4b866cf8-8618-4c89-baa5-b47d10251b3a\" (UID: \"4b866cf8-8618-4c89-baa5-b47d10251b3a\") " Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.433546 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4b866cf8-8618-4c89-baa5-b47d10251b3a" (UID: "4b866cf8-8618-4c89-baa5-b47d10251b3a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.434796 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4b866cf8-8618-4c89-baa5-b47d10251b3a" (UID: "4b866cf8-8618-4c89-baa5-b47d10251b3a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.438746 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-scripts" (OuterVolumeSpecName: "scripts") pod "4b866cf8-8618-4c89-baa5-b47d10251b3a" (UID: "4b866cf8-8618-4c89-baa5-b47d10251b3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.441357 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b866cf8-8618-4c89-baa5-b47d10251b3a-kube-api-access-ls9p8" (OuterVolumeSpecName: "kube-api-access-ls9p8") pod "4b866cf8-8618-4c89-baa5-b47d10251b3a" (UID: "4b866cf8-8618-4c89-baa5-b47d10251b3a"). InnerVolumeSpecName "kube-api-access-ls9p8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.463536 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4b866cf8-8618-4c89-baa5-b47d10251b3a" (UID: "4b866cf8-8618-4c89-baa5-b47d10251b3a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.518364 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b866cf8-8618-4c89-baa5-b47d10251b3a" (UID: "4b866cf8-8618-4c89-baa5-b47d10251b3a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.535873 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls9p8\" (UniqueName: \"kubernetes.io/projected/4b866cf8-8618-4c89-baa5-b47d10251b3a-kube-api-access-ls9p8\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.535907 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.535935 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.535949 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.535958 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.535967 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b866cf8-8618-4c89-baa5-b47d10251b3a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.542812 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-config-data" (OuterVolumeSpecName: "config-data") pod "4b866cf8-8618-4c89-baa5-b47d10251b3a" (UID: "4b866cf8-8618-4c89-baa5-b47d10251b3a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.638416 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b866cf8-8618-4c89-baa5-b47d10251b3a-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.743640 4885 generic.go:334] "Generic (PLEG): container finished" podID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerID="92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559" exitCode=0 Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.743684 4885 generic.go:334] "Generic (PLEG): container finished" podID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerID="b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c" exitCode=0 Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.743707 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerDied","Data":"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559"} Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.743738 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerDied","Data":"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c"} Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.743751 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b866cf8-8618-4c89-baa5-b47d10251b3a","Type":"ContainerDied","Data":"6e8205593d7f835249ba0a7533525a04e8d6f972a864dcef29634dfd80f33c1a"} Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.743769 4885 scope.go:117] "RemoveContainer" containerID="a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.744126 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.790411 4885 scope.go:117] "RemoveContainer" containerID="6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.790542 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.800265 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.823802 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.824407 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="proxy-httpd" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824425 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="proxy-httpd" Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.824444 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="sg-core" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824453 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="sg-core" Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.824469 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-notification-agent" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824478 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-notification-agent" Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.824499 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-central-agent" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824506 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-central-agent" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824780 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-central-agent" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824804 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="proxy-httpd" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824817 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="ceilometer-notification-agent" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.824826 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" containerName="sg-core" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.827579 4885 scope.go:117] "RemoveContainer" containerID="92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.838096 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.838233 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.844461 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.844497 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.881613 4885 scope.go:117] "RemoveContainer" containerID="b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.921915 4885 scope.go:117] "RemoveContainer" containerID="a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb" Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.922259 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb\": container with ID starting with a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb not found: ID does not exist" containerID="a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.922302 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb"} err="failed to get container status \"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb\": rpc error: code = NotFound desc = could not find container \"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb\": container with ID starting with a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb not found: ID does not exist" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.922332 4885 scope.go:117] "RemoveContainer" containerID="6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7" Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.922650 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7\": container with ID starting with 6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7 not found: ID does not exist" containerID="6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.922684 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7"} err="failed to get container status \"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7\": rpc error: code = NotFound desc = could not find container \"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7\": container with ID starting with 6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7 not found: ID does not exist" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.922704 4885 scope.go:117] "RemoveContainer" containerID="92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559" Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.923048 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559\": container with ID starting with 
92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559 not found: ID does not exist" containerID="92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923076 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559"} err="failed to get container status \"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559\": rpc error: code = NotFound desc = could not find container \"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559\": container with ID starting with 92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559 not found: ID does not exist" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923093 4885 scope.go:117] "RemoveContainer" containerID="b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c" Mar 08 21:16:16 crc kubenswrapper[4885]: E0308 21:16:16.923280 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c\": container with ID starting with b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c not found: ID does not exist" containerID="b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923304 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c"} err="failed to get container status \"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c\": rpc error: code = NotFound desc = could not find container \"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c\": container with ID starting with b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c not found: ID does not exist" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923319 4885 scope.go:117] "RemoveContainer" containerID="a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923511 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb"} err="failed to get container status \"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb\": rpc error: code = NotFound desc = could not find container \"a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb\": container with ID starting with a632c597c88998a5cc1ecc7bbce3eeabbf840a32f883b6ea7fe4d951aab8e6bb not found: ID does not exist" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923529 4885 scope.go:117] "RemoveContainer" containerID="6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923694 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7"} err="failed to get container status \"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7\": rpc error: code = NotFound desc = could not find container \"6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7\": container with ID starting with 6cddaa087db31dda01c28602be8e88d5973940d75fd6ea677d0c92c071d117a7 not found: ID does not exist" Mar 08 
21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923711 4885 scope.go:117] "RemoveContainer" containerID="92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923958 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559"} err="failed to get container status \"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559\": rpc error: code = NotFound desc = could not find container \"92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559\": container with ID starting with 92113d5674988b8af0f448e05ad2c609678f09c5af6dfc02e0d3524429822559 not found: ID does not exist" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.923977 4885 scope.go:117] "RemoveContainer" containerID="b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.924147 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c"} err="failed to get container status \"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c\": rpc error: code = NotFound desc = could not find container \"b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c\": container with ID starting with b1b3e62259ec71be2b4244ca0bb050a4636e7bf819ee216ab7aef792f7008f8c not found: ID does not exist" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.943902 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-config-data\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.944037 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.944124 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-log-httpd\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.944187 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-run-httpd\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.944206 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84gkp\" (UniqueName: \"kubernetes.io/projected/4100eecc-ed79-4489-8a72-ba6a55eec273-kube-api-access-84gkp\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.944222 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-scripts\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:16 crc kubenswrapper[4885]: I0308 21:16:16.944244 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046122 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-log-httpd\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046172 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-run-httpd\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046194 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84gkp\" (UniqueName: \"kubernetes.io/projected/4100eecc-ed79-4489-8a72-ba6a55eec273-kube-api-access-84gkp\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046218 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-scripts\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046247 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046325 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-config-data\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046435 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.046879 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-run-httpd\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.047270 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-log-httpd\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.050267 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.052082 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-config-data\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.059628 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-scripts\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.061388 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.079448 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84gkp\" (UniqueName: \"kubernetes.io/projected/4100eecc-ed79-4489-8a72-ba6a55eec273-kube-api-access-84gkp\") pod \"ceilometer-0\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.166115 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.393162 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b866cf8-8618-4c89-baa5-b47d10251b3a" path="/var/lib/kubelet/pods/4b866cf8-8618-4c89-baa5-b47d10251b3a/volumes" Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.740234 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.755899 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerStarted","Data":"10f08b0d4e88a44af4a08cc4e989bead8ffc2d60df2b2606efa2e0ed7f81eb68"} Mar 08 21:16:17 crc kubenswrapper[4885]: I0308 21:16:17.759419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"737065cc-3153-4e0c-b4ee-4ad587c8d494","Type":"ContainerStarted","Data":"bac3d915e9924359acad5bb536ee4f2c4cabb5219af0b97ff034cdebebc70c73"} Mar 08 21:16:18 crc kubenswrapper[4885]: I0308 21:16:18.768380 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerStarted","Data":"502661d0a2ae8beac2345b6e58d84b4845d66a96d9e98eb6f82b068ae6c89e19"} Mar 08 21:16:19 crc kubenswrapper[4885]: I0308 21:16:19.793989 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerStarted","Data":"cded1028147ed518d3cc5fbc5d57f0c592438c500828d06d48dc484f460dbf22"} Mar 08 21:16:20 crc kubenswrapper[4885]: I0308 21:16:20.807014 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"737065cc-3153-4e0c-b4ee-4ad587c8d494","Type":"ContainerStarted","Data":"0dad9935e459a0b4b052bd27c236f7caa734323487a03d6a4ebb135ad0a89581"} Mar 08 21:16:20 crc kubenswrapper[4885]: I0308 21:16:20.809009 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerStarted","Data":"5e52625e338d70abee579a07cf8c7f1e26c7bb826680c63cfe32a93308e3446e"} Mar 08 21:16:22 crc kubenswrapper[4885]: I0308 21:16:22.831497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"737065cc-3153-4e0c-b4ee-4ad587c8d494","Type":"ContainerStarted","Data":"2d8e5c5442bec2dbdb49d12f71bc9d1b148f5f828eb04262ea4f52b92d8807ed"} Mar 08 21:16:22 crc kubenswrapper[4885]: I0308 21:16:22.871092 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.466749646 podStartE2EDuration="9.871069451s" podCreationTimestamp="2026-03-08 21:16:13 +0000 UTC" firstStartedPulling="2026-03-08 21:16:14.066104917 +0000 UTC m=+6275.462158940" lastFinishedPulling="2026-03-08 21:16:22.470424712 +0000 UTC m=+6283.866478745" observedRunningTime="2026-03-08 21:16:22.867152336 +0000 UTC m=+6284.263206369" watchObservedRunningTime="2026-03-08 21:16:22.871069451 +0000 UTC m=+6284.267123474" Mar 08 21:16:23 crc kubenswrapper[4885]: I0308 21:16:23.843949 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerStarted","Data":"da1dd78b37ae44446a818d4ca7a3cae6318a6e093e913892e1824d2141b2c980"} Mar 08 21:16:23 crc kubenswrapper[4885]: I0308 21:16:23.845205 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ceilometer-0" Mar 08 21:16:23 crc kubenswrapper[4885]: I0308 21:16:23.869056 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.734823961 podStartE2EDuration="7.869034252s" podCreationTimestamp="2026-03-08 21:16:16 +0000 UTC" firstStartedPulling="2026-03-08 21:16:17.742644295 +0000 UTC m=+6279.138698308" lastFinishedPulling="2026-03-08 21:16:22.876854566 +0000 UTC m=+6284.272908599" observedRunningTime="2026-03-08 21:16:23.86596546 +0000 UTC m=+6285.262019783" watchObservedRunningTime="2026-03-08 21:16:23.869034252 +0000 UTC m=+6285.265088275" Mar 08 21:16:28 crc kubenswrapper[4885]: I0308 21:16:28.369043 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:16:28 crc kubenswrapper[4885]: E0308 21:16:28.370107 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.518657 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-wmgbb"] Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.520620 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wmgbb" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.537498 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-wmgbb"] Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.626763 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-b45a-account-create-update-zt9mb"] Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.628271 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-b45a-account-create-update-zt9mb" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.630194 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.647431 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-b45a-account-create-update-zt9mb"] Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.658115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ttq5\" (UniqueName: \"kubernetes.io/projected/451cc09f-d6aa-4930-be69-102ce5b86575-kube-api-access-7ttq5\") pod \"manila-db-create-wmgbb\" (UID: \"451cc09f-d6aa-4930-be69-102ce5b86575\") " pod="openstack/manila-db-create-wmgbb" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.658393 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/451cc09f-d6aa-4930-be69-102ce5b86575-operator-scripts\") pod \"manila-db-create-wmgbb\" (UID: \"451cc09f-d6aa-4930-be69-102ce5b86575\") " pod="openstack/manila-db-create-wmgbb" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.761017 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/835eb61f-3559-41d5-9891-23a6ecef9ed1-operator-scripts\") pod \"manila-b45a-account-create-update-zt9mb\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") " pod="openstack/manila-b45a-account-create-update-zt9mb" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.761346 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ttq5\" (UniqueName: \"kubernetes.io/projected/451cc09f-d6aa-4930-be69-102ce5b86575-kube-api-access-7ttq5\") pod \"manila-db-create-wmgbb\" (UID: \"451cc09f-d6aa-4930-be69-102ce5b86575\") " pod="openstack/manila-db-create-wmgbb" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.761525 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6trw\" (UniqueName: \"kubernetes.io/projected/835eb61f-3559-41d5-9891-23a6ecef9ed1-kube-api-access-g6trw\") pod \"manila-b45a-account-create-update-zt9mb\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") " pod="openstack/manila-b45a-account-create-update-zt9mb" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.761564 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/451cc09f-d6aa-4930-be69-102ce5b86575-operator-scripts\") pod \"manila-db-create-wmgbb\" (UID: \"451cc09f-d6aa-4930-be69-102ce5b86575\") " pod="openstack/manila-db-create-wmgbb" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.762228 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/451cc09f-d6aa-4930-be69-102ce5b86575-operator-scripts\") pod \"manila-db-create-wmgbb\" (UID: \"451cc09f-d6aa-4930-be69-102ce5b86575\") " pod="openstack/manila-db-create-wmgbb" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.798269 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ttq5\" (UniqueName: \"kubernetes.io/projected/451cc09f-d6aa-4930-be69-102ce5b86575-kube-api-access-7ttq5\") pod \"manila-db-create-wmgbb\" (UID: 
\"451cc09f-d6aa-4930-be69-102ce5b86575\") " pod="openstack/manila-db-create-wmgbb" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.840161 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wmgbb" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.867365 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/835eb61f-3559-41d5-9891-23a6ecef9ed1-operator-scripts\") pod \"manila-b45a-account-create-update-zt9mb\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") " pod="openstack/manila-b45a-account-create-update-zt9mb" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.867904 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6trw\" (UniqueName: \"kubernetes.io/projected/835eb61f-3559-41d5-9891-23a6ecef9ed1-kube-api-access-g6trw\") pod \"manila-b45a-account-create-update-zt9mb\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") " pod="openstack/manila-b45a-account-create-update-zt9mb" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.868739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/835eb61f-3559-41d5-9891-23a6ecef9ed1-operator-scripts\") pod \"manila-b45a-account-create-update-zt9mb\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") " pod="openstack/manila-b45a-account-create-update-zt9mb" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.885270 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6trw\" (UniqueName: \"kubernetes.io/projected/835eb61f-3559-41d5-9891-23a6ecef9ed1-kube-api-access-g6trw\") pod \"manila-b45a-account-create-update-zt9mb\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") " pod="openstack/manila-b45a-account-create-update-zt9mb" Mar 08 21:16:29 crc kubenswrapper[4885]: I0308 21:16:29.946959 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-b45a-account-create-update-zt9mb" Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.373537 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-wmgbb"] Mar 08 21:16:30 crc kubenswrapper[4885]: W0308 21:16:30.382541 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod451cc09f_d6aa_4930_be69_102ce5b86575.slice/crio-8b15b9ae9cef79803d78b4d667972db6c8e4784f9084e4bd7fdd781ba7ff23ff WatchSource:0}: Error finding container 8b15b9ae9cef79803d78b4d667972db6c8e4784f9084e4bd7fdd781ba7ff23ff: Status 404 returned error can't find the container with id 8b15b9ae9cef79803d78b4d667972db6c8e4784f9084e4bd7fdd781ba7ff23ff Mar 08 21:16:30 crc kubenswrapper[4885]: W0308 21:16:30.562020 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod835eb61f_3559_41d5_9891_23a6ecef9ed1.slice/crio-a8b4f5d7a4e6b8976a4c6d9f29feadf52df64808f889a7ec6136429faea246e6 WatchSource:0}: Error finding container a8b4f5d7a4e6b8976a4c6d9f29feadf52df64808f889a7ec6136429faea246e6: Status 404 returned error can't find the container with id a8b4f5d7a4e6b8976a4c6d9f29feadf52df64808f889a7ec6136429faea246e6 Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.562291 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-b45a-account-create-update-zt9mb"] Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.916191 4885 generic.go:334] "Generic (PLEG): container finished" podID="835eb61f-3559-41d5-9891-23a6ecef9ed1" containerID="8cfe6ba1dd8d427385a1015c78367bf6a932fa6920ddcb36d2679cfdab2e9416" exitCode=0 Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.916259 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b45a-account-create-update-zt9mb" event={"ID":"835eb61f-3559-41d5-9891-23a6ecef9ed1","Type":"ContainerDied","Data":"8cfe6ba1dd8d427385a1015c78367bf6a932fa6920ddcb36d2679cfdab2e9416"} Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.916287 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b45a-account-create-update-zt9mb" event={"ID":"835eb61f-3559-41d5-9891-23a6ecef9ed1","Type":"ContainerStarted","Data":"a8b4f5d7a4e6b8976a4c6d9f29feadf52df64808f889a7ec6136429faea246e6"} Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.918116 4885 generic.go:334] "Generic (PLEG): container finished" podID="451cc09f-d6aa-4930-be69-102ce5b86575" containerID="ab2bac58c78cebfa3dc65d3179c712fb4a25e9ae89fdc3f09281d9b68706ac0c" exitCode=0 Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.918163 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wmgbb" event={"ID":"451cc09f-d6aa-4930-be69-102ce5b86575","Type":"ContainerDied","Data":"ab2bac58c78cebfa3dc65d3179c712fb4a25e9ae89fdc3f09281d9b68706ac0c"} Mar 08 21:16:30 crc kubenswrapper[4885]: I0308 21:16:30.918186 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wmgbb" event={"ID":"451cc09f-d6aa-4930-be69-102ce5b86575","Type":"ContainerStarted","Data":"8b15b9ae9cef79803d78b4d667972db6c8e4784f9084e4bd7fdd781ba7ff23ff"} Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.452430 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-b45a-account-create-update-zt9mb" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.456891 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wmgbb" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.524965 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/835eb61f-3559-41d5-9891-23a6ecef9ed1-operator-scripts\") pod \"835eb61f-3559-41d5-9891-23a6ecef9ed1\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") " Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.525107 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6trw\" (UniqueName: \"kubernetes.io/projected/835eb61f-3559-41d5-9891-23a6ecef9ed1-kube-api-access-g6trw\") pod \"835eb61f-3559-41d5-9891-23a6ecef9ed1\" (UID: \"835eb61f-3559-41d5-9891-23a6ecef9ed1\") " Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.525388 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/835eb61f-3559-41d5-9891-23a6ecef9ed1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "835eb61f-3559-41d5-9891-23a6ecef9ed1" (UID: "835eb61f-3559-41d5-9891-23a6ecef9ed1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.525972 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/451cc09f-d6aa-4930-be69-102ce5b86575-operator-scripts\") pod \"451cc09f-d6aa-4930-be69-102ce5b86575\" (UID: \"451cc09f-d6aa-4930-be69-102ce5b86575\") " Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.526110 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ttq5\" (UniqueName: \"kubernetes.io/projected/451cc09f-d6aa-4930-be69-102ce5b86575-kube-api-access-7ttq5\") pod \"451cc09f-d6aa-4930-be69-102ce5b86575\" (UID: \"451cc09f-d6aa-4930-be69-102ce5b86575\") " Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.526258 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/451cc09f-d6aa-4930-be69-102ce5b86575-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "451cc09f-d6aa-4930-be69-102ce5b86575" (UID: "451cc09f-d6aa-4930-be69-102ce5b86575"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.526556 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/835eb61f-3559-41d5-9891-23a6ecef9ed1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.526574 4885 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/451cc09f-d6aa-4930-be69-102ce5b86575-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.531223 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/835eb61f-3559-41d5-9891-23a6ecef9ed1-kube-api-access-g6trw" (OuterVolumeSpecName: "kube-api-access-g6trw") pod "835eb61f-3559-41d5-9891-23a6ecef9ed1" (UID: "835eb61f-3559-41d5-9891-23a6ecef9ed1"). 
InnerVolumeSpecName "kube-api-access-g6trw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.531409 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/451cc09f-d6aa-4930-be69-102ce5b86575-kube-api-access-7ttq5" (OuterVolumeSpecName: "kube-api-access-7ttq5") pod "451cc09f-d6aa-4930-be69-102ce5b86575" (UID: "451cc09f-d6aa-4930-be69-102ce5b86575"). InnerVolumeSpecName "kube-api-access-7ttq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.632619 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ttq5\" (UniqueName: \"kubernetes.io/projected/451cc09f-d6aa-4930-be69-102ce5b86575-kube-api-access-7ttq5\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.632704 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6trw\" (UniqueName: \"kubernetes.io/projected/835eb61f-3559-41d5-9891-23a6ecef9ed1-kube-api-access-g6trw\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.941064 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-wmgbb" event={"ID":"451cc09f-d6aa-4930-be69-102ce5b86575","Type":"ContainerDied","Data":"8b15b9ae9cef79803d78b4d667972db6c8e4784f9084e4bd7fdd781ba7ff23ff"} Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.941431 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b15b9ae9cef79803d78b4d667972db6c8e4784f9084e4bd7fdd781ba7ff23ff" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.941254 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-wmgbb" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.943246 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b45a-account-create-update-zt9mb" event={"ID":"835eb61f-3559-41d5-9891-23a6ecef9ed1","Type":"ContainerDied","Data":"a8b4f5d7a4e6b8976a4c6d9f29feadf52df64808f889a7ec6136429faea246e6"} Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.943297 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8b4f5d7a4e6b8976a4c6d9f29feadf52df64808f889a7ec6136429faea246e6" Mar 08 21:16:32 crc kubenswrapper[4885]: I0308 21:16:32.943333 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-b45a-account-create-update-zt9mb" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.921212 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-gtv5s"] Mar 08 21:16:34 crc kubenswrapper[4885]: E0308 21:16:34.922093 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="835eb61f-3559-41d5-9891-23a6ecef9ed1" containerName="mariadb-account-create-update" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.922122 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="835eb61f-3559-41d5-9891-23a6ecef9ed1" containerName="mariadb-account-create-update" Mar 08 21:16:34 crc kubenswrapper[4885]: E0308 21:16:34.922158 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451cc09f-d6aa-4930-be69-102ce5b86575" containerName="mariadb-database-create" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.922167 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="451cc09f-d6aa-4930-be69-102ce5b86575" containerName="mariadb-database-create" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.922414 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="835eb61f-3559-41d5-9891-23a6ecef9ed1" containerName="mariadb-account-create-update" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.922432 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="451cc09f-d6aa-4930-be69-102ce5b86575" containerName="mariadb-database-create" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.923489 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.925455 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-n7qf9" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.927053 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.933560 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-gtv5s"] Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.992087 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb929\" (UniqueName: \"kubernetes.io/projected/4393565c-775a-48fd-a497-602a556ff169-kube-api-access-lb929\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.992221 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-job-config-data\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.992303 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-config-data\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:34 crc kubenswrapper[4885]: I0308 21:16:34.992381 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-combined-ca-bundle\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.094636 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb929\" (UniqueName: \"kubernetes.io/projected/4393565c-775a-48fd-a497-602a556ff169-kube-api-access-lb929\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.094743 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-job-config-data\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.094789 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-config-data\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.094838 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-combined-ca-bundle\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.101102 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-job-config-data\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.101213 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-config-data\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.103624 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-combined-ca-bundle\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.113959 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb929\" (UniqueName: \"kubernetes.io/projected/4393565c-775a-48fd-a497-602a556ff169-kube-api-access-lb929\") pod \"manila-db-sync-gtv5s\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.260527 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.956808 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-gtv5s"] Mar 08 21:16:35 crc kubenswrapper[4885]: I0308 21:16:35.993368 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-gtv5s" event={"ID":"4393565c-775a-48fd-a497-602a556ff169","Type":"ContainerStarted","Data":"bb3f562295247fff6acd82f785de1b44f006957949036c25586952cdd995f575"} Mar 08 21:16:41 crc kubenswrapper[4885]: I0308 21:16:41.368831 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:16:41 crc kubenswrapper[4885]: E0308 21:16:41.369685 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:16:42 crc kubenswrapper[4885]: I0308 21:16:42.063679 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-gtv5s" event={"ID":"4393565c-775a-48fd-a497-602a556ff169","Type":"ContainerStarted","Data":"ff2590b431e04ce466ce231540eb022968990845b1b8a9f29903a084f907a810"} Mar 08 21:16:42 crc kubenswrapper[4885]: I0308 21:16:42.096152 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-gtv5s" podStartSLOduration=2.964253065 podStartE2EDuration="8.096125433s" podCreationTimestamp="2026-03-08 21:16:34 +0000 UTC" firstStartedPulling="2026-03-08 21:16:35.976493258 +0000 UTC m=+6297.372547301" lastFinishedPulling="2026-03-08 21:16:41.108365636 +0000 UTC m=+6302.504419669" observedRunningTime="2026-03-08 21:16:42.085245411 +0000 UTC m=+6303.481299494" watchObservedRunningTime="2026-03-08 21:16:42.096125433 +0000 UTC m=+6303.492179496" Mar 08 21:16:44 crc kubenswrapper[4885]: I0308 21:16:44.087427 4885 generic.go:334] "Generic (PLEG): container finished" podID="4393565c-775a-48fd-a497-602a556ff169" containerID="ff2590b431e04ce466ce231540eb022968990845b1b8a9f29903a084f907a810" exitCode=0 Mar 08 21:16:44 crc kubenswrapper[4885]: I0308 21:16:44.087583 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-gtv5s" event={"ID":"4393565c-775a-48fd-a497-602a556ff169","Type":"ContainerDied","Data":"ff2590b431e04ce466ce231540eb022968990845b1b8a9f29903a084f907a810"} Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.688549 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.745887 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb929\" (UniqueName: \"kubernetes.io/projected/4393565c-775a-48fd-a497-602a556ff169-kube-api-access-lb929\") pod \"4393565c-775a-48fd-a497-602a556ff169\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.746299 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-combined-ca-bundle\") pod \"4393565c-775a-48fd-a497-602a556ff169\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.746451 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-job-config-data\") pod \"4393565c-775a-48fd-a497-602a556ff169\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.746513 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-config-data\") pod \"4393565c-775a-48fd-a497-602a556ff169\" (UID: \"4393565c-775a-48fd-a497-602a556ff169\") " Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.761320 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4393565c-775a-48fd-a497-602a556ff169-kube-api-access-lb929" (OuterVolumeSpecName: "kube-api-access-lb929") pod "4393565c-775a-48fd-a497-602a556ff169" (UID: "4393565c-775a-48fd-a497-602a556ff169"). InnerVolumeSpecName "kube-api-access-lb929". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.761453 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "4393565c-775a-48fd-a497-602a556ff169" (UID: "4393565c-775a-48fd-a497-602a556ff169"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.767482 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-config-data" (OuterVolumeSpecName: "config-data") pod "4393565c-775a-48fd-a497-602a556ff169" (UID: "4393565c-775a-48fd-a497-602a556ff169"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.807223 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4393565c-775a-48fd-a497-602a556ff169" (UID: "4393565c-775a-48fd-a497-602a556ff169"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.851838 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb929\" (UniqueName: \"kubernetes.io/projected/4393565c-775a-48fd-a497-602a556ff169-kube-api-access-lb929\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.851880 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.851898 4885 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-job-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:45 crc kubenswrapper[4885]: I0308 21:16:45.851914 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4393565c-775a-48fd-a497-602a556ff169-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.115436 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-gtv5s" event={"ID":"4393565c-775a-48fd-a497-602a556ff169","Type":"ContainerDied","Data":"bb3f562295247fff6acd82f785de1b44f006957949036c25586952cdd995f575"} Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.115475 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb3f562295247fff6acd82f785de1b44f006957949036c25586952cdd995f575" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.115531 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-gtv5s" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.516249 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Mar 08 21:16:46 crc kubenswrapper[4885]: E0308 21:16:46.516715 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4393565c-775a-48fd-a497-602a556ff169" containerName="manila-db-sync" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.516733 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4393565c-775a-48fd-a497-602a556ff169" containerName="manila-db-sync" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.516943 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4393565c-775a-48fd-a497-602a556ff169" containerName="manila-db-sync" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.518001 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.529521 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-n7qf9" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.537036 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.541412 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.542614 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.561959 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.602521 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.602811 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c0142342-e857-4238-b442-8e06ceb406e1-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.603004 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-config-data\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.603104 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28llg\" (UniqueName: \"kubernetes.io/projected/c0142342-e857-4238-b442-8e06ceb406e1-kube-api-access-28llg\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.603177 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/c0142342-e857-4238-b442-8e06ceb406e1-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.603195 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c0142342-e857-4238-b442-8e06ceb406e1-ceph\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.603305 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-scripts\") pod \"manila-share-share1-0\" (UID: 
\"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.603422 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.603573 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.606308 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.622035 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.633068 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.703311 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c5b968869-pr98k"] Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.704748 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj9dl\" (UniqueName: \"kubernetes.io/projected/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-kube-api-access-jj9dl\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.704806 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.704837 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.704865 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c0142342-e857-4238-b442-8e06ceb406e1-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.704903 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-scripts\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.704945 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-config-data\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " 
pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.704985 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705006 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28llg\" (UniqueName: \"kubernetes.io/projected/c0142342-e857-4238-b442-8e06ceb406e1-kube-api-access-28llg\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705036 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/c0142342-e857-4238-b442-8e06ceb406e1-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705052 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c0142342-e857-4238-b442-8e06ceb406e1-ceph\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705070 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705107 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-config-data\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705130 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-scripts\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705137 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.705145 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.706350 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c0142342-e857-4238-b442-8e06ceb406e1-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.706438 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/c0142342-e857-4238-b442-8e06ceb406e1-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.711960 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c5b968869-pr98k"] Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.713250 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.713496 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/c0142342-e857-4238-b442-8e06ceb406e1-ceph\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.714171 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.718656 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-scripts\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.721594 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c0142342-e857-4238-b442-8e06ceb406e1-config-data\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.731644 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28llg\" (UniqueName: \"kubernetes.io/projected/c0142342-e857-4238-b442-8e06ceb406e1-kube-api-access-28llg\") pod \"manila-share-share1-0\" (UID: \"c0142342-e857-4238-b442-8e06ceb406e1\") " pod="openstack/manila-share-share1-0" Mar 08 21:16:46 
crc kubenswrapper[4885]: I0308 21:16:46.807399 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-scripts\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807487 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-nb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807509 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-dns-svc\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807569 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807611 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807637 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-config\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807663 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-config-data\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807679 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zkhb\" (UniqueName: \"kubernetes.io/projected/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-kube-api-access-6zkhb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807703 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807747 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-jj9dl\" (UniqueName: \"kubernetes.io/projected/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-kube-api-access-jj9dl\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.807768 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-sb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.811021 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.811822 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.813356 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-config-data\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.814624 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-scripts\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.815123 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.834362 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.834947 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.835456 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj9dl\" (UniqueName: \"kubernetes.io/projected/637ae9d4-1fa5-48e0-87d7-5f6004e0352d-kube-api-access-jj9dl\") pod \"manila-scheduler-0\" (UID: \"637ae9d4-1fa5-48e0-87d7-5f6004e0352d\") " pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.836783 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.841069 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.849675 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911273 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss6zg\" (UniqueName: \"kubernetes.io/projected/6cffb553-3b2f-404c-a7da-d481d4635cfc-kube-api-access-ss6zg\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911320 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-nb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911341 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-dns-svc\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911360 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-scripts\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911414 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-config\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911442 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zkhb\" (UniqueName: \"kubernetes.io/projected/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-kube-api-access-6zkhb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911459 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-config-data-custom\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911479 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cffb553-3b2f-404c-a7da-d481d4635cfc-etc-machine-id\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911498 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cffb553-3b2f-404c-a7da-d481d4635cfc-logs\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911513 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911531 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-config-data\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.911572 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-sb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.912965 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-sb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.913072 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-nb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.913613 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-dns-svc\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.913653 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-config\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.928274 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.928624 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zkhb\" (UniqueName: \"kubernetes.io/projected/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-kube-api-access-6zkhb\") pod \"dnsmasq-dns-c5b968869-pr98k\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:46 crc kubenswrapper[4885]: I0308 21:16:46.952577 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.017102 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-config-data-custom\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.017497 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cffb553-3b2f-404c-a7da-d481d4635cfc-etc-machine-id\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.017557 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cffb553-3b2f-404c-a7da-d481d4635cfc-logs\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.017574 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.017598 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-config-data\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.017900 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss6zg\" (UniqueName: \"kubernetes.io/projected/6cffb553-3b2f-404c-a7da-d481d4635cfc-kube-api-access-ss6zg\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.017992 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-scripts\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.018241 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6cffb553-3b2f-404c-a7da-d481d4635cfc-etc-machine-id\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.019362 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6cffb553-3b2f-404c-a7da-d481d4635cfc-logs\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.022171 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-config-data\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " 
pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.026531 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.029503 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-scripts\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.032122 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6cffb553-3b2f-404c-a7da-d481d4635cfc-config-data-custom\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.041027 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss6zg\" (UniqueName: \"kubernetes.io/projected/6cffb553-3b2f-404c-a7da-d481d4635cfc-kube-api-access-ss6zg\") pod \"manila-api-0\" (UID: \"6cffb553-3b2f-404c-a7da-d481d4635cfc\") " pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.265625 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.452335 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.533018 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c5b968869-pr98k"] Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.615380 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Mar 08 21:16:47 crc kubenswrapper[4885]: W0308 21:16:47.628602 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0142342_e857_4238_b442_8e06ceb406e1.slice/crio-e62f2c8c3b2b512a95f4f947a05e28ebd06fac0f96c91c88bb91c12fba46b795 WatchSource:0}: Error finding container e62f2c8c3b2b512a95f4f947a05e28ebd06fac0f96c91c88bb91c12fba46b795: Status 404 returned error can't find the container with id e62f2c8c3b2b512a95f4f947a05e28ebd06fac0f96c91c88bb91c12fba46b795 Mar 08 21:16:47 crc kubenswrapper[4885]: I0308 21:16:47.752432 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Mar 08 21:16:48 crc kubenswrapper[4885]: I0308 21:16:48.145044 4885 generic.go:334] "Generic (PLEG): container finished" podID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerID="b94ccd4d45aaad15e9b13ae53af2356f201fb751b721183d822d6ea73417bf92" exitCode=0 Mar 08 21:16:48 crc kubenswrapper[4885]: I0308 21:16:48.145368 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c5b968869-pr98k" event={"ID":"f9f2857a-2ac8-4c56-8fa0-b153c52309f3","Type":"ContainerDied","Data":"b94ccd4d45aaad15e9b13ae53af2356f201fb751b721183d822d6ea73417bf92"} Mar 08 21:16:48 crc kubenswrapper[4885]: I0308 21:16:48.145395 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c5b968869-pr98k" 
event={"ID":"f9f2857a-2ac8-4c56-8fa0-b153c52309f3","Type":"ContainerStarted","Data":"4bf88b828aac67327d42c79896ad6c9847535e8fc25a30b67b13b6ef37494170"} Mar 08 21:16:48 crc kubenswrapper[4885]: I0308 21:16:48.148809 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"637ae9d4-1fa5-48e0-87d7-5f6004e0352d","Type":"ContainerStarted","Data":"5908dd7b75b520c96c07074b8cd3cf5cc6697d77c8283c37b51c7417dc33522c"} Mar 08 21:16:48 crc kubenswrapper[4885]: I0308 21:16:48.154133 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"c0142342-e857-4238-b442-8e06ceb406e1","Type":"ContainerStarted","Data":"e62f2c8c3b2b512a95f4f947a05e28ebd06fac0f96c91c88bb91c12fba46b795"} Mar 08 21:16:48 crc kubenswrapper[4885]: I0308 21:16:48.272081 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Mar 08 21:16:49 crc kubenswrapper[4885]: I0308 21:16:49.181026 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c5b968869-pr98k" event={"ID":"f9f2857a-2ac8-4c56-8fa0-b153c52309f3","Type":"ContainerStarted","Data":"4cb0262f2854e6a7701317c1b59e47291390cddca3f70f2c87b579f72182540b"} Mar 08 21:16:49 crc kubenswrapper[4885]: I0308 21:16:49.182497 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:49 crc kubenswrapper[4885]: I0308 21:16:49.188597 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"637ae9d4-1fa5-48e0-87d7-5f6004e0352d","Type":"ContainerStarted","Data":"4bc5e6142d8a64deed9eb23359c60a74e47cfa1b5932eeb09144f0dda2b6ebfe"} Mar 08 21:16:49 crc kubenswrapper[4885]: I0308 21:16:49.192596 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6cffb553-3b2f-404c-a7da-d481d4635cfc","Type":"ContainerStarted","Data":"bf33145555bba0904a0dd4847e5d62501369bbc4ca17bc741bde452205631e9a"} Mar 08 21:16:49 crc kubenswrapper[4885]: I0308 21:16:49.192629 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6cffb553-3b2f-404c-a7da-d481d4635cfc","Type":"ContainerStarted","Data":"d0e2a34e3ff9d70b1772a58693a392181e85a637a7fb9b1ec45d84c2a2c82ae1"} Mar 08 21:16:49 crc kubenswrapper[4885]: I0308 21:16:49.210336 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c5b968869-pr98k" podStartSLOduration=3.210304277 podStartE2EDuration="3.210304277s" podCreationTimestamp="2026-03-08 21:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:16:49.201779768 +0000 UTC m=+6310.597833791" watchObservedRunningTime="2026-03-08 21:16:49.210304277 +0000 UTC m=+6310.606358290" Mar 08 21:16:50 crc kubenswrapper[4885]: I0308 21:16:50.210440 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6cffb553-3b2f-404c-a7da-d481d4635cfc","Type":"ContainerStarted","Data":"1ce386361920d0c7a6f99b8b942376ded729b83cebc5715fd1be978bf894402d"} Mar 08 21:16:50 crc kubenswrapper[4885]: I0308 21:16:50.211023 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Mar 08 21:16:50 crc kubenswrapper[4885]: I0308 21:16:50.214395 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" 
event={"ID":"637ae9d4-1fa5-48e0-87d7-5f6004e0352d","Type":"ContainerStarted","Data":"1793ffed25cc65f9adb660f214971358e5b4ada2129dd8c1de8fbfa2649c3b0e"} Mar 08 21:16:50 crc kubenswrapper[4885]: I0308 21:16:50.240843 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.240823929 podStartE2EDuration="4.240823929s" podCreationTimestamp="2026-03-08 21:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:16:50.23043002 +0000 UTC m=+6311.626484043" watchObservedRunningTime="2026-03-08 21:16:50.240823929 +0000 UTC m=+6311.636877952" Mar 08 21:16:50 crc kubenswrapper[4885]: I0308 21:16:50.262497 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.609069605 podStartE2EDuration="4.262471389s" podCreationTimestamp="2026-03-08 21:16:46 +0000 UTC" firstStartedPulling="2026-03-08 21:16:47.785839774 +0000 UTC m=+6309.181893807" lastFinishedPulling="2026-03-08 21:16:48.439241578 +0000 UTC m=+6309.835295591" observedRunningTime="2026-03-08 21:16:50.252361368 +0000 UTC m=+6311.648415391" watchObservedRunningTime="2026-03-08 21:16:50.262471389 +0000 UTC m=+6311.658525402" Mar 08 21:16:52 crc kubenswrapper[4885]: I0308 21:16:52.047245 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-cclgv"] Mar 08 21:16:52 crc kubenswrapper[4885]: I0308 21:16:52.068090 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-2e95-account-create-update-thqbx"] Mar 08 21:16:52 crc kubenswrapper[4885]: I0308 21:16:52.088076 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-2e95-account-create-update-thqbx"] Mar 08 21:16:52 crc kubenswrapper[4885]: I0308 21:16:52.099698 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-cclgv"] Mar 08 21:16:53 crc kubenswrapper[4885]: I0308 21:16:53.373133 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:16:53 crc kubenswrapper[4885]: E0308 21:16:53.374219 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:16:53 crc kubenswrapper[4885]: I0308 21:16:53.384122 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3159e4ac-64da-47ba-9c70-b23214e8b8ad" path="/var/lib/kubelet/pods/3159e4ac-64da-47ba-9c70-b23214e8b8ad/volumes" Mar 08 21:16:53 crc kubenswrapper[4885]: I0308 21:16:53.385787 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96e29afc-72d1-4b29-9528-1ed61feed290" path="/var/lib/kubelet/pods/96e29afc-72d1-4b29-9528-1ed61feed290/volumes" Mar 08 21:16:55 crc kubenswrapper[4885]: I0308 21:16:55.277279 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"c0142342-e857-4238-b442-8e06ceb406e1","Type":"ContainerStarted","Data":"e05a3d48eeef9dbf5cb3e57980aa1eda6f610735ed34f2547835639c0a03c8a7"} Mar 08 21:16:56 crc kubenswrapper[4885]: I0308 21:16:56.295513 4885 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"c0142342-e857-4238-b442-8e06ceb406e1","Type":"ContainerStarted","Data":"0ccfb4090278efacc729d490e7c6f0021e94814713d7f2b7387c65e812f838c2"} Mar 08 21:16:56 crc kubenswrapper[4885]: I0308 21:16:56.330747 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.460177233 podStartE2EDuration="10.330725617s" podCreationTimestamp="2026-03-08 21:16:46 +0000 UTC" firstStartedPulling="2026-03-08 21:16:47.63051156 +0000 UTC m=+6309.026565583" lastFinishedPulling="2026-03-08 21:16:54.501059944 +0000 UTC m=+6315.897113967" observedRunningTime="2026-03-08 21:16:56.324655904 +0000 UTC m=+6317.720709927" watchObservedRunningTime="2026-03-08 21:16:56.330725617 +0000 UTC m=+6317.726779640" Mar 08 21:16:56 crc kubenswrapper[4885]: I0308 21:16:56.835225 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Mar 08 21:16:56 crc kubenswrapper[4885]: I0308 21:16:56.928735 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Mar 08 21:16:56 crc kubenswrapper[4885]: I0308 21:16:56.954760 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.043077 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb747dd7-kqx6n"] Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.043307 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" podUID="938eebde-2664-4ae3-8289-e378affb1274" containerName="dnsmasq-dns" containerID="cri-o://196df1785b0b84f41c7ec30c506906a12662281699b2188f8f32887c1ea78555" gracePeriod=10 Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.311562 4885 generic.go:334] "Generic (PLEG): container finished" podID="938eebde-2664-4ae3-8289-e378affb1274" containerID="196df1785b0b84f41c7ec30c506906a12662281699b2188f8f32887c1ea78555" exitCode=0 Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.311624 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" event={"ID":"938eebde-2664-4ae3-8289-e378affb1274","Type":"ContainerDied","Data":"196df1785b0b84f41c7ec30c506906a12662281699b2188f8f32887c1ea78555"} Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.587476 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.659167 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvqpp\" (UniqueName: \"kubernetes.io/projected/938eebde-2664-4ae3-8289-e378affb1274-kube-api-access-wvqpp\") pod \"938eebde-2664-4ae3-8289-e378affb1274\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.659487 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-dns-svc\") pod \"938eebde-2664-4ae3-8289-e378affb1274\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.659581 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-sb\") pod \"938eebde-2664-4ae3-8289-e378affb1274\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.659709 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-nb\") pod \"938eebde-2664-4ae3-8289-e378affb1274\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.659819 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-config\") pod \"938eebde-2664-4ae3-8289-e378affb1274\" (UID: \"938eebde-2664-4ae3-8289-e378affb1274\") " Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.684740 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/938eebde-2664-4ae3-8289-e378affb1274-kube-api-access-wvqpp" (OuterVolumeSpecName: "kube-api-access-wvqpp") pod "938eebde-2664-4ae3-8289-e378affb1274" (UID: "938eebde-2664-4ae3-8289-e378affb1274"). InnerVolumeSpecName "kube-api-access-wvqpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.718717 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "938eebde-2664-4ae3-8289-e378affb1274" (UID: "938eebde-2664-4ae3-8289-e378affb1274"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.722663 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "938eebde-2664-4ae3-8289-e378affb1274" (UID: "938eebde-2664-4ae3-8289-e378affb1274"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.759226 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "938eebde-2664-4ae3-8289-e378affb1274" (UID: "938eebde-2664-4ae3-8289-e378affb1274"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.762129 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvqpp\" (UniqueName: \"kubernetes.io/projected/938eebde-2664-4ae3-8289-e378affb1274-kube-api-access-wvqpp\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.762160 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.762169 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.762178 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.790617 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-config" (OuterVolumeSpecName: "config") pod "938eebde-2664-4ae3-8289-e378affb1274" (UID: "938eebde-2664-4ae3-8289-e378affb1274"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.855691 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5qw5v"] Mar 08 21:16:57 crc kubenswrapper[4885]: E0308 21:16:57.856637 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938eebde-2664-4ae3-8289-e378affb1274" containerName="init" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.856659 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="938eebde-2664-4ae3-8289-e378affb1274" containerName="init" Mar 08 21:16:57 crc kubenswrapper[4885]: E0308 21:16:57.856676 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="938eebde-2664-4ae3-8289-e378affb1274" containerName="dnsmasq-dns" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.856684 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="938eebde-2664-4ae3-8289-e378affb1274" containerName="dnsmasq-dns" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.857030 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="938eebde-2664-4ae3-8289-e378affb1274" containerName="dnsmasq-dns" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.858727 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.864967 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-utilities\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.865448 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzkmc\" (UniqueName: \"kubernetes.io/projected/8c915ae7-cc70-438e-9578-6d8f75368746-kube-api-access-vzkmc\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.865754 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-catalog-content\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.866072 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/938eebde-2664-4ae3-8289-e378affb1274-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.870168 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qw5v"] Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.967611 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzkmc\" (UniqueName: \"kubernetes.io/projected/8c915ae7-cc70-438e-9578-6d8f75368746-kube-api-access-vzkmc\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.967870 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-catalog-content\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.968066 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-utilities\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.968622 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-catalog-content\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.968713 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-utilities\") pod \"redhat-operators-5qw5v\" 
(UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:57 crc kubenswrapper[4885]: I0308 21:16:57.982646 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzkmc\" (UniqueName: \"kubernetes.io/projected/8c915ae7-cc70-438e-9578-6d8f75368746-kube-api-access-vzkmc\") pod \"redhat-operators-5qw5v\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.184827 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.334996 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" event={"ID":"938eebde-2664-4ae3-8289-e378affb1274","Type":"ContainerDied","Data":"52e08bc4fa9512096f0690af07df832fb34183cd6a6514d48ec0a3244015bcc9"} Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.335047 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75bb747dd7-kqx6n" Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.335270 4885 scope.go:117] "RemoveContainer" containerID="196df1785b0b84f41c7ec30c506906a12662281699b2188f8f32887c1ea78555" Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.395294 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75bb747dd7-kqx6n"] Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.413315 4885 scope.go:117] "RemoveContainer" containerID="12fb871f5a239d7fdcc6ca3f845e422dfc2911258d85c6c9852f5cbe4d01cbdc" Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.418250 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75bb747dd7-kqx6n"] Mar 08 21:16:58 crc kubenswrapper[4885]: I0308 21:16:58.689652 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5qw5v"] Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.348814 4885 generic.go:334] "Generic (PLEG): container finished" podID="8c915ae7-cc70-438e-9578-6d8f75368746" containerID="c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3" exitCode=0 Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.348917 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qw5v" event={"ID":"8c915ae7-cc70-438e-9578-6d8f75368746","Type":"ContainerDied","Data":"c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3"} Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.348974 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qw5v" event={"ID":"8c915ae7-cc70-438e-9578-6d8f75368746","Type":"ContainerStarted","Data":"423561b5afd152a5c60259d35c4878f3807b89bb6c081de0e3c7a84e7abff53a"} Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.393211 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="938eebde-2664-4ae3-8289-e378affb1274" path="/var/lib/kubelet/pods/938eebde-2664-4ae3-8289-e378affb1274/volumes" Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.805987 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.806293 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" 
containerName="ceilometer-central-agent" containerID="cri-o://502661d0a2ae8beac2345b6e58d84b4845d66a96d9e98eb6f82b068ae6c89e19" gracePeriod=30 Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.806733 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="proxy-httpd" containerID="cri-o://da1dd78b37ae44446a818d4ca7a3cae6318a6e093e913892e1824d2141b2c980" gracePeriod=30 Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.806797 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="sg-core" containerID="cri-o://5e52625e338d70abee579a07cf8c7f1e26c7bb826680c63cfe32a93308e3446e" gracePeriod=30 Mar 08 21:16:59 crc kubenswrapper[4885]: I0308 21:16:59.806839 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="ceilometer-notification-agent" containerID="cri-o://cded1028147ed518d3cc5fbc5d57f0c592438c500828d06d48dc484f460dbf22" gracePeriod=30 Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.043452 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-4k8w9"] Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.054302 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-4k8w9"] Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.362899 4885 generic.go:334] "Generic (PLEG): container finished" podID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerID="da1dd78b37ae44446a818d4ca7a3cae6318a6e093e913892e1824d2141b2c980" exitCode=0 Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.364290 4885 generic.go:334] "Generic (PLEG): container finished" podID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerID="5e52625e338d70abee579a07cf8c7f1e26c7bb826680c63cfe32a93308e3446e" exitCode=2 Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.364413 4885 generic.go:334] "Generic (PLEG): container finished" podID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerID="cded1028147ed518d3cc5fbc5d57f0c592438c500828d06d48dc484f460dbf22" exitCode=0 Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.364497 4885 generic.go:334] "Generic (PLEG): container finished" podID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerID="502661d0a2ae8beac2345b6e58d84b4845d66a96d9e98eb6f82b068ae6c89e19" exitCode=0 Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.363035 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerDied","Data":"da1dd78b37ae44446a818d4ca7a3cae6318a6e093e913892e1824d2141b2c980"} Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.364671 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerDied","Data":"5e52625e338d70abee579a07cf8c7f1e26c7bb826680c63cfe32a93308e3446e"} Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.364783 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerDied","Data":"cded1028147ed518d3cc5fbc5d57f0c592438c500828d06d48dc484f460dbf22"} Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.364940 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerDied","Data":"502661d0a2ae8beac2345b6e58d84b4845d66a96d9e98eb6f82b068ae6c89e19"} Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.706715 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.731858 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-combined-ca-bundle\") pod \"4100eecc-ed79-4489-8a72-ba6a55eec273\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.732010 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-sg-core-conf-yaml\") pod \"4100eecc-ed79-4489-8a72-ba6a55eec273\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.732118 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-config-data\") pod \"4100eecc-ed79-4489-8a72-ba6a55eec273\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.732173 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-log-httpd\") pod \"4100eecc-ed79-4489-8a72-ba6a55eec273\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.732201 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84gkp\" (UniqueName: \"kubernetes.io/projected/4100eecc-ed79-4489-8a72-ba6a55eec273-kube-api-access-84gkp\") pod \"4100eecc-ed79-4489-8a72-ba6a55eec273\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.732273 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-run-httpd\") pod \"4100eecc-ed79-4489-8a72-ba6a55eec273\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.732351 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-scripts\") pod \"4100eecc-ed79-4489-8a72-ba6a55eec273\" (UID: \"4100eecc-ed79-4489-8a72-ba6a55eec273\") " Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.732781 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4100eecc-ed79-4489-8a72-ba6a55eec273" (UID: "4100eecc-ed79-4489-8a72-ba6a55eec273"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.733071 4885 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.733077 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4100eecc-ed79-4489-8a72-ba6a55eec273" (UID: "4100eecc-ed79-4489-8a72-ba6a55eec273"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.742322 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-scripts" (OuterVolumeSpecName: "scripts") pod "4100eecc-ed79-4489-8a72-ba6a55eec273" (UID: "4100eecc-ed79-4489-8a72-ba6a55eec273"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.768133 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4100eecc-ed79-4489-8a72-ba6a55eec273-kube-api-access-84gkp" (OuterVolumeSpecName: "kube-api-access-84gkp") pod "4100eecc-ed79-4489-8a72-ba6a55eec273" (UID: "4100eecc-ed79-4489-8a72-ba6a55eec273"). InnerVolumeSpecName "kube-api-access-84gkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.782779 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4100eecc-ed79-4489-8a72-ba6a55eec273" (UID: "4100eecc-ed79-4489-8a72-ba6a55eec273"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.821453 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4100eecc-ed79-4489-8a72-ba6a55eec273" (UID: "4100eecc-ed79-4489-8a72-ba6a55eec273"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.837328 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84gkp\" (UniqueName: \"kubernetes.io/projected/4100eecc-ed79-4489-8a72-ba6a55eec273-kube-api-access-84gkp\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.837360 4885 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4100eecc-ed79-4489-8a72-ba6a55eec273-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.837370 4885 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-scripts\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.837380 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.837388 4885 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.875240 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-config-data" (OuterVolumeSpecName: "config-data") pod "4100eecc-ed79-4489-8a72-ba6a55eec273" (UID: "4100eecc-ed79-4489-8a72-ba6a55eec273"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:17:00 crc kubenswrapper[4885]: I0308 21:17:00.939554 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4100eecc-ed79-4489-8a72-ba6a55eec273-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.380255 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.381427 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6adbeb38-5e1d-43e0-a516-2cc65ad853aa" path="/var/lib/kubelet/pods/6adbeb38-5e1d-43e0-a516-2cc65ad853aa/volumes" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.382288 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qw5v" event={"ID":"8c915ae7-cc70-438e-9578-6d8f75368746","Type":"ContainerStarted","Data":"0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5"} Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.382308 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4100eecc-ed79-4489-8a72-ba6a55eec273","Type":"ContainerDied","Data":"10f08b0d4e88a44af4a08cc4e989bead8ffc2d60df2b2606efa2e0ed7f81eb68"} Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.382328 4885 scope.go:117] "RemoveContainer" containerID="da1dd78b37ae44446a818d4ca7a3cae6318a6e093e913892e1824d2141b2c980" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.407318 4885 scope.go:117] "RemoveContainer" containerID="5e52625e338d70abee579a07cf8c7f1e26c7bb826680c63cfe32a93308e3446e" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.432664 4885 scope.go:117] "RemoveContainer" containerID="cded1028147ed518d3cc5fbc5d57f0c592438c500828d06d48dc484f460dbf22" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.451148 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.467744 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.475816 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:17:01 crc kubenswrapper[4885]: E0308 21:17:01.476380 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="sg-core" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.476404 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="sg-core" Mar 08 21:17:01 crc kubenswrapper[4885]: E0308 21:17:01.476424 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="proxy-httpd" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.476434 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="proxy-httpd" Mar 08 21:17:01 crc kubenswrapper[4885]: E0308 21:17:01.476454 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="ceilometer-notification-agent" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.476462 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="ceilometer-notification-agent" Mar 08 21:17:01 crc kubenswrapper[4885]: E0308 21:17:01.476495 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="ceilometer-central-agent" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.476503 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="ceilometer-central-agent" Mar 08 21:17:01 crc kubenswrapper[4885]: 
I0308 21:17:01.476780 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="ceilometer-central-agent" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.476806 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="proxy-httpd" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.476824 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="ceilometer-notification-agent" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.476837 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" containerName="sg-core" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.479301 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.481487 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.481650 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.483424 4885 scope.go:117] "RemoveContainer" containerID="502661d0a2ae8beac2345b6e58d84b4845d66a96d9e98eb6f82b068ae6c89e19" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.514718 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.552626 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661d1124-50bd-4ad4-95a4-ac90994383b3-run-httpd\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.552677 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.552858 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.553017 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-config-data\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.553151 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgjvg\" (UniqueName: \"kubernetes.io/projected/661d1124-50bd-4ad4-95a4-ac90994383b3-kube-api-access-kgjvg\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.553399 
4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661d1124-50bd-4ad4-95a4-ac90994383b3-log-httpd\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.553546 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-scripts\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.655831 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661d1124-50bd-4ad4-95a4-ac90994383b3-run-httpd\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.656233 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.656364 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.656424 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-config-data\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.656494 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgjvg\" (UniqueName: \"kubernetes.io/projected/661d1124-50bd-4ad4-95a4-ac90994383b3-kube-api-access-kgjvg\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.656613 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661d1124-50bd-4ad4-95a4-ac90994383b3-log-httpd\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.656708 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-scripts\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.656785 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661d1124-50bd-4ad4-95a4-ac90994383b3-run-httpd\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.657424 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/661d1124-50bd-4ad4-95a4-ac90994383b3-log-httpd\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.661569 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.661728 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.661840 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-config-data\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.663339 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/661d1124-50bd-4ad4-95a4-ac90994383b3-scripts\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.690457 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgjvg\" (UniqueName: \"kubernetes.io/projected/661d1124-50bd-4ad4-95a4-ac90994383b3-kube-api-access-kgjvg\") pod \"ceilometer-0\" (UID: \"661d1124-50bd-4ad4-95a4-ac90994383b3\") " pod="openstack/ceilometer-0" Mar 08 21:17:01 crc kubenswrapper[4885]: I0308 21:17:01.804462 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 08 21:17:02 crc kubenswrapper[4885]: I0308 21:17:02.327247 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 08 21:17:02 crc kubenswrapper[4885]: W0308 21:17:02.330273 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod661d1124_50bd_4ad4_95a4_ac90994383b3.slice/crio-032a98a7da7916a9c3259eeaf1325c74a49fdbca0e1e6040e7b4e9b965faef7c WatchSource:0}: Error finding container 032a98a7da7916a9c3259eeaf1325c74a49fdbca0e1e6040e7b4e9b965faef7c: Status 404 returned error can't find the container with id 032a98a7da7916a9c3259eeaf1325c74a49fdbca0e1e6040e7b4e9b965faef7c Mar 08 21:17:02 crc kubenswrapper[4885]: I0308 21:17:02.394343 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661d1124-50bd-4ad4-95a4-ac90994383b3","Type":"ContainerStarted","Data":"032a98a7da7916a9c3259eeaf1325c74a49fdbca0e1e6040e7b4e9b965faef7c"} Mar 08 21:17:03 crc kubenswrapper[4885]: I0308 21:17:03.385411 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4100eecc-ed79-4489-8a72-ba6a55eec273" path="/var/lib/kubelet/pods/4100eecc-ed79-4489-8a72-ba6a55eec273/volumes" Mar 08 21:17:03 crc kubenswrapper[4885]: I0308 21:17:03.407790 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661d1124-50bd-4ad4-95a4-ac90994383b3","Type":"ContainerStarted","Data":"6cf77d48c763816b610e164aea0d0c1598cd60036f5751f2d94ebbac3284180e"} Mar 08 21:17:04 crc kubenswrapper[4885]: I0308 21:17:04.421085 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661d1124-50bd-4ad4-95a4-ac90994383b3","Type":"ContainerStarted","Data":"83f40d359a5c5d9deff7036b1c4582694d09cce554a733effa13ee49e530b09c"} Mar 08 21:17:05 crc kubenswrapper[4885]: I0308 21:17:05.434235 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661d1124-50bd-4ad4-95a4-ac90994383b3","Type":"ContainerStarted","Data":"ea4a6acea05a037e949aedd208b75c95f102be9898efd093fbedbc89544b9c2f"} Mar 08 21:17:06 crc kubenswrapper[4885]: I0308 21:17:06.368390 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:17:06 crc kubenswrapper[4885]: E0308 21:17:06.369674 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:17:07 crc kubenswrapper[4885]: I0308 21:17:07.463105 4885 generic.go:334] "Generic (PLEG): container finished" podID="8c915ae7-cc70-438e-9578-6d8f75368746" containerID="0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5" exitCode=0 Mar 08 21:17:07 crc kubenswrapper[4885]: I0308 21:17:07.463190 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qw5v" event={"ID":"8c915ae7-cc70-438e-9578-6d8f75368746","Type":"ContainerDied","Data":"0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5"} Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.446436 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/manila-share-share1-0" Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.513060 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"661d1124-50bd-4ad4-95a4-ac90994383b3","Type":"ContainerStarted","Data":"8a825c377de04827a2193f45be99eefbc3269492ffad4f010adc059deda60865"} Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.513445 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.522239 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qw5v" event={"ID":"8c915ae7-cc70-438e-9578-6d8f75368746","Type":"ContainerStarted","Data":"b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc"} Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.543778 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.275502039 podStartE2EDuration="7.543758583s" podCreationTimestamp="2026-03-08 21:17:01 +0000 UTC" firstStartedPulling="2026-03-08 21:17:02.332567815 +0000 UTC m=+6323.728621838" lastFinishedPulling="2026-03-08 21:17:07.600824359 +0000 UTC m=+6328.996878382" observedRunningTime="2026-03-08 21:17:08.533820817 +0000 UTC m=+6329.929874840" watchObservedRunningTime="2026-03-08 21:17:08.543758583 +0000 UTC m=+6329.939812606" Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.559337 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5qw5v" podStartSLOduration=3.044326338 podStartE2EDuration="11.55932099s" podCreationTimestamp="2026-03-08 21:16:57 +0000 UTC" firstStartedPulling="2026-03-08 21:16:59.352115254 +0000 UTC m=+6320.748169297" lastFinishedPulling="2026-03-08 21:17:07.867109926 +0000 UTC m=+6329.263163949" observedRunningTime="2026-03-08 21:17:08.548354186 +0000 UTC m=+6329.944408209" watchObservedRunningTime="2026-03-08 21:17:08.55932099 +0000 UTC m=+6329.955375013" Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.677442 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.831833 4885 scope.go:117] "RemoveContainer" containerID="81e3094200cf292808dcdd9d841162dea6305875cbf7d44e4dda3138e170a8d5" Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.868741 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.875608 4885 scope.go:117] "RemoveContainer" containerID="d64f534661786890c391e49d5e099a75e38f19a2ce6774e8bd719218475416e7" Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.917206 4885 scope.go:117] "RemoveContainer" containerID="8930c6f62095bcf638d08901a77c2574bea75083141756a22093eb3aac06abfe" Mar 08 21:17:08 crc kubenswrapper[4885]: I0308 21:17:08.980872 4885 scope.go:117] "RemoveContainer" containerID="b17a0512c9878aeaac8d1e7a329d963d016e56adb51b5c377e47629e2282f0c5" Mar 08 21:17:09 crc kubenswrapper[4885]: I0308 21:17:09.051330 4885 scope.go:117] "RemoveContainer" containerID="488ce5cd91e5332723ee20f8e8bbaf7d336b87f5ba2cbf84286a0e234f08758e" Mar 08 21:17:09 crc kubenswrapper[4885]: I0308 21:17:09.099105 4885 scope.go:117] "RemoveContainer" containerID="d5baab592079e81c7a0f9f2d2a048f773b79e232fa80526c94c402b1d3d147c9" Mar 08 21:17:18 crc kubenswrapper[4885]: I0308 21:17:18.184959 
4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:17:18 crc kubenswrapper[4885]: I0308 21:17:18.185731 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:17:18 crc kubenswrapper[4885]: I0308 21:17:18.368651 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:17:18 crc kubenswrapper[4885]: E0308 21:17:18.369504 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:17:19 crc kubenswrapper[4885]: I0308 21:17:19.231806 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5qw5v" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="registry-server" probeResult="failure" output=< Mar 08 21:17:19 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 21:17:19 crc kubenswrapper[4885]: > Mar 08 21:17:28 crc kubenswrapper[4885]: I0308 21:17:28.246374 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:17:28 crc kubenswrapper[4885]: I0308 21:17:28.301326 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:17:29 crc kubenswrapper[4885]: I0308 21:17:29.053565 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5qw5v"] Mar 08 21:17:29 crc kubenswrapper[4885]: I0308 21:17:29.765108 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5qw5v" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="registry-server" containerID="cri-o://b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc" gracePeriod=2 Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.369054 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:17:30 crc kubenswrapper[4885]: E0308 21:17:30.370081 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.469266 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.503466 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzkmc\" (UniqueName: \"kubernetes.io/projected/8c915ae7-cc70-438e-9578-6d8f75368746-kube-api-access-vzkmc\") pod \"8c915ae7-cc70-438e-9578-6d8f75368746\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.503956 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-catalog-content\") pod \"8c915ae7-cc70-438e-9578-6d8f75368746\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.504118 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-utilities\") pod \"8c915ae7-cc70-438e-9578-6d8f75368746\" (UID: \"8c915ae7-cc70-438e-9578-6d8f75368746\") " Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.507399 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-utilities" (OuterVolumeSpecName: "utilities") pod "8c915ae7-cc70-438e-9578-6d8f75368746" (UID: "8c915ae7-cc70-438e-9578-6d8f75368746"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.509227 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c915ae7-cc70-438e-9578-6d8f75368746-kube-api-access-vzkmc" (OuterVolumeSpecName: "kube-api-access-vzkmc") pod "8c915ae7-cc70-438e-9578-6d8f75368746" (UID: "8c915ae7-cc70-438e-9578-6d8f75368746"). InnerVolumeSpecName "kube-api-access-vzkmc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.607439 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.607511 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzkmc\" (UniqueName: \"kubernetes.io/projected/8c915ae7-cc70-438e-9578-6d8f75368746-kube-api-access-vzkmc\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.652689 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8c915ae7-cc70-438e-9578-6d8f75368746" (UID: "8c915ae7-cc70-438e-9578-6d8f75368746"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.709376 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8c915ae7-cc70-438e-9578-6d8f75368746-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.777877 4885 generic.go:334] "Generic (PLEG): container finished" podID="8c915ae7-cc70-438e-9578-6d8f75368746" containerID="b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc" exitCode=0 Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.778031 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5qw5v" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.778045 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qw5v" event={"ID":"8c915ae7-cc70-438e-9578-6d8f75368746","Type":"ContainerDied","Data":"b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc"} Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.778198 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5qw5v" event={"ID":"8c915ae7-cc70-438e-9578-6d8f75368746","Type":"ContainerDied","Data":"423561b5afd152a5c60259d35c4878f3807b89bb6c081de0e3c7a84e7abff53a"} Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.778254 4885 scope.go:117] "RemoveContainer" containerID="b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.814476 4885 scope.go:117] "RemoveContainer" containerID="0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.840401 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5qw5v"] Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.856284 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5qw5v"] Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.867521 4885 scope.go:117] "RemoveContainer" containerID="c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.933253 4885 scope.go:117] "RemoveContainer" containerID="b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc" Mar 08 21:17:30 crc kubenswrapper[4885]: E0308 21:17:30.933788 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc\": container with ID starting with b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc not found: ID does not exist" containerID="b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.933839 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc"} err="failed to get container status \"b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc\": rpc error: code = NotFound desc = could not find container \"b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc\": container with ID starting with b883d6b1bc3b3b35f64a7448f2a75aa0cf6170ca793b4f147dd7d9da109615fc not found: ID does not exist" Mar 08 21:17:30 crc 
kubenswrapper[4885]: I0308 21:17:30.933874 4885 scope.go:117] "RemoveContainer" containerID="0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5" Mar 08 21:17:30 crc kubenswrapper[4885]: E0308 21:17:30.934405 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5\": container with ID starting with 0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5 not found: ID does not exist" containerID="0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.934443 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5"} err="failed to get container status \"0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5\": rpc error: code = NotFound desc = could not find container \"0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5\": container with ID starting with 0559f4e45b3011d15202bfe64e262c2b02e18bc096fe410323a6f19aa05604e5 not found: ID does not exist" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.934471 4885 scope.go:117] "RemoveContainer" containerID="c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3" Mar 08 21:17:30 crc kubenswrapper[4885]: E0308 21:17:30.934770 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3\": container with ID starting with c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3 not found: ID does not exist" containerID="c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3" Mar 08 21:17:30 crc kubenswrapper[4885]: I0308 21:17:30.934808 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3"} err="failed to get container status \"c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3\": rpc error: code = NotFound desc = could not find container \"c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3\": container with ID starting with c6afdaf5a03a656e6116f738ca44adca4753469352b9fff873c13c5c3ebca2a3 not found: ID does not exist" Mar 08 21:17:31 crc kubenswrapper[4885]: I0308 21:17:31.384892 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" path="/var/lib/kubelet/pods/8c915ae7-cc70-438e-9578-6d8f75368746/volumes" Mar 08 21:17:31 crc kubenswrapper[4885]: I0308 21:17:31.821326 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 08 21:17:43 crc kubenswrapper[4885]: I0308 21:17:43.369427 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:17:43 crc kubenswrapper[4885]: E0308 21:17:43.370530 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 
21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.288063 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85b9ffd899-kzvp9"] Mar 08 21:17:52 crc kubenswrapper[4885]: E0308 21:17:52.289091 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="extract-utilities" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.289108 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="extract-utilities" Mar 08 21:17:52 crc kubenswrapper[4885]: E0308 21:17:52.289137 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="registry-server" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.289145 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="registry-server" Mar 08 21:17:52 crc kubenswrapper[4885]: E0308 21:17:52.289182 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="extract-content" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.289191 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="extract-content" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.289455 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c915ae7-cc70-438e-9578-6d8f75368746" containerName="registry-server" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.290851 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.294364 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.329267 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-config\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.329326 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-sb\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.329401 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-openstack-cell1\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.329422 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd22h\" (UniqueName: \"kubernetes.io/projected/784dc7bd-2dba-4638-881d-7bb4b97fa26f-kube-api-access-jd22h\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc 
kubenswrapper[4885]: I0308 21:17:52.329506 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-dns-svc\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.329544 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-nb\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.352214 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85b9ffd899-kzvp9"] Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.432418 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-openstack-cell1\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.432471 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd22h\" (UniqueName: \"kubernetes.io/projected/784dc7bd-2dba-4638-881d-7bb4b97fa26f-kube-api-access-jd22h\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.432599 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-dns-svc\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.432654 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-nb\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.432687 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-config\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.432722 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-sb\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.433795 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-dns-svc\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: 
\"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.433832 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-config\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.434489 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-openstack-cell1\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.434696 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-nb\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.434742 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-sb\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.452690 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd22h\" (UniqueName: \"kubernetes.io/projected/784dc7bd-2dba-4638-881d-7bb4b97fa26f-kube-api-access-jd22h\") pod \"dnsmasq-dns-85b9ffd899-kzvp9\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:52 crc kubenswrapper[4885]: I0308 21:17:52.657189 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:53 crc kubenswrapper[4885]: I0308 21:17:53.185475 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85b9ffd899-kzvp9"] Mar 08 21:17:54 crc kubenswrapper[4885]: I0308 21:17:54.050878 4885 generic.go:334] "Generic (PLEG): container finished" podID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerID="cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516" exitCode=0 Mar 08 21:17:54 crc kubenswrapper[4885]: I0308 21:17:54.051052 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" event={"ID":"784dc7bd-2dba-4638-881d-7bb4b97fa26f","Type":"ContainerDied","Data":"cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516"} Mar 08 21:17:54 crc kubenswrapper[4885]: I0308 21:17:54.051279 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" event={"ID":"784dc7bd-2dba-4638-881d-7bb4b97fa26f","Type":"ContainerStarted","Data":"f44206b8ebbf0693d2d4ad48b7c945bd77de0d1345fd96143b142bf69f386619"} Mar 08 21:17:55 crc kubenswrapper[4885]: I0308 21:17:55.067124 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" event={"ID":"784dc7bd-2dba-4638-881d-7bb4b97fa26f","Type":"ContainerStarted","Data":"8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697"} Mar 08 21:17:55 crc kubenswrapper[4885]: I0308 21:17:55.067403 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:17:55 crc kubenswrapper[4885]: I0308 21:17:55.096831 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" podStartSLOduration=3.096809621 podStartE2EDuration="3.096809621s" podCreationTimestamp="2026-03-08 21:17:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:17:55.090452141 +0000 UTC m=+6376.486506174" watchObservedRunningTime="2026-03-08 21:17:55.096809621 +0000 UTC m=+6376.492863654" Mar 08 21:17:56 crc kubenswrapper[4885]: I0308 21:17:56.369113 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:17:56 crc kubenswrapper[4885]: E0308 21:17:56.369791 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.145251 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550078-gspxv"] Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.147553 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550078-gspxv" Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.149631 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.149649 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.150425 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.157733 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550078-gspxv"] Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.228779 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qlcw\" (UniqueName: \"kubernetes.io/projected/1f0d7d35-3a1c-4505-8fa3-190d8ec038ee-kube-api-access-6qlcw\") pod \"auto-csr-approver-29550078-gspxv\" (UID: \"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee\") " pod="openshift-infra/auto-csr-approver-29550078-gspxv" Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.332977 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qlcw\" (UniqueName: \"kubernetes.io/projected/1f0d7d35-3a1c-4505-8fa3-190d8ec038ee-kube-api-access-6qlcw\") pod \"auto-csr-approver-29550078-gspxv\" (UID: \"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee\") " pod="openshift-infra/auto-csr-approver-29550078-gspxv" Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.354268 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qlcw\" (UniqueName: \"kubernetes.io/projected/1f0d7d35-3a1c-4505-8fa3-190d8ec038ee-kube-api-access-6qlcw\") pod \"auto-csr-approver-29550078-gspxv\" (UID: \"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee\") " pod="openshift-infra/auto-csr-approver-29550078-gspxv" Mar 08 21:18:00 crc kubenswrapper[4885]: I0308 21:18:00.469146 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550078-gspxv" Mar 08 21:18:01 crc kubenswrapper[4885]: I0308 21:18:01.032052 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550078-gspxv"] Mar 08 21:18:01 crc kubenswrapper[4885]: I0308 21:18:01.144379 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550078-gspxv" event={"ID":"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee","Type":"ContainerStarted","Data":"f23b90c06bbc8a98015e798c494ff9c9bd900ab4e2ab5f9f208b0410de047ce4"} Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.658073 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.795871 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c5b968869-pr98k"] Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.796402 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-c5b968869-pr98k" podUID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerName="dnsmasq-dns" containerID="cri-o://4cb0262f2854e6a7701317c1b59e47291390cddca3f70f2c87b579f72182540b" gracePeriod=10 Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.885013 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-579b4494b9-nwf4n"] Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.887512 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.896365 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-579b4494b9-nwf4n"] Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.935707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-ovsdbserver-sb\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.935756 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-dns-svc\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.935809 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-config\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.935845 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-openstack-cell1\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.935889 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-ovsdbserver-nb\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:02 crc kubenswrapper[4885]: I0308 21:18:02.935973 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd5rw\" (UniqueName: \"kubernetes.io/projected/cb658095-55a6-4c1a-a84b-23ad21d14212-kube-api-access-zd5rw\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.037956 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-ovsdbserver-nb\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.038060 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd5rw\" (UniqueName: \"kubernetes.io/projected/cb658095-55a6-4c1a-a84b-23ad21d14212-kube-api-access-zd5rw\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.038188 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-ovsdbserver-sb\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.038216 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-dns-svc\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.038270 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-config\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.038333 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-openstack-cell1\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.038851 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-ovsdbserver-nb\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.039308 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: 
\"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-openstack-cell1\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.039444 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-config\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.039884 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-ovsdbserver-sb\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.040758 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb658095-55a6-4c1a-a84b-23ad21d14212-dns-svc\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.057784 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd5rw\" (UniqueName: \"kubernetes.io/projected/cb658095-55a6-4c1a-a84b-23ad21d14212-kube-api-access-zd5rw\") pod \"dnsmasq-dns-579b4494b9-nwf4n\" (UID: \"cb658095-55a6-4c1a-a84b-23ad21d14212\") " pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.179703 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550078-gspxv" event={"ID":"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee","Type":"ContainerStarted","Data":"cdd1e2f0b8ab5f05aa992613d4a0e2df88f958ed88a6e2f7da9cadebeb33bfdc"} Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.187544 4885 generic.go:334] "Generic (PLEG): container finished" podID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerID="4cb0262f2854e6a7701317c1b59e47291390cddca3f70f2c87b579f72182540b" exitCode=0 Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.187588 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c5b968869-pr98k" event={"ID":"f9f2857a-2ac8-4c56-8fa0-b153c52309f3","Type":"ContainerDied","Data":"4cb0262f2854e6a7701317c1b59e47291390cddca3f70f2c87b579f72182540b"} Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.209602 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550078-gspxv" podStartSLOduration=2.093888466 podStartE2EDuration="3.209583402s" podCreationTimestamp="2026-03-08 21:18:00 +0000 UTC" firstStartedPulling="2026-03-08 21:18:01.046659915 +0000 UTC m=+6382.442713948" lastFinishedPulling="2026-03-08 21:18:02.162354851 +0000 UTC m=+6383.558408884" observedRunningTime="2026-03-08 21:18:03.194979091 +0000 UTC m=+6384.591033114" watchObservedRunningTime="2026-03-08 21:18:03.209583402 +0000 UTC m=+6384.605637415" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.210651 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.465661 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.657014 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-dns-svc\") pod \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.657389 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-nb\") pod \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.657472 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-sb\") pod \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.657490 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zkhb\" (UniqueName: \"kubernetes.io/projected/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-kube-api-access-6zkhb\") pod \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.657574 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-config\") pod \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\" (UID: \"f9f2857a-2ac8-4c56-8fa0-b153c52309f3\") " Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.714274 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-kube-api-access-6zkhb" (OuterVolumeSpecName: "kube-api-access-6zkhb") pod "f9f2857a-2ac8-4c56-8fa0-b153c52309f3" (UID: "f9f2857a-2ac8-4c56-8fa0-b153c52309f3"). InnerVolumeSpecName "kube-api-access-6zkhb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.726934 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-config" (OuterVolumeSpecName: "config") pod "f9f2857a-2ac8-4c56-8fa0-b153c52309f3" (UID: "f9f2857a-2ac8-4c56-8fa0-b153c52309f3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.736241 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9f2857a-2ac8-4c56-8fa0-b153c52309f3" (UID: "f9f2857a-2ac8-4c56-8fa0-b153c52309f3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.742968 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9f2857a-2ac8-4c56-8fa0-b153c52309f3" (UID: "f9f2857a-2ac8-4c56-8fa0-b153c52309f3"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.748447 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9f2857a-2ac8-4c56-8fa0-b153c52309f3" (UID: "f9f2857a-2ac8-4c56-8fa0-b153c52309f3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.764755 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.764803 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.764816 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.764828 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zkhb\" (UniqueName: \"kubernetes.io/projected/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-kube-api-access-6zkhb\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:03 crc kubenswrapper[4885]: I0308 21:18:03.764837 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9f2857a-2ac8-4c56-8fa0-b153c52309f3-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:04 crc kubenswrapper[4885]: W0308 21:18:04.065043 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb658095_55a6_4c1a_a84b_23ad21d14212.slice/crio-60269b9e0eee1d01c821b8fa3a0bad871dc6d7b6b76bf7ea46d8310ccf88e2ae WatchSource:0}: Error finding container 60269b9e0eee1d01c821b8fa3a0bad871dc6d7b6b76bf7ea46d8310ccf88e2ae: Status 404 returned error can't find the container with id 60269b9e0eee1d01c821b8fa3a0bad871dc6d7b6b76bf7ea46d8310ccf88e2ae Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.067896 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-579b4494b9-nwf4n"] Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.199667 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c5b968869-pr98k" Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.199712 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c5b968869-pr98k" event={"ID":"f9f2857a-2ac8-4c56-8fa0-b153c52309f3","Type":"ContainerDied","Data":"4bf88b828aac67327d42c79896ad6c9847535e8fc25a30b67b13b6ef37494170"} Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.200092 4885 scope.go:117] "RemoveContainer" containerID="4cb0262f2854e6a7701317c1b59e47291390cddca3f70f2c87b579f72182540b" Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.202186 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" event={"ID":"cb658095-55a6-4c1a-a84b-23ad21d14212","Type":"ContainerStarted","Data":"60269b9e0eee1d01c821b8fa3a0bad871dc6d7b6b76bf7ea46d8310ccf88e2ae"} Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.238103 4885 scope.go:117] "RemoveContainer" containerID="b94ccd4d45aaad15e9b13ae53af2356f201fb751b721183d822d6ea73417bf92" Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.238560 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c5b968869-pr98k"] Mar 08 21:18:04 crc kubenswrapper[4885]: I0308 21:18:04.246981 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c5b968869-pr98k"] Mar 08 21:18:05 crc kubenswrapper[4885]: I0308 21:18:05.215572 4885 generic.go:334] "Generic (PLEG): container finished" podID="cb658095-55a6-4c1a-a84b-23ad21d14212" containerID="d901c636d39623e233e3469473700802dfec14237f6497e2b4cab381597ec10f" exitCode=0 Mar 08 21:18:05 crc kubenswrapper[4885]: I0308 21:18:05.215635 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" event={"ID":"cb658095-55a6-4c1a-a84b-23ad21d14212","Type":"ContainerDied","Data":"d901c636d39623e233e3469473700802dfec14237f6497e2b4cab381597ec10f"} Mar 08 21:18:05 crc kubenswrapper[4885]: I0308 21:18:05.220531 4885 generic.go:334] "Generic (PLEG): container finished" podID="1f0d7d35-3a1c-4505-8fa3-190d8ec038ee" containerID="cdd1e2f0b8ab5f05aa992613d4a0e2df88f958ed88a6e2f7da9cadebeb33bfdc" exitCode=0 Mar 08 21:18:05 crc kubenswrapper[4885]: I0308 21:18:05.220617 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550078-gspxv" event={"ID":"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee","Type":"ContainerDied","Data":"cdd1e2f0b8ab5f05aa992613d4a0e2df88f958ed88a6e2f7da9cadebeb33bfdc"} Mar 08 21:18:05 crc kubenswrapper[4885]: I0308 21:18:05.390228 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" path="/var/lib/kubelet/pods/f9f2857a-2ac8-4c56-8fa0-b153c52309f3/volumes" Mar 08 21:18:06 crc kubenswrapper[4885]: I0308 21:18:06.237961 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" event={"ID":"cb658095-55a6-4c1a-a84b-23ad21d14212","Type":"ContainerStarted","Data":"04b0c5b1dd7e2164d5bcbebfe6ef55f4a69fc1d33f227cd891f00ba11679595d"} Mar 08 21:18:06 crc kubenswrapper[4885]: I0308 21:18:06.238272 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:06 crc kubenswrapper[4885]: I0308 21:18:06.275868 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" podStartSLOduration=4.275841652 podStartE2EDuration="4.275841652s" podCreationTimestamp="2026-03-08 
21:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:18:06.262156735 +0000 UTC m=+6387.658210798" watchObservedRunningTime="2026-03-08 21:18:06.275841652 +0000 UTC m=+6387.671895685" Mar 08 21:18:06 crc kubenswrapper[4885]: I0308 21:18:06.691169 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550078-gspxv" Mar 08 21:18:06 crc kubenswrapper[4885]: I0308 21:18:06.730842 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qlcw\" (UniqueName: \"kubernetes.io/projected/1f0d7d35-3a1c-4505-8fa3-190d8ec038ee-kube-api-access-6qlcw\") pod \"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee\" (UID: \"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee\") " Mar 08 21:18:06 crc kubenswrapper[4885]: I0308 21:18:06.737615 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f0d7d35-3a1c-4505-8fa3-190d8ec038ee-kube-api-access-6qlcw" (OuterVolumeSpecName: "kube-api-access-6qlcw") pod "1f0d7d35-3a1c-4505-8fa3-190d8ec038ee" (UID: "1f0d7d35-3a1c-4505-8fa3-190d8ec038ee"). InnerVolumeSpecName "kube-api-access-6qlcw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:18:06 crc kubenswrapper[4885]: I0308 21:18:06.834040 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qlcw\" (UniqueName: \"kubernetes.io/projected/1f0d7d35-3a1c-4505-8fa3-190d8ec038ee-kube-api-access-6qlcw\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:07 crc kubenswrapper[4885]: I0308 21:18:07.253393 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550078-gspxv" Mar 08 21:18:07 crc kubenswrapper[4885]: I0308 21:18:07.254945 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550078-gspxv" event={"ID":"1f0d7d35-3a1c-4505-8fa3-190d8ec038ee","Type":"ContainerDied","Data":"f23b90c06bbc8a98015e798c494ff9c9bd900ab4e2ab5f9f208b0410de047ce4"} Mar 08 21:18:07 crc kubenswrapper[4885]: I0308 21:18:07.255112 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f23b90c06bbc8a98015e798c494ff9c9bd900ab4e2ab5f9f208b0410de047ce4" Mar 08 21:18:07 crc kubenswrapper[4885]: I0308 21:18:07.351941 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550072-7xh8c"] Mar 08 21:18:07 crc kubenswrapper[4885]: I0308 21:18:07.386320 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550072-7xh8c"] Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.663942 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"] Mar 08 21:18:08 crc kubenswrapper[4885]: E0308 21:18:08.664332 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerName="dnsmasq-dns" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.664345 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerName="dnsmasq-dns" Mar 08 21:18:08 crc kubenswrapper[4885]: E0308 21:18:08.664354 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0d7d35-3a1c-4505-8fa3-190d8ec038ee" containerName="oc" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.664360 4885 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="1f0d7d35-3a1c-4505-8fa3-190d8ec038ee" containerName="oc" Mar 08 21:18:08 crc kubenswrapper[4885]: E0308 21:18:08.664401 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerName="init" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.664409 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerName="init" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.664591 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f0d7d35-3a1c-4505-8fa3-190d8ec038ee" containerName="oc" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.664608 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f2857a-2ac8-4c56-8fa0-b153c52309f3" containerName="dnsmasq-dns" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.665272 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.667846 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.667971 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.667887 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.668269 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.673328 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.673420 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.673456 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.673491 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vzp2\" (UniqueName: \"kubernetes.io/projected/87894214-b974-4fc7-b23d-d739fde2466f-kube-api-access-8vzp2\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.673533 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.679101 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"] Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.775392 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.775454 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.775503 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vzp2\" (UniqueName: \"kubernetes.io/projected/87894214-b974-4fc7-b23d-d739fde2466f-kube-api-access-8vzp2\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.775550 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.775690 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.793260 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-pre-adoption-validation-combined-ca-bundle\") pod 
\"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.823376 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ssh-key-openstack-cell1\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.829529 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.836339 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:08 crc kubenswrapper[4885]: I0308 21:18:08.837719 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vzp2\" (UniqueName: \"kubernetes.io/projected/87894214-b974-4fc7-b23d-d739fde2466f-kube-api-access-8vzp2\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:09 crc kubenswrapper[4885]: I0308 21:18:09.007283 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:09 crc kubenswrapper[4885]: I0308 21:18:09.362677 4885 scope.go:117] "RemoveContainer" containerID="b0909c7e110caec9a5e1944f3751a491442563718d6ad89214ea7bdc75d423aa" Mar 08 21:18:09 crc kubenswrapper[4885]: I0308 21:18:09.381635 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4225aab0-53fa-4aa3-ac19-6827ea262916" path="/var/lib/kubelet/pods/4225aab0-53fa-4aa3-ac19-6827ea262916/volumes" Mar 08 21:18:09 crc kubenswrapper[4885]: I0308 21:18:09.544488 4885 scope.go:117] "RemoveContainer" containerID="3931ee1c99d16a879ccfc12229692a2bae1b407bce8d0e34d3d46a2ffcee39dc" Mar 08 21:18:09 crc kubenswrapper[4885]: I0308 21:18:09.549689 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt"] Mar 08 21:18:09 crc kubenswrapper[4885]: W0308 21:18:09.551563 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87894214_b974_4fc7_b23d_d739fde2466f.slice/crio-e4721518e992ab047a24367c6906dad54f186361f584c8cda9abafbecbd7f10a WatchSource:0}: Error finding container e4721518e992ab047a24367c6906dad54f186361f584c8cda9abafbecbd7f10a: Status 404 returned error can't find the container with id e4721518e992ab047a24367c6906dad54f186361f584c8cda9abafbecbd7f10a Mar 08 21:18:09 crc kubenswrapper[4885]: I0308 21:18:09.614080 4885 scope.go:117] "RemoveContainer" containerID="37c658fc25b8a42ab3b33c1713dd08f3921d30fceba25de9d5cd0b6ec8c45fc8" Mar 08 21:18:10 crc kubenswrapper[4885]: I0308 21:18:10.281456 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" event={"ID":"87894214-b974-4fc7-b23d-d739fde2466f","Type":"ContainerStarted","Data":"e4721518e992ab047a24367c6906dad54f186361f584c8cda9abafbecbd7f10a"} Mar 08 21:18:11 crc kubenswrapper[4885]: I0308 21:18:11.369506 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:18:12 crc kubenswrapper[4885]: I0308 21:18:12.306637 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"17d10d0a74a5cbd79193584e373306e9a6f05fd494997dbeda172a5bfdd668a3"} Mar 08 21:18:13 crc kubenswrapper[4885]: I0308 21:18:13.212227 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-579b4494b9-nwf4n" Mar 08 21:18:13 crc kubenswrapper[4885]: I0308 21:18:13.307251 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85b9ffd899-kzvp9"] Mar 08 21:18:13 crc kubenswrapper[4885]: I0308 21:18:13.307805 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" podUID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerName="dnsmasq-dns" containerID="cri-o://8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697" gracePeriod=10 Mar 08 21:18:13 crc kubenswrapper[4885]: I0308 21:18:13.871419 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.007176 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-nb\") pod \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.007221 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-openstack-cell1\") pod \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.007268 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-sb\") pod \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.007308 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd22h\" (UniqueName: \"kubernetes.io/projected/784dc7bd-2dba-4638-881d-7bb4b97fa26f-kube-api-access-jd22h\") pod \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.007469 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-config\") pod \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.007520 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-dns-svc\") pod \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\" (UID: \"784dc7bd-2dba-4638-881d-7bb4b97fa26f\") " Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.016049 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/784dc7bd-2dba-4638-881d-7bb4b97fa26f-kube-api-access-jd22h" (OuterVolumeSpecName: "kube-api-access-jd22h") pod "784dc7bd-2dba-4638-881d-7bb4b97fa26f" (UID: "784dc7bd-2dba-4638-881d-7bb4b97fa26f"). InnerVolumeSpecName "kube-api-access-jd22h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.061181 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "784dc7bd-2dba-4638-881d-7bb4b97fa26f" (UID: "784dc7bd-2dba-4638-881d-7bb4b97fa26f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.069436 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "784dc7bd-2dba-4638-881d-7bb4b97fa26f" (UID: "784dc7bd-2dba-4638-881d-7bb4b97fa26f"). InnerVolumeSpecName "openstack-cell1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.072755 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-config" (OuterVolumeSpecName: "config") pod "784dc7bd-2dba-4638-881d-7bb4b97fa26f" (UID: "784dc7bd-2dba-4638-881d-7bb4b97fa26f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.080028 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "784dc7bd-2dba-4638-881d-7bb4b97fa26f" (UID: "784dc7bd-2dba-4638-881d-7bb4b97fa26f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.083675 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "784dc7bd-2dba-4638-881d-7bb4b97fa26f" (UID: "784dc7bd-2dba-4638-881d-7bb4b97fa26f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.110569 4885 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-config\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.110599 4885 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.110611 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.110622 4885 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.110634 4885 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/784dc7bd-2dba-4638-881d-7bb4b97fa26f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.110645 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jd22h\" (UniqueName: \"kubernetes.io/projected/784dc7bd-2dba-4638-881d-7bb4b97fa26f-kube-api-access-jd22h\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.353623 4885 generic.go:334] "Generic (PLEG): container finished" podID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerID="8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697" exitCode=0 Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.353676 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" event={"ID":"784dc7bd-2dba-4638-881d-7bb4b97fa26f","Type":"ContainerDied","Data":"8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697"} Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 
21:18:14.353708 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" event={"ID":"784dc7bd-2dba-4638-881d-7bb4b97fa26f","Type":"ContainerDied","Data":"f44206b8ebbf0693d2d4ad48b7c945bd77de0d1345fd96143b142bf69f386619"} Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.353727 4885 scope.go:117] "RemoveContainer" containerID="8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.353898 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85b9ffd899-kzvp9" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.410786 4885 scope.go:117] "RemoveContainer" containerID="cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.415698 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85b9ffd899-kzvp9"] Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.441045 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85b9ffd899-kzvp9"] Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.510949 4885 scope.go:117] "RemoveContainer" containerID="8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697" Mar 08 21:18:14 crc kubenswrapper[4885]: E0308 21:18:14.516603 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697\": container with ID starting with 8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697 not found: ID does not exist" containerID="8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.516638 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697"} err="failed to get container status \"8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697\": rpc error: code = NotFound desc = could not find container \"8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697\": container with ID starting with 8677575a9e92c59ae4078ba24b0af8110a25b709f09ea447a0357e1be0164697 not found: ID does not exist" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.516662 4885 scope.go:117] "RemoveContainer" containerID="cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516" Mar 08 21:18:14 crc kubenswrapper[4885]: E0308 21:18:14.517099 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516\": container with ID starting with cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516 not found: ID does not exist" containerID="cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516" Mar 08 21:18:14 crc kubenswrapper[4885]: I0308 21:18:14.517157 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516"} err="failed to get container status \"cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516\": rpc error: code = NotFound desc = could not find container \"cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516\": container with ID starting with 
cee08e4b1632bb9d3546477829279366c732ce99283de996e1d06fd8cfaa1516 not found: ID does not exist" Mar 08 21:18:15 crc kubenswrapper[4885]: I0308 21:18:15.384423 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" path="/var/lib/kubelet/pods/784dc7bd-2dba-4638-881d-7bb4b97fa26f/volumes" Mar 08 21:18:22 crc kubenswrapper[4885]: I0308 21:18:22.460903 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" event={"ID":"87894214-b974-4fc7-b23d-d739fde2466f","Type":"ContainerStarted","Data":"dc207e750510567356d9b178140ae440b16726ea2563a8ea1f73515bbd6991e8"} Mar 08 21:18:22 crc kubenswrapper[4885]: I0308 21:18:22.482679 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" podStartSLOduration=2.555009774 podStartE2EDuration="14.48265927s" podCreationTimestamp="2026-03-08 21:18:08 +0000 UTC" firstStartedPulling="2026-03-08 21:18:09.554524216 +0000 UTC m=+6390.950578249" lastFinishedPulling="2026-03-08 21:18:21.482173682 +0000 UTC m=+6402.878227745" observedRunningTime="2026-03-08 21:18:22.480318237 +0000 UTC m=+6403.876372260" watchObservedRunningTime="2026-03-08 21:18:22.48265927 +0000 UTC m=+6403.878713293" Mar 08 21:18:35 crc kubenswrapper[4885]: I0308 21:18:35.655677 4885 generic.go:334] "Generic (PLEG): container finished" podID="87894214-b974-4fc7-b23d-d739fde2466f" containerID="dc207e750510567356d9b178140ae440b16726ea2563a8ea1f73515bbd6991e8" exitCode=0 Mar 08 21:18:35 crc kubenswrapper[4885]: I0308 21:18:35.655753 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" event={"ID":"87894214-b974-4fc7-b23d-d739fde2466f","Type":"ContainerDied","Data":"dc207e750510567356d9b178140ae440b16726ea2563a8ea1f73515bbd6991e8"} Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.305383 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.420708 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-pre-adoption-validation-combined-ca-bundle\") pod \"87894214-b974-4fc7-b23d-d739fde2466f\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.420787 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ceph\") pod \"87894214-b974-4fc7-b23d-d739fde2466f\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.420824 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vzp2\" (UniqueName: \"kubernetes.io/projected/87894214-b974-4fc7-b23d-d739fde2466f-kube-api-access-8vzp2\") pod \"87894214-b974-4fc7-b23d-d739fde2466f\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.420912 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-inventory\") pod \"87894214-b974-4fc7-b23d-d739fde2466f\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.421278 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ssh-key-openstack-cell1\") pod \"87894214-b974-4fc7-b23d-d739fde2466f\" (UID: \"87894214-b974-4fc7-b23d-d739fde2466f\") " Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.430561 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87894214-b974-4fc7-b23d-d739fde2466f-kube-api-access-8vzp2" (OuterVolumeSpecName: "kube-api-access-8vzp2") pod "87894214-b974-4fc7-b23d-d739fde2466f" (UID: "87894214-b974-4fc7-b23d-d739fde2466f"). InnerVolumeSpecName "kube-api-access-8vzp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.432594 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "87894214-b974-4fc7-b23d-d739fde2466f" (UID: "87894214-b974-4fc7-b23d-d739fde2466f"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.432662 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ceph" (OuterVolumeSpecName: "ceph") pod "87894214-b974-4fc7-b23d-d739fde2466f" (UID: "87894214-b974-4fc7-b23d-d739fde2466f"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.462004 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "87894214-b974-4fc7-b23d-d739fde2466f" (UID: "87894214-b974-4fc7-b23d-d739fde2466f"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.465255 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-inventory" (OuterVolumeSpecName: "inventory") pod "87894214-b974-4fc7-b23d-d739fde2466f" (UID: "87894214-b974-4fc7-b23d-d739fde2466f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.526623 4885 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.526665 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.526683 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vzp2\" (UniqueName: \"kubernetes.io/projected/87894214-b974-4fc7-b23d-d739fde2466f-kube-api-access-8vzp2\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.526736 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.526746 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/87894214-b974-4fc7-b23d-d739fde2466f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.686725 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" event={"ID":"87894214-b974-4fc7-b23d-d739fde2466f","Type":"ContainerDied","Data":"e4721518e992ab047a24367c6906dad54f186361f584c8cda9abafbecbd7f10a"} Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.687385 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4721518e992ab047a24367c6906dad54f186361f584c8cda9abafbecbd7f10a" Mar 08 21:18:37 crc kubenswrapper[4885]: I0308 21:18:37.686799 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.522969 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6"] Mar 08 21:18:46 crc kubenswrapper[4885]: E0308 21:18:46.524912 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87894214-b974-4fc7-b23d-d739fde2466f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.524957 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="87894214-b974-4fc7-b23d-d739fde2466f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 08 21:18:46 crc kubenswrapper[4885]: E0308 21:18:46.524983 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerName="dnsmasq-dns" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.524992 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerName="dnsmasq-dns" Mar 08 21:18:46 crc kubenswrapper[4885]: E0308 21:18:46.525014 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerName="init" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.525022 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerName="init" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.525931 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="784dc7bd-2dba-4638-881d-7bb4b97fa26f" containerName="dnsmasq-dns" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.526179 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="87894214-b974-4fc7-b23d-d739fde2466f" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.528293 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.533684 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.534891 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.535104 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.535338 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.540264 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6"] Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.640406 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.640730 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.640841 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc9lt\" (UniqueName: \"kubernetes.io/projected/8be575f8-a741-4b5a-b7fa-c43e5dd65598-kube-api-access-pc9lt\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.641115 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.641264 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.744609 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.744736 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.744788 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc9lt\" (UniqueName: \"kubernetes.io/projected/8be575f8-a741-4b5a-b7fa-c43e5dd65598-kube-api-access-pc9lt\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.744973 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.745037 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.752726 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.752725 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.752984 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ssh-key-openstack-cell1\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.758704 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.775573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc9lt\" (UniqueName: \"kubernetes.io/projected/8be575f8-a741-4b5a-b7fa-c43e5dd65598-kube-api-access-pc9lt\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:46 crc kubenswrapper[4885]: I0308 21:18:46.854808 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:18:47 crc kubenswrapper[4885]: I0308 21:18:47.494045 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6"] Mar 08 21:18:47 crc kubenswrapper[4885]: I0308 21:18:47.816076 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" event={"ID":"8be575f8-a741-4b5a-b7fa-c43e5dd65598","Type":"ContainerStarted","Data":"fcaf57e77e3aeefacc328413ba9ef9243f80683c97c18bb5d80337ab11bd45a0"} Mar 08 21:18:48 crc kubenswrapper[4885]: I0308 21:18:48.833310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" event={"ID":"8be575f8-a741-4b5a-b7fa-c43e5dd65598","Type":"ContainerStarted","Data":"9a54c0c84a047f8db9c0abfba2cd8a399a25f81595ecb87af623107a04129487"} Mar 08 21:18:48 crc kubenswrapper[4885]: I0308 21:18:48.863436 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" podStartSLOduration=2.42773402 podStartE2EDuration="2.863414468s" podCreationTimestamp="2026-03-08 21:18:46 +0000 UTC" firstStartedPulling="2026-03-08 21:18:47.497364612 +0000 UTC m=+6428.893418645" lastFinishedPulling="2026-03-08 21:18:47.93304503 +0000 UTC m=+6429.329099093" observedRunningTime="2026-03-08 21:18:48.859801531 +0000 UTC m=+6430.255855594" watchObservedRunningTime="2026-03-08 21:18:48.863414468 +0000 UTC m=+6430.259468511" Mar 08 21:19:46 crc kubenswrapper[4885]: I0308 21:19:46.071852 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-l5ssn"] Mar 08 21:19:46 crc kubenswrapper[4885]: I0308 21:19:46.093616 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-l5ssn"] Mar 08 21:19:47 crc kubenswrapper[4885]: I0308 21:19:47.385370 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93292c62-a3f4-439e-98fd-85ff17958f38" path="/var/lib/kubelet/pods/93292c62-a3f4-439e-98fd-85ff17958f38/volumes" Mar 08 21:19:48 crc kubenswrapper[4885]: I0308 21:19:48.044690 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-208d-account-create-update-22tjb"] Mar 08 21:19:48 crc kubenswrapper[4885]: I0308 21:19:48.056780 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-208d-account-create-update-22tjb"] Mar 08 21:19:49 crc kubenswrapper[4885]: I0308 21:19:49.389163 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef43c9be-bb15-4927-b520-fe1b5ea3cabb" 
path="/var/lib/kubelet/pods/ef43c9be-bb15-4927-b520-fe1b5ea3cabb/volumes" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.380822 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qslr6"] Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.383818 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.394513 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qslr6"] Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.553890 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkrlh\" (UniqueName: \"kubernetes.io/projected/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-kube-api-access-dkrlh\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.553989 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-catalog-content\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.554076 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-utilities\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.656061 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkrlh\" (UniqueName: \"kubernetes.io/projected/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-kube-api-access-dkrlh\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.656153 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-catalog-content\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.656216 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-utilities\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.656652 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-catalog-content\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.656742 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-utilities\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.679484 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkrlh\" (UniqueName: \"kubernetes.io/projected/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-kube-api-access-dkrlh\") pod \"redhat-marketplace-qslr6\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:50 crc kubenswrapper[4885]: I0308 21:19:50.715856 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:19:51 crc kubenswrapper[4885]: I0308 21:19:51.182280 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qslr6"] Mar 08 21:19:51 crc kubenswrapper[4885]: I0308 21:19:51.670588 4885 generic.go:334] "Generic (PLEG): container finished" podID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerID="a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164" exitCode=0 Mar 08 21:19:51 crc kubenswrapper[4885]: I0308 21:19:51.670645 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qslr6" event={"ID":"55c2a47a-66e1-4e37-a252-04e2b98eb7bf","Type":"ContainerDied","Data":"a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164"} Mar 08 21:19:51 crc kubenswrapper[4885]: I0308 21:19:51.670668 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qslr6" event={"ID":"55c2a47a-66e1-4e37-a252-04e2b98eb7bf","Type":"ContainerStarted","Data":"02a8a96a2f1f9be2101d93146b1848a38e54aef39da2adc96249ad0085caa00c"} Mar 08 21:19:51 crc kubenswrapper[4885]: I0308 21:19:51.672607 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:19:52 crc kubenswrapper[4885]: I0308 21:19:52.692296 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qslr6" event={"ID":"55c2a47a-66e1-4e37-a252-04e2b98eb7bf","Type":"ContainerStarted","Data":"2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e"} Mar 08 21:19:53 crc kubenswrapper[4885]: I0308 21:19:53.038975 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-kg2st"] Mar 08 21:19:53 crc kubenswrapper[4885]: I0308 21:19:53.048935 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-kg2st"] Mar 08 21:19:53 crc kubenswrapper[4885]: I0308 21:19:53.384802 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4d6415b-d535-426e-a500-cd8e25255bde" path="/var/lib/kubelet/pods/c4d6415b-d535-426e-a500-cd8e25255bde/volumes" Mar 08 21:19:53 crc kubenswrapper[4885]: I0308 21:19:53.706996 4885 generic.go:334] "Generic (PLEG): container finished" podID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerID="2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e" exitCode=0 Mar 08 21:19:53 crc kubenswrapper[4885]: I0308 21:19:53.707086 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qslr6" 
event={"ID":"55c2a47a-66e1-4e37-a252-04e2b98eb7bf","Type":"ContainerDied","Data":"2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e"} Mar 08 21:19:54 crc kubenswrapper[4885]: I0308 21:19:54.075811 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-70a3-account-create-update-kvgcv"] Mar 08 21:19:54 crc kubenswrapper[4885]: I0308 21:19:54.085705 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-70a3-account-create-update-kvgcv"] Mar 08 21:19:54 crc kubenswrapper[4885]: I0308 21:19:54.718908 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qslr6" event={"ID":"55c2a47a-66e1-4e37-a252-04e2b98eb7bf","Type":"ContainerStarted","Data":"da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61"} Mar 08 21:19:54 crc kubenswrapper[4885]: I0308 21:19:54.752423 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qslr6" podStartSLOduration=2.332051724 podStartE2EDuration="4.752398891s" podCreationTimestamp="2026-03-08 21:19:50 +0000 UTC" firstStartedPulling="2026-03-08 21:19:51.672400442 +0000 UTC m=+6493.068454465" lastFinishedPulling="2026-03-08 21:19:54.092747569 +0000 UTC m=+6495.488801632" observedRunningTime="2026-03-08 21:19:54.737734088 +0000 UTC m=+6496.133788101" watchObservedRunningTime="2026-03-08 21:19:54.752398891 +0000 UTC m=+6496.148452944" Mar 08 21:19:55 crc kubenswrapper[4885]: I0308 21:19:55.379678 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddfd3421-88b0-49f2-b94e-fe31c3b5c12f" path="/var/lib/kubelet/pods/ddfd3421-88b0-49f2-b94e-fe31c3b5c12f/volumes" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.160179 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550080-q8z87"] Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.163116 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550080-q8z87" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.166561 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.166867 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.167069 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.172083 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550080-q8z87"] Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.290426 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b62p8\" (UniqueName: \"kubernetes.io/projected/099da518-0e8c-4661-86bf-efcce5fd4f59-kube-api-access-b62p8\") pod \"auto-csr-approver-29550080-q8z87\" (UID: \"099da518-0e8c-4661-86bf-efcce5fd4f59\") " pod="openshift-infra/auto-csr-approver-29550080-q8z87" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.393033 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b62p8\" (UniqueName: \"kubernetes.io/projected/099da518-0e8c-4661-86bf-efcce5fd4f59-kube-api-access-b62p8\") pod \"auto-csr-approver-29550080-q8z87\" (UID: \"099da518-0e8c-4661-86bf-efcce5fd4f59\") " pod="openshift-infra/auto-csr-approver-29550080-q8z87" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.428667 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b62p8\" (UniqueName: \"kubernetes.io/projected/099da518-0e8c-4661-86bf-efcce5fd4f59-kube-api-access-b62p8\") pod \"auto-csr-approver-29550080-q8z87\" (UID: \"099da518-0e8c-4661-86bf-efcce5fd4f59\") " pod="openshift-infra/auto-csr-approver-29550080-q8z87" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.492513 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550080-q8z87" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.717221 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.717370 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.788561 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:20:00 crc kubenswrapper[4885]: I0308 21:20:00.847436 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:20:01 crc kubenswrapper[4885]: I0308 21:20:01.034234 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qslr6"] Mar 08 21:20:01 crc kubenswrapper[4885]: I0308 21:20:01.054739 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550080-q8z87"] Mar 08 21:20:01 crc kubenswrapper[4885]: I0308 21:20:01.808362 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550080-q8z87" event={"ID":"099da518-0e8c-4661-86bf-efcce5fd4f59","Type":"ContainerStarted","Data":"bd5550eb8bb8b9907536a5623038cd9d8f560a22bc77082293d5ea76e71eee9e"} Mar 08 21:20:02 crc kubenswrapper[4885]: I0308 21:20:02.817303 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qslr6" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="registry-server" containerID="cri-o://da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61" gracePeriod=2 Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.423164 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.579583 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-utilities\") pod \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.579746 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkrlh\" (UniqueName: \"kubernetes.io/projected/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-kube-api-access-dkrlh\") pod \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.579945 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-catalog-content\") pod \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\" (UID: \"55c2a47a-66e1-4e37-a252-04e2b98eb7bf\") " Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.580941 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-utilities" (OuterVolumeSpecName: "utilities") pod "55c2a47a-66e1-4e37-a252-04e2b98eb7bf" (UID: "55c2a47a-66e1-4e37-a252-04e2b98eb7bf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.590122 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-kube-api-access-dkrlh" (OuterVolumeSpecName: "kube-api-access-dkrlh") pod "55c2a47a-66e1-4e37-a252-04e2b98eb7bf" (UID: "55c2a47a-66e1-4e37-a252-04e2b98eb7bf"). InnerVolumeSpecName "kube-api-access-dkrlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.606178 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55c2a47a-66e1-4e37-a252-04e2b98eb7bf" (UID: "55c2a47a-66e1-4e37-a252-04e2b98eb7bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.683470 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.683706 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.683718 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkrlh\" (UniqueName: \"kubernetes.io/projected/55c2a47a-66e1-4e37-a252-04e2b98eb7bf-kube-api-access-dkrlh\") on node \"crc\" DevicePath \"\"" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.827246 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550080-q8z87" event={"ID":"099da518-0e8c-4661-86bf-efcce5fd4f59","Type":"ContainerStarted","Data":"34b50f5f966e37811bb8a32ad3d6e1abb40b701290a57e1661e950c1bc924933"} Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.830858 4885 generic.go:334] "Generic (PLEG): container finished" podID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerID="da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61" exitCode=0 Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.830891 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qslr6" event={"ID":"55c2a47a-66e1-4e37-a252-04e2b98eb7bf","Type":"ContainerDied","Data":"da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61"} Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.830947 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qslr6" event={"ID":"55c2a47a-66e1-4e37-a252-04e2b98eb7bf","Type":"ContainerDied","Data":"02a8a96a2f1f9be2101d93146b1848a38e54aef39da2adc96249ad0085caa00c"} Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.830966 4885 scope.go:117] "RemoveContainer" containerID="da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.830986 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qslr6" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.852848 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550080-q8z87" podStartSLOduration=2.494814973 podStartE2EDuration="3.852818784s" podCreationTimestamp="2026-03-08 21:20:00 +0000 UTC" firstStartedPulling="2026-03-08 21:20:01.058462292 +0000 UTC m=+6502.454516325" lastFinishedPulling="2026-03-08 21:20:02.416466073 +0000 UTC m=+6503.812520136" observedRunningTime="2026-03-08 21:20:03.849575897 +0000 UTC m=+6505.245629930" watchObservedRunningTime="2026-03-08 21:20:03.852818784 +0000 UTC m=+6505.248872807" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.908609 4885 scope.go:117] "RemoveContainer" containerID="2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e" Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.912823 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qslr6"] Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.922999 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qslr6"] Mar 08 21:20:03 crc kubenswrapper[4885]: I0308 21:20:03.934751 4885 scope.go:117] "RemoveContainer" containerID="a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164" Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.001657 4885 scope.go:117] "RemoveContainer" containerID="da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61" Mar 08 21:20:04 crc kubenswrapper[4885]: E0308 21:20:04.002220 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61\": container with ID starting with da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61 not found: ID does not exist" containerID="da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61" Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.002303 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61"} err="failed to get container status \"da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61\": rpc error: code = NotFound desc = could not find container \"da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61\": container with ID starting with da21247064e8adfd8467340c132e8ed6c8453aa6ca5688e1633b73e642c6af61 not found: ID does not exist" Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.002365 4885 scope.go:117] "RemoveContainer" containerID="2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e" Mar 08 21:20:04 crc kubenswrapper[4885]: E0308 21:20:04.002876 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e\": container with ID starting with 2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e not found: ID does not exist" containerID="2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e" Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.002903 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e"} err="failed to get 
container status \"2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e\": rpc error: code = NotFound desc = could not find container \"2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e\": container with ID starting with 2562f06bc617cf8d97846e28436de15bbb082e8d939f4ae777c0ad2654e3172e not found: ID does not exist" Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.002984 4885 scope.go:117] "RemoveContainer" containerID="a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164" Mar 08 21:20:04 crc kubenswrapper[4885]: E0308 21:20:04.003324 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164\": container with ID starting with a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164 not found: ID does not exist" containerID="a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164" Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.003357 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164"} err="failed to get container status \"a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164\": rpc error: code = NotFound desc = could not find container \"a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164\": container with ID starting with a2b0c59522472f7bf7b64e38c3244e73a836d97d6dad4fb4ea6d4f5301b67164 not found: ID does not exist" Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.849432 4885 generic.go:334] "Generic (PLEG): container finished" podID="099da518-0e8c-4661-86bf-efcce5fd4f59" containerID="34b50f5f966e37811bb8a32ad3d6e1abb40b701290a57e1661e950c1bc924933" exitCode=0 Mar 08 21:20:04 crc kubenswrapper[4885]: I0308 21:20:04.849512 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550080-q8z87" event={"ID":"099da518-0e8c-4661-86bf-efcce5fd4f59","Type":"ContainerDied","Data":"34b50f5f966e37811bb8a32ad3d6e1abb40b701290a57e1661e950c1bc924933"} Mar 08 21:20:05 crc kubenswrapper[4885]: I0308 21:20:05.389527 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" path="/var/lib/kubelet/pods/55c2a47a-66e1-4e37-a252-04e2b98eb7bf/volumes" Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.256451 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550080-q8z87" Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.448509 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b62p8\" (UniqueName: \"kubernetes.io/projected/099da518-0e8c-4661-86bf-efcce5fd4f59-kube-api-access-b62p8\") pod \"099da518-0e8c-4661-86bf-efcce5fd4f59\" (UID: \"099da518-0e8c-4661-86bf-efcce5fd4f59\") " Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.458027 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/099da518-0e8c-4661-86bf-efcce5fd4f59-kube-api-access-b62p8" (OuterVolumeSpecName: "kube-api-access-b62p8") pod "099da518-0e8c-4661-86bf-efcce5fd4f59" (UID: "099da518-0e8c-4661-86bf-efcce5fd4f59"). InnerVolumeSpecName "kube-api-access-b62p8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.551941 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b62p8\" (UniqueName: \"kubernetes.io/projected/099da518-0e8c-4661-86bf-efcce5fd4f59-kube-api-access-b62p8\") on node \"crc\" DevicePath \"\"" Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.887429 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550080-q8z87" event={"ID":"099da518-0e8c-4661-86bf-efcce5fd4f59","Type":"ContainerDied","Data":"bd5550eb8bb8b9907536a5623038cd9d8f560a22bc77082293d5ea76e71eee9e"} Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.887752 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd5550eb8bb8b9907536a5623038cd9d8f560a22bc77082293d5ea76e71eee9e" Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.887820 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550080-q8z87" Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.951723 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550074-lqgkm"] Mar 08 21:20:06 crc kubenswrapper[4885]: I0308 21:20:06.964776 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550074-lqgkm"] Mar 08 21:20:07 crc kubenswrapper[4885]: I0308 21:20:07.393174 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a0dd2e1-2283-49bf-b5d1-deb889245d93" path="/var/lib/kubelet/pods/3a0dd2e1-2283-49bf-b5d1-deb889245d93/volumes" Mar 08 21:20:09 crc kubenswrapper[4885]: I0308 21:20:09.927639 4885 scope.go:117] "RemoveContainer" containerID="4f877a81b26ff73a856887aee5ba6b11b65e1a2c9a19d4946db0e527c8b1dfc9" Mar 08 21:20:09 crc kubenswrapper[4885]: I0308 21:20:09.962501 4885 scope.go:117] "RemoveContainer" containerID="52641a15b7d3eed6bc15113db82cacc9c6bd5304460efbff7b7427e47ea8d579" Mar 08 21:20:10 crc kubenswrapper[4885]: I0308 21:20:10.023266 4885 scope.go:117] "RemoveContainer" containerID="10f01dfd93c84f82b0e33850f2cd43179983bc59ab2fd73179f62505bcc743de" Mar 08 21:20:10 crc kubenswrapper[4885]: I0308 21:20:10.066809 4885 scope.go:117] "RemoveContainer" containerID="787b1783aeddf609e5e191b59369719fad9abea1b0e98367db13c4196466f2fe" Mar 08 21:20:10 crc kubenswrapper[4885]: I0308 21:20:10.119847 4885 scope.go:117] "RemoveContainer" containerID="49b008c658a6440bfe62e05cf707f768838f42546076db4ab0906a7cc3f15598" Mar 08 21:20:24 crc kubenswrapper[4885]: I0308 21:20:24.052149 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-bvj5k"] Mar 08 21:20:24 crc kubenswrapper[4885]: I0308 21:20:24.063450 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-bvj5k"] Mar 08 21:20:25 crc kubenswrapper[4885]: I0308 21:20:25.387491 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020ad790-8a8c-4e05-b3da-d6b823bb37e2" path="/var/lib/kubelet/pods/020ad790-8a8c-4e05-b3da-d6b823bb37e2/volumes" Mar 08 21:20:32 crc kubenswrapper[4885]: I0308 21:20:32.818087 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:20:32 crc kubenswrapper[4885]: I0308 21:20:32.818614 
4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.086715 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-krhv4"] Mar 08 21:20:41 crc kubenswrapper[4885]: E0308 21:20:41.088409 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="extract-utilities" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.088442 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="extract-utilities" Mar 08 21:20:41 crc kubenswrapper[4885]: E0308 21:20:41.088500 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="registry-server" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.088519 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="registry-server" Mar 08 21:20:41 crc kubenswrapper[4885]: E0308 21:20:41.088553 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="extract-content" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.088571 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="extract-content" Mar 08 21:20:41 crc kubenswrapper[4885]: E0308 21:20:41.088615 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="099da518-0e8c-4661-86bf-efcce5fd4f59" containerName="oc" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.088633 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="099da518-0e8c-4661-86bf-efcce5fd4f59" containerName="oc" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.089275 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="099da518-0e8c-4661-86bf-efcce5fd4f59" containerName="oc" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.089308 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c2a47a-66e1-4e37-a252-04e2b98eb7bf" containerName="registry-server" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.093182 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.109404 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-krhv4"] Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.198358 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-catalog-content\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.198543 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9lh8\" (UniqueName: \"kubernetes.io/projected/86b928a5-44bc-4427-93bf-03426bea7ef0-kube-api-access-t9lh8\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.198707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-utilities\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.301434 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9lh8\" (UniqueName: \"kubernetes.io/projected/86b928a5-44bc-4427-93bf-03426bea7ef0-kube-api-access-t9lh8\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.301572 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-utilities\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.301779 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-catalog-content\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.302047 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-utilities\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.302283 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-catalog-content\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.328155 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-t9lh8\" (UniqueName: \"kubernetes.io/projected/86b928a5-44bc-4427-93bf-03426bea7ef0-kube-api-access-t9lh8\") pod \"certified-operators-krhv4\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.423013 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:20:41 crc kubenswrapper[4885]: I0308 21:20:41.957337 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-krhv4"] Mar 08 21:20:42 crc kubenswrapper[4885]: I0308 21:20:42.310533 4885 generic.go:334] "Generic (PLEG): container finished" podID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerID="a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a" exitCode=0 Mar 08 21:20:42 crc kubenswrapper[4885]: I0308 21:20:42.310905 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krhv4" event={"ID":"86b928a5-44bc-4427-93bf-03426bea7ef0","Type":"ContainerDied","Data":"a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a"} Mar 08 21:20:42 crc kubenswrapper[4885]: I0308 21:20:42.310985 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krhv4" event={"ID":"86b928a5-44bc-4427-93bf-03426bea7ef0","Type":"ContainerStarted","Data":"3d026d2edcc32f59d1fb64048cf1fa1e8e9381630009926a8e768bbf9b68b079"} Mar 08 21:20:44 crc kubenswrapper[4885]: I0308 21:20:44.331610 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krhv4" event={"ID":"86b928a5-44bc-4427-93bf-03426bea7ef0","Type":"ContainerStarted","Data":"b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc"} Mar 08 21:20:50 crc kubenswrapper[4885]: I0308 21:20:50.411222 4885 generic.go:334] "Generic (PLEG): container finished" podID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerID="b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc" exitCode=0 Mar 08 21:20:50 crc kubenswrapper[4885]: I0308 21:20:50.411416 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krhv4" event={"ID":"86b928a5-44bc-4427-93bf-03426bea7ef0","Type":"ContainerDied","Data":"b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc"} Mar 08 21:20:51 crc kubenswrapper[4885]: I0308 21:20:51.424117 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krhv4" event={"ID":"86b928a5-44bc-4427-93bf-03426bea7ef0","Type":"ContainerStarted","Data":"cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1"} Mar 08 21:20:51 crc kubenswrapper[4885]: I0308 21:20:51.445762 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-krhv4" podStartSLOduration=1.9560979330000001 podStartE2EDuration="10.445732456s" podCreationTimestamp="2026-03-08 21:20:41 +0000 UTC" firstStartedPulling="2026-03-08 21:20:42.312720018 +0000 UTC m=+6543.708774051" lastFinishedPulling="2026-03-08 21:20:50.802354551 +0000 UTC m=+6552.198408574" observedRunningTime="2026-03-08 21:20:51.444984726 +0000 UTC m=+6552.841038759" watchObservedRunningTime="2026-03-08 21:20:51.445732456 +0000 UTC m=+6552.841786529" Mar 08 21:21:01 crc kubenswrapper[4885]: I0308 21:21:01.423452 4885 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:21:01 crc kubenswrapper[4885]: I0308 21:21:01.424163 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:21:01 crc kubenswrapper[4885]: I0308 21:21:01.498054 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:21:01 crc kubenswrapper[4885]: I0308 21:21:01.614056 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:21:01 crc kubenswrapper[4885]: I0308 21:21:01.760374 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-krhv4"] Mar 08 21:21:02 crc kubenswrapper[4885]: I0308 21:21:02.819296 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:21:02 crc kubenswrapper[4885]: I0308 21:21:02.820145 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:21:03 crc kubenswrapper[4885]: I0308 21:21:03.573660 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-krhv4" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="registry-server" containerID="cri-o://cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1" gracePeriod=2 Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.170806 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.252022 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-utilities\") pod \"86b928a5-44bc-4427-93bf-03426bea7ef0\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.252150 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-catalog-content\") pod \"86b928a5-44bc-4427-93bf-03426bea7ef0\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.252385 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9lh8\" (UniqueName: \"kubernetes.io/projected/86b928a5-44bc-4427-93bf-03426bea7ef0-kube-api-access-t9lh8\") pod \"86b928a5-44bc-4427-93bf-03426bea7ef0\" (UID: \"86b928a5-44bc-4427-93bf-03426bea7ef0\") " Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.254729 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-utilities" (OuterVolumeSpecName: "utilities") pod "86b928a5-44bc-4427-93bf-03426bea7ef0" (UID: "86b928a5-44bc-4427-93bf-03426bea7ef0"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.268111 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86b928a5-44bc-4427-93bf-03426bea7ef0-kube-api-access-t9lh8" (OuterVolumeSpecName: "kube-api-access-t9lh8") pod "86b928a5-44bc-4427-93bf-03426bea7ef0" (UID: "86b928a5-44bc-4427-93bf-03426bea7ef0"). InnerVolumeSpecName "kube-api-access-t9lh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.343546 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86b928a5-44bc-4427-93bf-03426bea7ef0" (UID: "86b928a5-44bc-4427-93bf-03426bea7ef0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.354295 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9lh8\" (UniqueName: \"kubernetes.io/projected/86b928a5-44bc-4427-93bf-03426bea7ef0-kube-api-access-t9lh8\") on node \"crc\" DevicePath \"\"" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.354335 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.354344 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b928a5-44bc-4427-93bf-03426bea7ef0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.593246 4885 generic.go:334] "Generic (PLEG): container finished" podID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerID="cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1" exitCode=0 Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.593306 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krhv4" event={"ID":"86b928a5-44bc-4427-93bf-03426bea7ef0","Type":"ContainerDied","Data":"cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1"} Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.593347 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krhv4" event={"ID":"86b928a5-44bc-4427-93bf-03426bea7ef0","Type":"ContainerDied","Data":"3d026d2edcc32f59d1fb64048cf1fa1e8e9381630009926a8e768bbf9b68b079"} Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.593377 4885 scope.go:117] "RemoveContainer" containerID="cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.593569 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-krhv4" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.627999 4885 scope.go:117] "RemoveContainer" containerID="b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.637327 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-krhv4"] Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.663700 4885 scope.go:117] "RemoveContainer" containerID="a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.664724 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-krhv4"] Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.708302 4885 scope.go:117] "RemoveContainer" containerID="cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1" Mar 08 21:21:04 crc kubenswrapper[4885]: E0308 21:21:04.708935 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1\": container with ID starting with cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1 not found: ID does not exist" containerID="cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.709005 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1"} err="failed to get container status \"cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1\": rpc error: code = NotFound desc = could not find container \"cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1\": container with ID starting with cd88ef835eec1796e3ba66c71db443dfb57638e156626b03f2ef87ee633ca9a1 not found: ID does not exist" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.709036 4885 scope.go:117] "RemoveContainer" containerID="b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc" Mar 08 21:21:04 crc kubenswrapper[4885]: E0308 21:21:04.709485 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc\": container with ID starting with b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc not found: ID does not exist" containerID="b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.709545 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc"} err="failed to get container status \"b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc\": rpc error: code = NotFound desc = could not find container \"b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc\": container with ID starting with b1d6502ca11e08fa2500f4756232a0aac2e9f266e0cb87d167ff32b63d24c4cc not found: ID does not exist" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.709594 4885 scope.go:117] "RemoveContainer" containerID="a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a" Mar 08 21:21:04 crc kubenswrapper[4885]: E0308 21:21:04.709987 4885 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a\": container with ID starting with a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a not found: ID does not exist" containerID="a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a" Mar 08 21:21:04 crc kubenswrapper[4885]: I0308 21:21:04.710024 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a"} err="failed to get container status \"a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a\": rpc error: code = NotFound desc = could not find container \"a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a\": container with ID starting with a3afda6992362e908514f5ca8e33f9832e8388327621327d7ffdcad4876d7e7a not found: ID does not exist" Mar 08 21:21:05 crc kubenswrapper[4885]: I0308 21:21:05.387044 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" path="/var/lib/kubelet/pods/86b928a5-44bc-4427-93bf-03426bea7ef0/volumes" Mar 08 21:21:10 crc kubenswrapper[4885]: I0308 21:21:10.370758 4885 scope.go:117] "RemoveContainer" containerID="3c54259ed2f9cad06d6b82cb394e43cf58f48cf3c6a30620f67aa6eb4637dc84" Mar 08 21:21:10 crc kubenswrapper[4885]: I0308 21:21:10.418494 4885 scope.go:117] "RemoveContainer" containerID="952b1007a07234fed2a2f1ecec5204600c8240958b93368c30ef4f62fcb4517a" Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.818155 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.818976 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.819052 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.820173 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"17d10d0a74a5cbd79193584e373306e9a6f05fd494997dbeda172a5bfdd668a3"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.820280 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://17d10d0a74a5cbd79193584e373306e9a6f05fd494997dbeda172a5bfdd668a3" gracePeriod=600 Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.955193 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" 
containerID="17d10d0a74a5cbd79193584e373306e9a6f05fd494997dbeda172a5bfdd668a3" exitCode=0 Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.955235 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"17d10d0a74a5cbd79193584e373306e9a6f05fd494997dbeda172a5bfdd668a3"} Mar 08 21:21:32 crc kubenswrapper[4885]: I0308 21:21:32.955265 4885 scope.go:117] "RemoveContainer" containerID="683dc9558b6a41c0fa2136e807e5879bd87f150422b9bda5818d93361d0daff9" Mar 08 21:21:33 crc kubenswrapper[4885]: I0308 21:21:33.967513 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937"} Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.154207 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550082-r8zzq"] Mar 08 21:22:00 crc kubenswrapper[4885]: E0308 21:22:00.155505 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="registry-server" Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.155531 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="registry-server" Mar 08 21:22:00 crc kubenswrapper[4885]: E0308 21:22:00.155566 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="extract-utilities" Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.155575 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="extract-utilities" Mar 08 21:22:00 crc kubenswrapper[4885]: E0308 21:22:00.155594 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="extract-content" Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.155602 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="extract-content" Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.155903 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b928a5-44bc-4427-93bf-03426bea7ef0" containerName="registry-server" Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.156844 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550082-r8zzq" Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.159789 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.159893 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.159993 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.169536 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550082-r8zzq"] Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.208808 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbt2j\" (UniqueName: \"kubernetes.io/projected/b683b024-7ab4-40e6-9380-ad5f3c4c9751-kube-api-access-sbt2j\") pod \"auto-csr-approver-29550082-r8zzq\" (UID: \"b683b024-7ab4-40e6-9380-ad5f3c4c9751\") " pod="openshift-infra/auto-csr-approver-29550082-r8zzq" Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.311349 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbt2j\" (UniqueName: \"kubernetes.io/projected/b683b024-7ab4-40e6-9380-ad5f3c4c9751-kube-api-access-sbt2j\") pod \"auto-csr-approver-29550082-r8zzq\" (UID: \"b683b024-7ab4-40e6-9380-ad5f3c4c9751\") " pod="openshift-infra/auto-csr-approver-29550082-r8zzq" Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.336093 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbt2j\" (UniqueName: \"kubernetes.io/projected/b683b024-7ab4-40e6-9380-ad5f3c4c9751-kube-api-access-sbt2j\") pod \"auto-csr-approver-29550082-r8zzq\" (UID: \"b683b024-7ab4-40e6-9380-ad5f3c4c9751\") " pod="openshift-infra/auto-csr-approver-29550082-r8zzq" Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.478835 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550082-r8zzq" Mar 08 21:22:00 crc kubenswrapper[4885]: I0308 21:22:00.986712 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550082-r8zzq"] Mar 08 21:22:01 crc kubenswrapper[4885]: I0308 21:22:01.393117 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550082-r8zzq" event={"ID":"b683b024-7ab4-40e6-9380-ad5f3c4c9751","Type":"ContainerStarted","Data":"ae0e4390b6ca6abba67b9c50a489defce6283624f6c478fb25f1e982d66f647a"} Mar 08 21:22:02 crc kubenswrapper[4885]: I0308 21:22:02.405267 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550082-r8zzq" event={"ID":"b683b024-7ab4-40e6-9380-ad5f3c4c9751","Type":"ContainerStarted","Data":"d8208765f1f2335c1dc540c6cbadcec4aefebc9eff84e251842efe2b691b630b"} Mar 08 21:22:02 crc kubenswrapper[4885]: I0308 21:22:02.426162 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550082-r8zzq" podStartSLOduration=1.388172011 podStartE2EDuration="2.426139732s" podCreationTimestamp="2026-03-08 21:22:00 +0000 UTC" firstStartedPulling="2026-03-08 21:22:00.987550252 +0000 UTC m=+6622.383604305" lastFinishedPulling="2026-03-08 21:22:02.025517983 +0000 UTC m=+6623.421572026" observedRunningTime="2026-03-08 21:22:02.420564272 +0000 UTC m=+6623.816618305" watchObservedRunningTime="2026-03-08 21:22:02.426139732 +0000 UTC m=+6623.822193775" Mar 08 21:22:03 crc kubenswrapper[4885]: I0308 21:22:03.420578 4885 generic.go:334] "Generic (PLEG): container finished" podID="b683b024-7ab4-40e6-9380-ad5f3c4c9751" containerID="d8208765f1f2335c1dc540c6cbadcec4aefebc9eff84e251842efe2b691b630b" exitCode=0 Mar 08 21:22:03 crc kubenswrapper[4885]: I0308 21:22:03.421150 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550082-r8zzq" event={"ID":"b683b024-7ab4-40e6-9380-ad5f3c4c9751","Type":"ContainerDied","Data":"d8208765f1f2335c1dc540c6cbadcec4aefebc9eff84e251842efe2b691b630b"} Mar 08 21:22:04 crc kubenswrapper[4885]: I0308 21:22:04.915761 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550082-r8zzq" Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.029854 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbt2j\" (UniqueName: \"kubernetes.io/projected/b683b024-7ab4-40e6-9380-ad5f3c4c9751-kube-api-access-sbt2j\") pod \"b683b024-7ab4-40e6-9380-ad5f3c4c9751\" (UID: \"b683b024-7ab4-40e6-9380-ad5f3c4c9751\") " Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.034908 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b683b024-7ab4-40e6-9380-ad5f3c4c9751-kube-api-access-sbt2j" (OuterVolumeSpecName: "kube-api-access-sbt2j") pod "b683b024-7ab4-40e6-9380-ad5f3c4c9751" (UID: "b683b024-7ab4-40e6-9380-ad5f3c4c9751"). InnerVolumeSpecName "kube-api-access-sbt2j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.133113 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbt2j\" (UniqueName: \"kubernetes.io/projected/b683b024-7ab4-40e6-9380-ad5f3c4c9751-kube-api-access-sbt2j\") on node \"crc\" DevicePath \"\"" Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.450746 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550082-r8zzq" event={"ID":"b683b024-7ab4-40e6-9380-ad5f3c4c9751","Type":"ContainerDied","Data":"ae0e4390b6ca6abba67b9c50a489defce6283624f6c478fb25f1e982d66f647a"} Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.450808 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae0e4390b6ca6abba67b9c50a489defce6283624f6c478fb25f1e982d66f647a" Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.450873 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550082-r8zzq" Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.526272 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550076-qtxj8"] Mar 08 21:22:05 crc kubenswrapper[4885]: I0308 21:22:05.540297 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550076-qtxj8"] Mar 08 21:22:07 crc kubenswrapper[4885]: I0308 21:22:07.381005 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6963ac4b-0b7b-489f-a98a-7bad7270d510" path="/var/lib/kubelet/pods/6963ac4b-0b7b-489f-a98a-7bad7270d510/volumes" Mar 08 21:22:10 crc kubenswrapper[4885]: I0308 21:22:10.493964 4885 scope.go:117] "RemoveContainer" containerID="e527652c3f32f5179c847a20bd0a6dafb8df7997ca784a705d3328979c68ce90" Mar 08 21:23:07 crc kubenswrapper[4885]: I0308 21:23:07.051446 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-rstwd"] Mar 08 21:23:07 crc kubenswrapper[4885]: I0308 21:23:07.066028 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-rstwd"] Mar 08 21:23:07 crc kubenswrapper[4885]: I0308 21:23:07.383402 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b956841b-a9a1-4c38-99e9-05c6e5f9f363" path="/var/lib/kubelet/pods/b956841b-a9a1-4c38-99e9-05c6e5f9f363/volumes" Mar 08 21:23:08 crc kubenswrapper[4885]: I0308 21:23:08.064284 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-6865-account-create-update-5p2d8"] Mar 08 21:23:08 crc kubenswrapper[4885]: I0308 21:23:08.076787 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-6865-account-create-update-5p2d8"] Mar 08 21:23:09 crc kubenswrapper[4885]: I0308 21:23:09.380571 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49449c65-7a7c-437f-b4d9-23b2a219485f" path="/var/lib/kubelet/pods/49449c65-7a7c-437f-b4d9-23b2a219485f/volumes" Mar 08 21:23:10 crc kubenswrapper[4885]: I0308 21:23:10.634936 4885 scope.go:117] "RemoveContainer" containerID="d70a935397b593663bdf26afd9e76f5f57ebfae75c4ef218c57dc585c1689e21" Mar 08 21:23:10 crc kubenswrapper[4885]: I0308 21:23:10.658116 4885 scope.go:117] "RemoveContainer" containerID="9a171ca46b7a7190a2333a95503ee285e088cd83dd56cbc13cb8f2021b946782" Mar 08 21:23:23 crc kubenswrapper[4885]: I0308 21:23:23.058630 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-fs9dx"] Mar 08 21:23:23 crc 
kubenswrapper[4885]: I0308 21:23:23.067805 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-fs9dx"] Mar 08 21:23:23 crc kubenswrapper[4885]: I0308 21:23:23.385101 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="495d39cf-6a4d-4ca0-90b6-9a22323d1568" path="/var/lib/kubelet/pods/495d39cf-6a4d-4ca0-90b6-9a22323d1568/volumes" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.156850 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550084-9r2m6"] Mar 08 21:24:00 crc kubenswrapper[4885]: E0308 21:24:00.158438 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b683b024-7ab4-40e6-9380-ad5f3c4c9751" containerName="oc" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.158464 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b683b024-7ab4-40e6-9380-ad5f3c4c9751" containerName="oc" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.158837 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b683b024-7ab4-40e6-9380-ad5f3c4c9751" containerName="oc" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.160690 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550084-9r2m6" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.165370 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.165775 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.166047 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.178978 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550084-9r2m6"] Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.286288 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjzxp\" (UniqueName: \"kubernetes.io/projected/3d2e43f9-d4b1-4059-b714-26745e0d96ce-kube-api-access-sjzxp\") pod \"auto-csr-approver-29550084-9r2m6\" (UID: \"3d2e43f9-d4b1-4059-b714-26745e0d96ce\") " pod="openshift-infra/auto-csr-approver-29550084-9r2m6" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.388818 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjzxp\" (UniqueName: \"kubernetes.io/projected/3d2e43f9-d4b1-4059-b714-26745e0d96ce-kube-api-access-sjzxp\") pod \"auto-csr-approver-29550084-9r2m6\" (UID: \"3d2e43f9-d4b1-4059-b714-26745e0d96ce\") " pod="openshift-infra/auto-csr-approver-29550084-9r2m6" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.412339 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjzxp\" (UniqueName: \"kubernetes.io/projected/3d2e43f9-d4b1-4059-b714-26745e0d96ce-kube-api-access-sjzxp\") pod \"auto-csr-approver-29550084-9r2m6\" (UID: \"3d2e43f9-d4b1-4059-b714-26745e0d96ce\") " pod="openshift-infra/auto-csr-approver-29550084-9r2m6" Mar 08 21:24:00 crc kubenswrapper[4885]: I0308 21:24:00.485233 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550084-9r2m6" Mar 08 21:24:01 crc kubenswrapper[4885]: I0308 21:24:01.051224 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550084-9r2m6"] Mar 08 21:24:01 crc kubenswrapper[4885]: I0308 21:24:01.229795 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550084-9r2m6" event={"ID":"3d2e43f9-d4b1-4059-b714-26745e0d96ce","Type":"ContainerStarted","Data":"b56a346f23a5ff393f482bfe2e6c0d17c5b4dcf44148e8c127f56167e792f65a"} Mar 08 21:24:02 crc kubenswrapper[4885]: I0308 21:24:02.818410 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:24:02 crc kubenswrapper[4885]: I0308 21:24:02.819062 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:24:03 crc kubenswrapper[4885]: I0308 21:24:03.257825 4885 generic.go:334] "Generic (PLEG): container finished" podID="3d2e43f9-d4b1-4059-b714-26745e0d96ce" containerID="b81c31bbcbb29c0b9da44d4b1b46d19caf5f3201ab351fd09c58a29971e19359" exitCode=0 Mar 08 21:24:03 crc kubenswrapper[4885]: I0308 21:24:03.257888 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550084-9r2m6" event={"ID":"3d2e43f9-d4b1-4059-b714-26745e0d96ce","Type":"ContainerDied","Data":"b81c31bbcbb29c0b9da44d4b1b46d19caf5f3201ab351fd09c58a29971e19359"} Mar 08 21:24:04 crc kubenswrapper[4885]: I0308 21:24:04.769621 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550084-9r2m6" Mar 08 21:24:04 crc kubenswrapper[4885]: I0308 21:24:04.898877 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjzxp\" (UniqueName: \"kubernetes.io/projected/3d2e43f9-d4b1-4059-b714-26745e0d96ce-kube-api-access-sjzxp\") pod \"3d2e43f9-d4b1-4059-b714-26745e0d96ce\" (UID: \"3d2e43f9-d4b1-4059-b714-26745e0d96ce\") " Mar 08 21:24:04 crc kubenswrapper[4885]: I0308 21:24:04.910099 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d2e43f9-d4b1-4059-b714-26745e0d96ce-kube-api-access-sjzxp" (OuterVolumeSpecName: "kube-api-access-sjzxp") pod "3d2e43f9-d4b1-4059-b714-26745e0d96ce" (UID: "3d2e43f9-d4b1-4059-b714-26745e0d96ce"). InnerVolumeSpecName "kube-api-access-sjzxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:24:05 crc kubenswrapper[4885]: I0308 21:24:05.002246 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjzxp\" (UniqueName: \"kubernetes.io/projected/3d2e43f9-d4b1-4059-b714-26745e0d96ce-kube-api-access-sjzxp\") on node \"crc\" DevicePath \"\"" Mar 08 21:24:05 crc kubenswrapper[4885]: I0308 21:24:05.285186 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550084-9r2m6" event={"ID":"3d2e43f9-d4b1-4059-b714-26745e0d96ce","Type":"ContainerDied","Data":"b56a346f23a5ff393f482bfe2e6c0d17c5b4dcf44148e8c127f56167e792f65a"} Mar 08 21:24:05 crc kubenswrapper[4885]: I0308 21:24:05.286253 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b56a346f23a5ff393f482bfe2e6c0d17c5b4dcf44148e8c127f56167e792f65a" Mar 08 21:24:05 crc kubenswrapper[4885]: I0308 21:24:05.285268 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550084-9r2m6" Mar 08 21:24:05 crc kubenswrapper[4885]: I0308 21:24:05.871743 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550078-gspxv"] Mar 08 21:24:05 crc kubenswrapper[4885]: I0308 21:24:05.882740 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550078-gspxv"] Mar 08 21:24:07 crc kubenswrapper[4885]: I0308 21:24:07.382938 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f0d7d35-3a1c-4505-8fa3-190d8ec038ee" path="/var/lib/kubelet/pods/1f0d7d35-3a1c-4505-8fa3-190d8ec038ee/volumes" Mar 08 21:24:10 crc kubenswrapper[4885]: I0308 21:24:10.764004 4885 scope.go:117] "RemoveContainer" containerID="cdd1e2f0b8ab5f05aa992613d4a0e2df88f958ed88a6e2f7da9cadebeb33bfdc" Mar 08 21:24:10 crc kubenswrapper[4885]: I0308 21:24:10.854223 4885 scope.go:117] "RemoveContainer" containerID="27c7130d460aa9b10cdac4b0fbc1bf5fbafc6534f511b76384c6bd5cf7ea008a" Mar 08 21:24:32 crc kubenswrapper[4885]: I0308 21:24:32.818163 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:24:32 crc kubenswrapper[4885]: I0308 21:24:32.818735 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:24:42 crc kubenswrapper[4885]: I0308 21:24:42.958723 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7w5nj"] Mar 08 21:24:42 crc kubenswrapper[4885]: E0308 21:24:42.960126 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d2e43f9-d4b1-4059-b714-26745e0d96ce" containerName="oc" Mar 08 21:24:42 crc kubenswrapper[4885]: I0308 21:24:42.960149 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d2e43f9-d4b1-4059-b714-26745e0d96ce" containerName="oc" Mar 08 21:24:42 crc kubenswrapper[4885]: I0308 21:24:42.960508 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d2e43f9-d4b1-4059-b714-26745e0d96ce" containerName="oc" Mar 08 21:24:42 crc kubenswrapper[4885]: 
I0308 21:24:42.963269 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:42 crc kubenswrapper[4885]: I0308 21:24:42.973317 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7w5nj"] Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.041406 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-catalog-content\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.041725 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-utilities\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.041832 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrst7\" (UniqueName: \"kubernetes.io/projected/cded65d6-06d8-49c9-8bc3-0223d72ee23c-kube-api-access-xrst7\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.144121 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-catalog-content\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.144206 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-utilities\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.144344 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrst7\" (UniqueName: \"kubernetes.io/projected/cded65d6-06d8-49c9-8bc3-0223d72ee23c-kube-api-access-xrst7\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.144661 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-catalog-content\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.144905 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-utilities\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc 
kubenswrapper[4885]: I0308 21:24:43.166534 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrst7\" (UniqueName: \"kubernetes.io/projected/cded65d6-06d8-49c9-8bc3-0223d72ee23c-kube-api-access-xrst7\") pod \"community-operators-7w5nj\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.307220 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:43 crc kubenswrapper[4885]: I0308 21:24:43.886089 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7w5nj"] Mar 08 21:24:44 crc kubenswrapper[4885]: I0308 21:24:44.820732 4885 generic.go:334] "Generic (PLEG): container finished" podID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerID="9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574" exitCode=0 Mar 08 21:24:44 crc kubenswrapper[4885]: I0308 21:24:44.820808 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w5nj" event={"ID":"cded65d6-06d8-49c9-8bc3-0223d72ee23c","Type":"ContainerDied","Data":"9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574"} Mar 08 21:24:44 crc kubenswrapper[4885]: I0308 21:24:44.821023 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w5nj" event={"ID":"cded65d6-06d8-49c9-8bc3-0223d72ee23c","Type":"ContainerStarted","Data":"da4afd917b03cf9f273dbbf6900868bc97461a3068639de562d9b471c1e46179"} Mar 08 21:24:45 crc kubenswrapper[4885]: I0308 21:24:45.846432 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w5nj" event={"ID":"cded65d6-06d8-49c9-8bc3-0223d72ee23c","Type":"ContainerStarted","Data":"2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee"} Mar 08 21:24:47 crc kubenswrapper[4885]: I0308 21:24:47.874214 4885 generic.go:334] "Generic (PLEG): container finished" podID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerID="2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee" exitCode=0 Mar 08 21:24:47 crc kubenswrapper[4885]: I0308 21:24:47.874300 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w5nj" event={"ID":"cded65d6-06d8-49c9-8bc3-0223d72ee23c","Type":"ContainerDied","Data":"2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee"} Mar 08 21:24:48 crc kubenswrapper[4885]: I0308 21:24:48.888995 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w5nj" event={"ID":"cded65d6-06d8-49c9-8bc3-0223d72ee23c","Type":"ContainerStarted","Data":"fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653"} Mar 08 21:24:48 crc kubenswrapper[4885]: I0308 21:24:48.930306 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7w5nj" podStartSLOduration=3.403980383 podStartE2EDuration="6.930269751s" podCreationTimestamp="2026-03-08 21:24:42 +0000 UTC" firstStartedPulling="2026-03-08 21:24:44.823105821 +0000 UTC m=+6786.219159844" lastFinishedPulling="2026-03-08 21:24:48.349395189 +0000 UTC m=+6789.745449212" observedRunningTime="2026-03-08 21:24:48.924382164 +0000 UTC m=+6790.320436217" watchObservedRunningTime="2026-03-08 21:24:48.930269751 +0000 UTC m=+6790.326323804" Mar 08 21:24:53 crc kubenswrapper[4885]: I0308 
21:24:53.308127 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:53 crc kubenswrapper[4885]: I0308 21:24:53.308733 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:53 crc kubenswrapper[4885]: I0308 21:24:53.363370 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:54 crc kubenswrapper[4885]: I0308 21:24:54.005059 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:54 crc kubenswrapper[4885]: I0308 21:24:54.057892 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7w5nj"] Mar 08 21:24:55 crc kubenswrapper[4885]: I0308 21:24:55.960356 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7w5nj" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="registry-server" containerID="cri-o://fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653" gracePeriod=2 Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.509299 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.571562 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-utilities\") pod \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.571632 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-catalog-content\") pod \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.571823 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrst7\" (UniqueName: \"kubernetes.io/projected/cded65d6-06d8-49c9-8bc3-0223d72ee23c-kube-api-access-xrst7\") pod \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\" (UID: \"cded65d6-06d8-49c9-8bc3-0223d72ee23c\") " Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.574236 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-utilities" (OuterVolumeSpecName: "utilities") pod "cded65d6-06d8-49c9-8bc3-0223d72ee23c" (UID: "cded65d6-06d8-49c9-8bc3-0223d72ee23c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.582186 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cded65d6-06d8-49c9-8bc3-0223d72ee23c-kube-api-access-xrst7" (OuterVolumeSpecName: "kube-api-access-xrst7") pod "cded65d6-06d8-49c9-8bc3-0223d72ee23c" (UID: "cded65d6-06d8-49c9-8bc3-0223d72ee23c"). InnerVolumeSpecName "kube-api-access-xrst7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.648681 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cded65d6-06d8-49c9-8bc3-0223d72ee23c" (UID: "cded65d6-06d8-49c9-8bc3-0223d72ee23c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.674668 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrst7\" (UniqueName: \"kubernetes.io/projected/cded65d6-06d8-49c9-8bc3-0223d72ee23c-kube-api-access-xrst7\") on node \"crc\" DevicePath \"\"" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.674700 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.674711 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cded65d6-06d8-49c9-8bc3-0223d72ee23c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.975676 4885 generic.go:334] "Generic (PLEG): container finished" podID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerID="fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653" exitCode=0 Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.975746 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w5nj" event={"ID":"cded65d6-06d8-49c9-8bc3-0223d72ee23c","Type":"ContainerDied","Data":"fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653"} Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.975800 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7w5nj" event={"ID":"cded65d6-06d8-49c9-8bc3-0223d72ee23c","Type":"ContainerDied","Data":"da4afd917b03cf9f273dbbf6900868bc97461a3068639de562d9b471c1e46179"} Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.975758 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7w5nj" Mar 08 21:24:56 crc kubenswrapper[4885]: I0308 21:24:56.975823 4885 scope.go:117] "RemoveContainer" containerID="fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.027313 4885 scope.go:117] "RemoveContainer" containerID="2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.029776 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7w5nj"] Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.041314 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7w5nj"] Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.064100 4885 scope.go:117] "RemoveContainer" containerID="9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.105201 4885 scope.go:117] "RemoveContainer" containerID="fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653" Mar 08 21:24:57 crc kubenswrapper[4885]: E0308 21:24:57.105950 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653\": container with ID starting with fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653 not found: ID does not exist" containerID="fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.106006 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653"} err="failed to get container status \"fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653\": rpc error: code = NotFound desc = could not find container \"fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653\": container with ID starting with fd63bdf94e8a7499008bf9d31360d5d43d34f085e52153d20181235fb911a653 not found: ID does not exist" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.106037 4885 scope.go:117] "RemoveContainer" containerID="2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee" Mar 08 21:24:57 crc kubenswrapper[4885]: E0308 21:24:57.106870 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee\": container with ID starting with 2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee not found: ID does not exist" containerID="2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.106946 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee"} err="failed to get container status \"2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee\": rpc error: code = NotFound desc = could not find container \"2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee\": container with ID starting with 2f32b476bb719256801cf9f0d6a9b475615c0ee8c225e7f0f30263895d91a9ee not found: ID does not exist" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.106981 4885 scope.go:117] "RemoveContainer" 
containerID="9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574" Mar 08 21:24:57 crc kubenswrapper[4885]: E0308 21:24:57.107447 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574\": container with ID starting with 9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574 not found: ID does not exist" containerID="9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.107480 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574"} err="failed to get container status \"9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574\": rpc error: code = NotFound desc = could not find container \"9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574\": container with ID starting with 9b35645d3d6fb5a6362978a453dd23c6c8b98921f24a2c0126ca5a6931a26574 not found: ID does not exist" Mar 08 21:24:57 crc kubenswrapper[4885]: I0308 21:24:57.395831 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" path="/var/lib/kubelet/pods/cded65d6-06d8-49c9-8bc3-0223d72ee23c/volumes" Mar 08 21:25:02 crc kubenswrapper[4885]: I0308 21:25:02.817671 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:25:02 crc kubenswrapper[4885]: I0308 21:25:02.818844 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:25:02 crc kubenswrapper[4885]: I0308 21:25:02.819148 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:25:02 crc kubenswrapper[4885]: I0308 21:25:02.819977 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:25:02 crc kubenswrapper[4885]: I0308 21:25:02.820092 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" gracePeriod=600 Mar 08 21:25:02 crc kubenswrapper[4885]: E0308 21:25:02.938267 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:25:03 crc kubenswrapper[4885]: I0308 21:25:03.042282 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" exitCode=0 Mar 08 21:25:03 crc kubenswrapper[4885]: I0308 21:25:03.042321 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937"} Mar 08 21:25:03 crc kubenswrapper[4885]: I0308 21:25:03.042349 4885 scope.go:117] "RemoveContainer" containerID="17d10d0a74a5cbd79193584e373306e9a6f05fd494997dbeda172a5bfdd668a3" Mar 08 21:25:03 crc kubenswrapper[4885]: I0308 21:25:03.042991 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:25:03 crc kubenswrapper[4885]: E0308 21:25:03.043273 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:25:14 crc kubenswrapper[4885]: I0308 21:25:14.368721 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:25:14 crc kubenswrapper[4885]: E0308 21:25:14.369522 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:25:29 crc kubenswrapper[4885]: I0308 21:25:29.382252 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:25:29 crc kubenswrapper[4885]: E0308 21:25:29.383149 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:25:41 crc kubenswrapper[4885]: I0308 21:25:41.369045 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:25:41 crc kubenswrapper[4885]: E0308 21:25:41.370699 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:25:55 crc kubenswrapper[4885]: I0308 21:25:55.370779 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:25:55 crc kubenswrapper[4885]: E0308 21:25:55.372108 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:25:56 crc kubenswrapper[4885]: I0308 21:25:56.066602 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-148a-account-create-update-vw6hm"] Mar 08 21:25:56 crc kubenswrapper[4885]: I0308 21:25:56.083599 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-jfnt7"] Mar 08 21:25:56 crc kubenswrapper[4885]: I0308 21:25:56.098851 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-jfnt7"] Mar 08 21:25:56 crc kubenswrapper[4885]: I0308 21:25:56.109868 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-148a-account-create-update-vw6hm"] Mar 08 21:25:57 crc kubenswrapper[4885]: I0308 21:25:57.388944 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92cde86b-0d50-444d-b116-e32fbf5004f9" path="/var/lib/kubelet/pods/92cde86b-0d50-444d-b116-e32fbf5004f9/volumes" Mar 08 21:25:57 crc kubenswrapper[4885]: I0308 21:25:57.392748 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b454a1c4-958a-40a9-8c50-9154281574fd" path="/var/lib/kubelet/pods/b454a1c4-958a-40a9-8c50-9154281574fd/volumes" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.175873 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550086-vhcf6"] Mar 08 21:26:00 crc kubenswrapper[4885]: E0308 21:26:00.176883 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="registry-server" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.176899 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="registry-server" Mar 08 21:26:00 crc kubenswrapper[4885]: E0308 21:26:00.176952 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="extract-content" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.176962 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="extract-content" Mar 08 21:26:00 crc kubenswrapper[4885]: E0308 21:26:00.176992 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="extract-utilities" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.177001 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="extract-utilities" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.177260 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cded65d6-06d8-49c9-8bc3-0223d72ee23c" containerName="registry-server" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.178159 4885 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550086-vhcf6" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.181279 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.181628 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.182556 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.187658 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550086-vhcf6"] Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.357468 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z448l\" (UniqueName: \"kubernetes.io/projected/e5e490c3-347d-4c4b-aa26-3f680e0bebc0-kube-api-access-z448l\") pod \"auto-csr-approver-29550086-vhcf6\" (UID: \"e5e490c3-347d-4c4b-aa26-3f680e0bebc0\") " pod="openshift-infra/auto-csr-approver-29550086-vhcf6" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.459543 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z448l\" (UniqueName: \"kubernetes.io/projected/e5e490c3-347d-4c4b-aa26-3f680e0bebc0-kube-api-access-z448l\") pod \"auto-csr-approver-29550086-vhcf6\" (UID: \"e5e490c3-347d-4c4b-aa26-3f680e0bebc0\") " pod="openshift-infra/auto-csr-approver-29550086-vhcf6" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.488784 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z448l\" (UniqueName: \"kubernetes.io/projected/e5e490c3-347d-4c4b-aa26-3f680e0bebc0-kube-api-access-z448l\") pod \"auto-csr-approver-29550086-vhcf6\" (UID: \"e5e490c3-347d-4c4b-aa26-3f680e0bebc0\") " pod="openshift-infra/auto-csr-approver-29550086-vhcf6" Mar 08 21:26:00 crc kubenswrapper[4885]: I0308 21:26:00.514206 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550086-vhcf6" Mar 08 21:26:01 crc kubenswrapper[4885]: I0308 21:26:01.111428 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550086-vhcf6"] Mar 08 21:26:01 crc kubenswrapper[4885]: I0308 21:26:01.119229 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:26:01 crc kubenswrapper[4885]: I0308 21:26:01.770710 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550086-vhcf6" event={"ID":"e5e490c3-347d-4c4b-aa26-3f680e0bebc0","Type":"ContainerStarted","Data":"68d9980f62d4954a3285c86bb44d8da1bd0eef319852e66579b38c24d2a25bce"} Mar 08 21:26:02 crc kubenswrapper[4885]: I0308 21:26:02.784710 4885 generic.go:334] "Generic (PLEG): container finished" podID="e5e490c3-347d-4c4b-aa26-3f680e0bebc0" containerID="a2343f387ee044683e4b5c10c184262a0a1ddcb6bceb9b6768e6cef7d9c4c637" exitCode=0 Mar 08 21:26:02 crc kubenswrapper[4885]: I0308 21:26:02.784795 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550086-vhcf6" event={"ID":"e5e490c3-347d-4c4b-aa26-3f680e0bebc0","Type":"ContainerDied","Data":"a2343f387ee044683e4b5c10c184262a0a1ddcb6bceb9b6768e6cef7d9c4c637"} Mar 08 21:26:04 crc kubenswrapper[4885]: I0308 21:26:04.241903 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550086-vhcf6" Mar 08 21:26:04 crc kubenswrapper[4885]: I0308 21:26:04.362895 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z448l\" (UniqueName: \"kubernetes.io/projected/e5e490c3-347d-4c4b-aa26-3f680e0bebc0-kube-api-access-z448l\") pod \"e5e490c3-347d-4c4b-aa26-3f680e0bebc0\" (UID: \"e5e490c3-347d-4c4b-aa26-3f680e0bebc0\") " Mar 08 21:26:04 crc kubenswrapper[4885]: I0308 21:26:04.373244 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5e490c3-347d-4c4b-aa26-3f680e0bebc0-kube-api-access-z448l" (OuterVolumeSpecName: "kube-api-access-z448l") pod "e5e490c3-347d-4c4b-aa26-3f680e0bebc0" (UID: "e5e490c3-347d-4c4b-aa26-3f680e0bebc0"). InnerVolumeSpecName "kube-api-access-z448l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:26:04 crc kubenswrapper[4885]: I0308 21:26:04.465941 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z448l\" (UniqueName: \"kubernetes.io/projected/e5e490c3-347d-4c4b-aa26-3f680e0bebc0-kube-api-access-z448l\") on node \"crc\" DevicePath \"\"" Mar 08 21:26:04 crc kubenswrapper[4885]: I0308 21:26:04.811497 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550086-vhcf6" event={"ID":"e5e490c3-347d-4c4b-aa26-3f680e0bebc0","Type":"ContainerDied","Data":"68d9980f62d4954a3285c86bb44d8da1bd0eef319852e66579b38c24d2a25bce"} Mar 08 21:26:04 crc kubenswrapper[4885]: I0308 21:26:04.811565 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68d9980f62d4954a3285c86bb44d8da1bd0eef319852e66579b38c24d2a25bce" Mar 08 21:26:04 crc kubenswrapper[4885]: I0308 21:26:04.811616 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550086-vhcf6" Mar 08 21:26:05 crc kubenswrapper[4885]: I0308 21:26:05.343945 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550080-q8z87"] Mar 08 21:26:05 crc kubenswrapper[4885]: I0308 21:26:05.358053 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550080-q8z87"] Mar 08 21:26:05 crc kubenswrapper[4885]: I0308 21:26:05.382369 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="099da518-0e8c-4661-86bf-efcce5fd4f59" path="/var/lib/kubelet/pods/099da518-0e8c-4661-86bf-efcce5fd4f59/volumes" Mar 08 21:26:07 crc kubenswrapper[4885]: I0308 21:26:07.369384 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:26:07 crc kubenswrapper[4885]: E0308 21:26:07.370024 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:26:09 crc kubenswrapper[4885]: I0308 21:26:09.036440 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-hj6ng"] Mar 08 21:26:09 crc kubenswrapper[4885]: I0308 21:26:09.048342 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-hj6ng"] Mar 08 21:26:09 crc kubenswrapper[4885]: I0308 21:26:09.388691 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddddf0b1-83be-4ebb-8318-9d40522a3efb" path="/var/lib/kubelet/pods/ddddf0b1-83be-4ebb-8318-9d40522a3efb/volumes" Mar 08 21:26:11 crc kubenswrapper[4885]: I0308 21:26:11.045966 4885 scope.go:117] "RemoveContainer" containerID="a35a9b66ff3babcb2662995c86d66b4cb67b7df0bec572a2fefb5352c1e090cb" Mar 08 21:26:11 crc kubenswrapper[4885]: I0308 21:26:11.095187 4885 scope.go:117] "RemoveContainer" containerID="50f56d5baf9ae0368c43d2d5d7b045c4c64547d6f51fe21432b96e232f3f2393" Mar 08 21:26:11 crc kubenswrapper[4885]: I0308 21:26:11.167401 4885 scope.go:117] "RemoveContainer" containerID="870ff46cb1f6250fba56c9497a2a58f99777f85302f8adb2a09cd3289b27392e" Mar 08 21:26:11 crc kubenswrapper[4885]: I0308 21:26:11.209177 4885 scope.go:117] "RemoveContainer" containerID="34b50f5f966e37811bb8a32ad3d6e1abb40b701290a57e1661e950c1bc924933" Mar 08 21:26:22 crc kubenswrapper[4885]: I0308 21:26:22.369472 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:26:22 crc kubenswrapper[4885]: E0308 21:26:22.370579 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:26:33 crc kubenswrapper[4885]: I0308 21:26:33.036437 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-wmgbb"] Mar 08 21:26:33 crc kubenswrapper[4885]: I0308 21:26:33.049557 4885 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/manila-b45a-account-create-update-zt9mb"] Mar 08 21:26:33 crc kubenswrapper[4885]: I0308 21:26:33.060312 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-wmgbb"] Mar 08 21:26:33 crc kubenswrapper[4885]: I0308 21:26:33.069986 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-b45a-account-create-update-zt9mb"] Mar 08 21:26:33 crc kubenswrapper[4885]: I0308 21:26:33.368362 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:26:33 crc kubenswrapper[4885]: E0308 21:26:33.368622 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:26:33 crc kubenswrapper[4885]: I0308 21:26:33.379533 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="451cc09f-d6aa-4930-be69-102ce5b86575" path="/var/lib/kubelet/pods/451cc09f-d6aa-4930-be69-102ce5b86575/volumes" Mar 08 21:26:33 crc kubenswrapper[4885]: I0308 21:26:33.380583 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="835eb61f-3559-41d5-9891-23a6ecef9ed1" path="/var/lib/kubelet/pods/835eb61f-3559-41d5-9891-23a6ecef9ed1/volumes" Mar 08 21:26:45 crc kubenswrapper[4885]: I0308 21:26:45.369092 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:26:45 crc kubenswrapper[4885]: E0308 21:26:45.370768 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:26:46 crc kubenswrapper[4885]: I0308 21:26:46.056234 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-gtv5s"] Mar 08 21:26:46 crc kubenswrapper[4885]: I0308 21:26:46.066810 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-gtv5s"] Mar 08 21:26:47 crc kubenswrapper[4885]: I0308 21:26:47.395077 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4393565c-775a-48fd-a497-602a556ff169" path="/var/lib/kubelet/pods/4393565c-775a-48fd-a497-602a556ff169/volumes" Mar 08 21:26:58 crc kubenswrapper[4885]: I0308 21:26:58.369005 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:26:58 crc kubenswrapper[4885]: E0308 21:26:58.370226 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:27:11 crc kubenswrapper[4885]: I0308 21:27:11.372791 
4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:27:11 crc kubenswrapper[4885]: E0308 21:27:11.374224 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:27:11 crc kubenswrapper[4885]: I0308 21:27:11.374424 4885 scope.go:117] "RemoveContainer" containerID="8cfe6ba1dd8d427385a1015c78367bf6a932fa6920ddcb36d2679cfdab2e9416" Mar 08 21:27:11 crc kubenswrapper[4885]: I0308 21:27:11.440969 4885 scope.go:117] "RemoveContainer" containerID="ff2590b431e04ce466ce231540eb022968990845b1b8a9f29903a084f907a810" Mar 08 21:27:11 crc kubenswrapper[4885]: I0308 21:27:11.515061 4885 scope.go:117] "RemoveContainer" containerID="ab2bac58c78cebfa3dc65d3179c712fb4a25e9ae89fdc3f09281d9b68706ac0c" Mar 08 21:27:23 crc kubenswrapper[4885]: I0308 21:27:23.369295 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:27:23 crc kubenswrapper[4885]: E0308 21:27:23.370734 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:27:34 crc kubenswrapper[4885]: I0308 21:27:34.368652 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:27:34 crc kubenswrapper[4885]: E0308 21:27:34.369734 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:27:48 crc kubenswrapper[4885]: I0308 21:27:48.368897 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:27:48 crc kubenswrapper[4885]: E0308 21:27:48.370587 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.159074 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550088-xzfvm"] Mar 08 21:28:00 crc kubenswrapper[4885]: E0308 21:28:00.160323 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e490c3-347d-4c4b-aa26-3f680e0bebc0" containerName="oc" Mar 08 21:28:00 crc 
kubenswrapper[4885]: I0308 21:28:00.160349 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e490c3-347d-4c4b-aa26-3f680e0bebc0" containerName="oc" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.160744 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e490c3-347d-4c4b-aa26-3f680e0bebc0" containerName="oc" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.161997 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550088-xzfvm" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.164390 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.164610 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.168503 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.180369 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550088-xzfvm"] Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.262720 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhpwt\" (UniqueName: \"kubernetes.io/projected/ec092c86-e0c8-415e-bfe7-80914fe8ce5b-kube-api-access-fhpwt\") pod \"auto-csr-approver-29550088-xzfvm\" (UID: \"ec092c86-e0c8-415e-bfe7-80914fe8ce5b\") " pod="openshift-infra/auto-csr-approver-29550088-xzfvm" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.363737 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhpwt\" (UniqueName: \"kubernetes.io/projected/ec092c86-e0c8-415e-bfe7-80914fe8ce5b-kube-api-access-fhpwt\") pod \"auto-csr-approver-29550088-xzfvm\" (UID: \"ec092c86-e0c8-415e-bfe7-80914fe8ce5b\") " pod="openshift-infra/auto-csr-approver-29550088-xzfvm" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.384880 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhpwt\" (UniqueName: \"kubernetes.io/projected/ec092c86-e0c8-415e-bfe7-80914fe8ce5b-kube-api-access-fhpwt\") pod \"auto-csr-approver-29550088-xzfvm\" (UID: \"ec092c86-e0c8-415e-bfe7-80914fe8ce5b\") " pod="openshift-infra/auto-csr-approver-29550088-xzfvm" Mar 08 21:28:00 crc kubenswrapper[4885]: I0308 21:28:00.518189 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550088-xzfvm" Mar 08 21:28:01 crc kubenswrapper[4885]: I0308 21:28:01.020318 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550088-xzfvm"] Mar 08 21:28:01 crc kubenswrapper[4885]: I0308 21:28:01.273982 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550088-xzfvm" event={"ID":"ec092c86-e0c8-415e-bfe7-80914fe8ce5b","Type":"ContainerStarted","Data":"83e7b762cd3ceb0476f382df7f8d7a84d710ce4ecb9e0f28389e5bd476e95b52"} Mar 08 21:28:02 crc kubenswrapper[4885]: I0308 21:28:02.369192 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:28:02 crc kubenswrapper[4885]: E0308 21:28:02.370610 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:28:03 crc kubenswrapper[4885]: I0308 21:28:03.301720 4885 generic.go:334] "Generic (PLEG): container finished" podID="ec092c86-e0c8-415e-bfe7-80914fe8ce5b" containerID="38151e84b4b7428445936ba0b4e7f51f9bdc2be5ab2ec1353c272510a10895bd" exitCode=0 Mar 08 21:28:03 crc kubenswrapper[4885]: I0308 21:28:03.301809 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550088-xzfvm" event={"ID":"ec092c86-e0c8-415e-bfe7-80914fe8ce5b","Type":"ContainerDied","Data":"38151e84b4b7428445936ba0b4e7f51f9bdc2be5ab2ec1353c272510a10895bd"} Mar 08 21:28:04 crc kubenswrapper[4885]: I0308 21:28:04.747108 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550088-xzfvm" Mar 08 21:28:04 crc kubenswrapper[4885]: I0308 21:28:04.761847 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhpwt\" (UniqueName: \"kubernetes.io/projected/ec092c86-e0c8-415e-bfe7-80914fe8ce5b-kube-api-access-fhpwt\") pod \"ec092c86-e0c8-415e-bfe7-80914fe8ce5b\" (UID: \"ec092c86-e0c8-415e-bfe7-80914fe8ce5b\") " Mar 08 21:28:04 crc kubenswrapper[4885]: I0308 21:28:04.771992 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec092c86-e0c8-415e-bfe7-80914fe8ce5b-kube-api-access-fhpwt" (OuterVolumeSpecName: "kube-api-access-fhpwt") pod "ec092c86-e0c8-415e-bfe7-80914fe8ce5b" (UID: "ec092c86-e0c8-415e-bfe7-80914fe8ce5b"). InnerVolumeSpecName "kube-api-access-fhpwt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:28:04 crc kubenswrapper[4885]: I0308 21:28:04.865078 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhpwt\" (UniqueName: \"kubernetes.io/projected/ec092c86-e0c8-415e-bfe7-80914fe8ce5b-kube-api-access-fhpwt\") on node \"crc\" DevicePath \"\"" Mar 08 21:28:05 crc kubenswrapper[4885]: I0308 21:28:05.327221 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550088-xzfvm" event={"ID":"ec092c86-e0c8-415e-bfe7-80914fe8ce5b","Type":"ContainerDied","Data":"83e7b762cd3ceb0476f382df7f8d7a84d710ce4ecb9e0f28389e5bd476e95b52"} Mar 08 21:28:05 crc kubenswrapper[4885]: I0308 21:28:05.327645 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83e7b762cd3ceb0476f382df7f8d7a84d710ce4ecb9e0f28389e5bd476e95b52" Mar 08 21:28:05 crc kubenswrapper[4885]: I0308 21:28:05.327321 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550088-xzfvm" Mar 08 21:28:05 crc kubenswrapper[4885]: I0308 21:28:05.835031 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550082-r8zzq"] Mar 08 21:28:05 crc kubenswrapper[4885]: I0308 21:28:05.843704 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550082-r8zzq"] Mar 08 21:28:07 crc kubenswrapper[4885]: I0308 21:28:07.387258 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b683b024-7ab4-40e6-9380-ad5f3c4c9751" path="/var/lib/kubelet/pods/b683b024-7ab4-40e6-9380-ad5f3c4c9751/volumes" Mar 08 21:28:11 crc kubenswrapper[4885]: I0308 21:28:11.646626 4885 scope.go:117] "RemoveContainer" containerID="d8208765f1f2335c1dc540c6cbadcec4aefebc9eff84e251842efe2b691b630b" Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.121372 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-727hb"] Mar 08 21:28:14 crc kubenswrapper[4885]: E0308 21:28:14.122462 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec092c86-e0c8-415e-bfe7-80914fe8ce5b" containerName="oc" Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.122477 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec092c86-e0c8-415e-bfe7-80914fe8ce5b" containerName="oc" Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.122734 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec092c86-e0c8-415e-bfe7-80914fe8ce5b" containerName="oc" Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.124610 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.148440 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-727hb"] Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.303907 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-catalog-content\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.304247 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz8md\" (UniqueName: \"kubernetes.io/projected/c690d770-1f1e-4e17-991c-4a7696a26cea-kube-api-access-dz8md\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.304329 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-utilities\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.406348 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-utilities\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.406583 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-catalog-content\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.407155 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-utilities\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.407353 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-catalog-content\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.407575 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz8md\" (UniqueName: \"kubernetes.io/projected/c690d770-1f1e-4e17-991c-4a7696a26cea-kube-api-access-dz8md\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.429809 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-dz8md\" (UniqueName: \"kubernetes.io/projected/c690d770-1f1e-4e17-991c-4a7696a26cea-kube-api-access-dz8md\") pod \"redhat-operators-727hb\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.447417 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:14 crc kubenswrapper[4885]: I0308 21:28:14.955394 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-727hb"] Mar 08 21:28:15 crc kubenswrapper[4885]: I0308 21:28:15.456940 4885 generic.go:334] "Generic (PLEG): container finished" podID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerID="8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc" exitCode=0 Mar 08 21:28:15 crc kubenswrapper[4885]: I0308 21:28:15.456988 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-727hb" event={"ID":"c690d770-1f1e-4e17-991c-4a7696a26cea","Type":"ContainerDied","Data":"8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc"} Mar 08 21:28:15 crc kubenswrapper[4885]: I0308 21:28:15.457219 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-727hb" event={"ID":"c690d770-1f1e-4e17-991c-4a7696a26cea","Type":"ContainerStarted","Data":"07f2ce8eb5cf4e9dbca7c1fd51d8cad03cbdd45dcff7bdcae0d172c480181377"} Mar 08 21:28:16 crc kubenswrapper[4885]: I0308 21:28:16.369609 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:28:16 crc kubenswrapper[4885]: E0308 21:28:16.370598 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:28:16 crc kubenswrapper[4885]: I0308 21:28:16.468502 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-727hb" event={"ID":"c690d770-1f1e-4e17-991c-4a7696a26cea","Type":"ContainerStarted","Data":"d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf"} Mar 08 21:28:22 crc kubenswrapper[4885]: I0308 21:28:22.542072 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-727hb" event={"ID":"c690d770-1f1e-4e17-991c-4a7696a26cea","Type":"ContainerDied","Data":"d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf"} Mar 08 21:28:22 crc kubenswrapper[4885]: I0308 21:28:22.542029 4885 generic.go:334] "Generic (PLEG): container finished" podID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerID="d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf" exitCode=0 Mar 08 21:28:24 crc kubenswrapper[4885]: I0308 21:28:24.574692 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-727hb" event={"ID":"c690d770-1f1e-4e17-991c-4a7696a26cea","Type":"ContainerStarted","Data":"a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10"} Mar 08 21:28:24 crc kubenswrapper[4885]: I0308 21:28:24.619508 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-727hb" podStartSLOduration=2.717009545 podStartE2EDuration="10.619486503s" podCreationTimestamp="2026-03-08 21:28:14 +0000 UTC" firstStartedPulling="2026-03-08 21:28:15.458436939 +0000 UTC m=+6996.854490962" lastFinishedPulling="2026-03-08 21:28:23.360913867 +0000 UTC m=+7004.756967920" observedRunningTime="2026-03-08 21:28:24.598911967 +0000 UTC m=+7005.994966040" watchObservedRunningTime="2026-03-08 21:28:24.619486503 +0000 UTC m=+7006.015540536" Mar 08 21:28:31 crc kubenswrapper[4885]: I0308 21:28:31.371451 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:28:31 crc kubenswrapper[4885]: E0308 21:28:31.372234 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:28:34 crc kubenswrapper[4885]: I0308 21:28:34.448135 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:34 crc kubenswrapper[4885]: I0308 21:28:34.449286 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:34 crc kubenswrapper[4885]: I0308 21:28:34.522888 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:34 crc kubenswrapper[4885]: I0308 21:28:34.804694 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:34 crc kubenswrapper[4885]: I0308 21:28:34.876513 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-727hb"] Mar 08 21:28:36 crc kubenswrapper[4885]: I0308 21:28:36.763280 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-727hb" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="registry-server" containerID="cri-o://a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10" gracePeriod=2 Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.326309 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.372602 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz8md\" (UniqueName: \"kubernetes.io/projected/c690d770-1f1e-4e17-991c-4a7696a26cea-kube-api-access-dz8md\") pod \"c690d770-1f1e-4e17-991c-4a7696a26cea\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.372732 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-catalog-content\") pod \"c690d770-1f1e-4e17-991c-4a7696a26cea\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.372778 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-utilities\") pod \"c690d770-1f1e-4e17-991c-4a7696a26cea\" (UID: \"c690d770-1f1e-4e17-991c-4a7696a26cea\") " Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.374252 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-utilities" (OuterVolumeSpecName: "utilities") pod "c690d770-1f1e-4e17-991c-4a7696a26cea" (UID: "c690d770-1f1e-4e17-991c-4a7696a26cea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.375695 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.383004 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c690d770-1f1e-4e17-991c-4a7696a26cea-kube-api-access-dz8md" (OuterVolumeSpecName: "kube-api-access-dz8md") pod "c690d770-1f1e-4e17-991c-4a7696a26cea" (UID: "c690d770-1f1e-4e17-991c-4a7696a26cea"). InnerVolumeSpecName "kube-api-access-dz8md". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.477586 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz8md\" (UniqueName: \"kubernetes.io/projected/c690d770-1f1e-4e17-991c-4a7696a26cea-kube-api-access-dz8md\") on node \"crc\" DevicePath \"\"" Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.510636 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c690d770-1f1e-4e17-991c-4a7696a26cea" (UID: "c690d770-1f1e-4e17-991c-4a7696a26cea"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.579438 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c690d770-1f1e-4e17-991c-4a7696a26cea-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.779548 4885 generic.go:334] "Generic (PLEG): container finished" podID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerID="a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10" exitCode=0 Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.779611 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-727hb" event={"ID":"c690d770-1f1e-4e17-991c-4a7696a26cea","Type":"ContainerDied","Data":"a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10"} Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.779674 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-727hb" Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.779705 4885 scope.go:117] "RemoveContainer" containerID="a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10" Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.779686 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-727hb" event={"ID":"c690d770-1f1e-4e17-991c-4a7696a26cea","Type":"ContainerDied","Data":"07f2ce8eb5cf4e9dbca7c1fd51d8cad03cbdd45dcff7bdcae0d172c480181377"} Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.831830 4885 scope.go:117] "RemoveContainer" containerID="d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf" Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.835505 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-727hb"] Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.846585 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-727hb"] Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.876623 4885 scope.go:117] "RemoveContainer" containerID="8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc" Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.937482 4885 scope.go:117] "RemoveContainer" containerID="a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10" Mar 08 21:28:37 crc kubenswrapper[4885]: E0308 21:28:37.938392 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10\": container with ID starting with a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10 not found: ID does not exist" containerID="a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10" Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.938464 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10"} err="failed to get container status \"a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10\": rpc error: code = NotFound desc = could not find container \"a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10\": container with ID starting with a19fbaf5ac49ea88782826e5224c3b00d48575d5d98e1ed9b7d169554aaabe10 not found: ID does not exist" Mar 08 21:28:37 crc 
kubenswrapper[4885]: I0308 21:28:37.938506 4885 scope.go:117] "RemoveContainer" containerID="d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf" Mar 08 21:28:37 crc kubenswrapper[4885]: E0308 21:28:37.939190 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf\": container with ID starting with d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf not found: ID does not exist" containerID="d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf" Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.939292 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf"} err="failed to get container status \"d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf\": rpc error: code = NotFound desc = could not find container \"d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf\": container with ID starting with d7b12f94fab847d24f6dac5a88881bf8aafb845228d15c03d77c8f05eab768cf not found: ID does not exist" Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.939381 4885 scope.go:117] "RemoveContainer" containerID="8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc" Mar 08 21:28:37 crc kubenswrapper[4885]: E0308 21:28:37.939974 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc\": container with ID starting with 8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc not found: ID does not exist" containerID="8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc" Mar 08 21:28:37 crc kubenswrapper[4885]: I0308 21:28:37.940037 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc"} err="failed to get container status \"8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc\": rpc error: code = NotFound desc = could not find container \"8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc\": container with ID starting with 8848e139df58f82b2a8be9a363ddf6f66ad032e472de6499dbd6a24bd7b1a2bc not found: ID does not exist" Mar 08 21:28:39 crc kubenswrapper[4885]: I0308 21:28:39.393684 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" path="/var/lib/kubelet/pods/c690d770-1f1e-4e17-991c-4a7696a26cea/volumes" Mar 08 21:28:46 crc kubenswrapper[4885]: I0308 21:28:46.368152 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:28:46 crc kubenswrapper[4885]: E0308 21:28:46.368875 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:28:58 crc kubenswrapper[4885]: I0308 21:28:58.369306 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" 
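[Illustrative aside, not part of the journal above.] Two record shapes dominate this excerpt: the pod_workers.go "Error syncing pod, skipping" entries emitted each time the sync loop retries the crash-looping machine-config-daemon container while its 5m back-off is in force, and the pod_startup_latency_tracker.go "Observed pod startup duration" entries for short-lived pods such as auto-csr-approver and redhat-operators. The following minimal Python sketch pulls those two record types out of a saved copy of this journal and summarizes them; "kubelet.log" is an assumed filename, and the regular expressions simply target the quoting visible in the records above.

import re
from collections import Counter

# "kubelet.log" is an assumed filename for a saved copy of the journal text above.
LOG_PATH = "kubelet.log"

# pod_workers.go "Error syncing pod" records: the CrashLoopBackOff reason appears inside
# err="...", and the owning pod follows as pod="namespace/name".
CRASHLOOP = re.compile(r'with CrashLoopBackOff.*?pod="([^"]+)"', re.DOTALL)

# pod_startup_latency_tracker.go records: pod name, SLO duration (seconds), end-to-end duration.
STARTUP = re.compile(
    r'"Observed pod startup duration" pod="([^"]+)"'
    r' podStartSLOduration=(\S+) podStartE2EDuration="([^"]+)"'
)

def summarize(path: str = LOG_PATH) -> None:
    with open(path, encoding="utf-8", errors="replace") as fh:
        text = fh.read()  # whole-file scan copes with records wrapped across lines

    backoffs = Counter(m.group(1) for m in CRASHLOOP.finditer(text))
    print("CrashLoopBackOff sync errors per pod:")
    for pod, count in backoffs.most_common():
        print(f"  {count:4d}  {pod}")

    print("\nObserved pod startup durations:")
    for pod, slo, e2e in STARTUP.findall(text):
        print(f"  {pod}: SLO={slo}s  end-to-end={e2e}")

if __name__ == "__main__":
    summarize()

Run against this excerpt, it would report the repeated back-off records for openshift-machine-config-operator/machine-config-daemon-ttb97 plus the two startup-duration records seen below (redhat-operators-727hb at roughly 10.6s end-to-end, bootstrap-openstack-openstack-cell1-bcvmz at roughly 2.3s). The journal resumes unchanged from here.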
Mar 08 21:28:58 crc kubenswrapper[4885]: E0308 21:28:58.370314 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:29:11 crc kubenswrapper[4885]: I0308 21:29:11.179683 4885 generic.go:334] "Generic (PLEG): container finished" podID="8be575f8-a741-4b5a-b7fa-c43e5dd65598" containerID="9a54c0c84a047f8db9c0abfba2cd8a399a25f81595ecb87af623107a04129487" exitCode=0 Mar 08 21:29:11 crc kubenswrapper[4885]: I0308 21:29:11.179842 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" event={"ID":"8be575f8-a741-4b5a-b7fa-c43e5dd65598","Type":"ContainerDied","Data":"9a54c0c84a047f8db9c0abfba2cd8a399a25f81595ecb87af623107a04129487"} Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.856649 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.978495 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ceph\") pod \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.978926 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-tripleo-cleanup-combined-ca-bundle\") pod \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.979092 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc9lt\" (UniqueName: \"kubernetes.io/projected/8be575f8-a741-4b5a-b7fa-c43e5dd65598-kube-api-access-pc9lt\") pod \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.979656 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-inventory\") pod \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.979721 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ssh-key-openstack-cell1\") pod \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\" (UID: \"8be575f8-a741-4b5a-b7fa-c43e5dd65598\") " Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.985098 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "8be575f8-a741-4b5a-b7fa-c43e5dd65598" (UID: "8be575f8-a741-4b5a-b7fa-c43e5dd65598"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.987068 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be575f8-a741-4b5a-b7fa-c43e5dd65598-kube-api-access-pc9lt" (OuterVolumeSpecName: "kube-api-access-pc9lt") pod "8be575f8-a741-4b5a-b7fa-c43e5dd65598" (UID: "8be575f8-a741-4b5a-b7fa-c43e5dd65598"). InnerVolumeSpecName "kube-api-access-pc9lt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:29:12 crc kubenswrapper[4885]: I0308 21:29:12.989280 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ceph" (OuterVolumeSpecName: "ceph") pod "8be575f8-a741-4b5a-b7fa-c43e5dd65598" (UID: "8be575f8-a741-4b5a-b7fa-c43e5dd65598"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.021129 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "8be575f8-a741-4b5a-b7fa-c43e5dd65598" (UID: "8be575f8-a741-4b5a-b7fa-c43e5dd65598"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.033554 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-inventory" (OuterVolumeSpecName: "inventory") pod "8be575f8-a741-4b5a-b7fa-c43e5dd65598" (UID: "8be575f8-a741-4b5a-b7fa-c43e5dd65598"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.082099 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc9lt\" (UniqueName: \"kubernetes.io/projected/8be575f8-a741-4b5a-b7fa-c43e5dd65598-kube-api-access-pc9lt\") on node \"crc\" DevicePath \"\"" Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.082137 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.082147 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.082159 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.082174 4885 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be575f8-a741-4b5a-b7fa-c43e5dd65598-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.206044 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" event={"ID":"8be575f8-a741-4b5a-b7fa-c43e5dd65598","Type":"ContainerDied","Data":"fcaf57e77e3aeefacc328413ba9ef9243f80683c97c18bb5d80337ab11bd45a0"} Mar 08 21:29:13 crc kubenswrapper[4885]: 
I0308 21:29:13.206102 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcaf57e77e3aeefacc328413ba9ef9243f80683c97c18bb5d80337ab11bd45a0" Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.206173 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6" Mar 08 21:29:13 crc kubenswrapper[4885]: I0308 21:29:13.369087 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:29:13 crc kubenswrapper[4885]: E0308 21:29:13.369415 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.367320 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-bcvmz"] Mar 08 21:29:15 crc kubenswrapper[4885]: E0308 21:29:15.368377 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="registry-server" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.368394 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="registry-server" Mar 08 21:29:15 crc kubenswrapper[4885]: E0308 21:29:15.368416 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be575f8-a741-4b5a-b7fa-c43e5dd65598" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.368426 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be575f8-a741-4b5a-b7fa-c43e5dd65598" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 08 21:29:15 crc kubenswrapper[4885]: E0308 21:29:15.368483 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="extract-content" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.368491 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="extract-content" Mar 08 21:29:15 crc kubenswrapper[4885]: E0308 21:29:15.368514 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="extract-utilities" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.368523 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="extract-utilities" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.368754 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c690d770-1f1e-4e17-991c-4a7696a26cea" containerName="registry-server" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.368777 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8be575f8-a741-4b5a-b7fa-c43e5dd65598" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.381601 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-bcvmz"] Mar 08 21:29:15 crc kubenswrapper[4885]: 
I0308 21:29:15.381757 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.388417 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.388598 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.389440 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.389583 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.541181 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.541427 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ceph\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.541607 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.541635 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-inventory\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.541817 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh6cn\" (UniqueName: \"kubernetes.io/projected/51b71742-3986-42a4-a016-eeecb3a7ba16-kube-api-access-gh6cn\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.643377 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-inventory\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.643428 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.643544 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh6cn\" (UniqueName: \"kubernetes.io/projected/51b71742-3986-42a4-a016-eeecb3a7ba16-kube-api-access-gh6cn\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.643646 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.643677 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ceph\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.650520 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.650539 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-inventory\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.650822 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ceph\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.651976 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ssh-key-openstack-cell1\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.666006 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh6cn\" (UniqueName: \"kubernetes.io/projected/51b71742-3986-42a4-a016-eeecb3a7ba16-kube-api-access-gh6cn\") pod \"bootstrap-openstack-openstack-cell1-bcvmz\" (UID: 
\"51b71742-3986-42a4-a016-eeecb3a7ba16\") " pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:15 crc kubenswrapper[4885]: I0308 21:29:15.730856 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:29:16 crc kubenswrapper[4885]: I0308 21:29:16.296470 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-bcvmz"] Mar 08 21:29:17 crc kubenswrapper[4885]: I0308 21:29:17.253006 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" event={"ID":"51b71742-3986-42a4-a016-eeecb3a7ba16","Type":"ContainerStarted","Data":"406060aa77426fd158b89e1d5655fcca0b66ca3902f98a8c32bb3f98e32a6fac"} Mar 08 21:29:17 crc kubenswrapper[4885]: I0308 21:29:17.253715 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" event={"ID":"51b71742-3986-42a4-a016-eeecb3a7ba16","Type":"ContainerStarted","Data":"402a266dd7afc4a5f19cb1569e07873933799091a2ea37f980ac79de9b2acced"} Mar 08 21:29:17 crc kubenswrapper[4885]: I0308 21:29:17.276864 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" podStartSLOduration=1.714823956 podStartE2EDuration="2.276839959s" podCreationTimestamp="2026-03-08 21:29:15 +0000 UTC" firstStartedPulling="2026-03-08 21:29:16.305297864 +0000 UTC m=+7057.701351887" lastFinishedPulling="2026-03-08 21:29:16.867313837 +0000 UTC m=+7058.263367890" observedRunningTime="2026-03-08 21:29:17.273556532 +0000 UTC m=+7058.669610595" watchObservedRunningTime="2026-03-08 21:29:17.276839959 +0000 UTC m=+7058.672893982" Mar 08 21:29:25 crc kubenswrapper[4885]: I0308 21:29:25.369020 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:29:25 crc kubenswrapper[4885]: E0308 21:29:25.369849 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:29:38 crc kubenswrapper[4885]: I0308 21:29:38.367991 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:29:38 crc kubenswrapper[4885]: E0308 21:29:38.369667 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:29:52 crc kubenswrapper[4885]: I0308 21:29:52.368084 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:29:52 crc kubenswrapper[4885]: E0308 21:29:52.368906 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.167780 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550090-hww59"] Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.169665 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550090-hww59" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.173194 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.173439 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.173691 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.186956 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb"] Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.188709 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.190408 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.190654 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.199121 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550090-hww59"] Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.205602 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb"] Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.236443 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74tb7\" (UniqueName: \"kubernetes.io/projected/ab193869-7f8c-4475-8be4-393848bd54e3-kube-api-access-74tb7\") pod \"auto-csr-approver-29550090-hww59\" (UID: \"ab193869-7f8c-4475-8be4-393848bd54e3\") " pod="openshift-infra/auto-csr-approver-29550090-hww59" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.338302 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-config-volume\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.338359 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6s5t\" (UniqueName: \"kubernetes.io/projected/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-kube-api-access-b6s5t\") pod \"collect-profiles-29550090-qp6gb\" (UID: 
\"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.338684 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74tb7\" (UniqueName: \"kubernetes.io/projected/ab193869-7f8c-4475-8be4-393848bd54e3-kube-api-access-74tb7\") pod \"auto-csr-approver-29550090-hww59\" (UID: \"ab193869-7f8c-4475-8be4-393848bd54e3\") " pod="openshift-infra/auto-csr-approver-29550090-hww59" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.338935 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-secret-volume\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.359565 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74tb7\" (UniqueName: \"kubernetes.io/projected/ab193869-7f8c-4475-8be4-393848bd54e3-kube-api-access-74tb7\") pod \"auto-csr-approver-29550090-hww59\" (UID: \"ab193869-7f8c-4475-8be4-393848bd54e3\") " pod="openshift-infra/auto-csr-approver-29550090-hww59" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.441335 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6s5t\" (UniqueName: \"kubernetes.io/projected/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-kube-api-access-b6s5t\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.441795 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-secret-volume\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.441879 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-config-volume\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.442613 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-config-volume\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.446134 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-secret-volume\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.468352 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6s5t\" (UniqueName: \"kubernetes.io/projected/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-kube-api-access-b6s5t\") pod \"collect-profiles-29550090-qp6gb\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.518382 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550090-hww59" Mar 08 21:30:00 crc kubenswrapper[4885]: I0308 21:30:00.526240 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:01 crc kubenswrapper[4885]: I0308 21:30:01.042135 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb"] Mar 08 21:30:01 crc kubenswrapper[4885]: I0308 21:30:01.115760 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550090-hww59"] Mar 08 21:30:01 crc kubenswrapper[4885]: I0308 21:30:01.765674 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550090-hww59" event={"ID":"ab193869-7f8c-4475-8be4-393848bd54e3","Type":"ContainerStarted","Data":"bcc086b5f410e69108c6354eee341d85d4749c8318a849c927136f2ce32247ec"} Mar 08 21:30:01 crc kubenswrapper[4885]: I0308 21:30:01.768064 4885 generic.go:334] "Generic (PLEG): container finished" podID="c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" containerID="8ffeb3ea1d44ddbc8ed5f91dcd1d3740e5d0c398b63612136a09bb9296a735fb" exitCode=0 Mar 08 21:30:01 crc kubenswrapper[4885]: I0308 21:30:01.768174 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" event={"ID":"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba","Type":"ContainerDied","Data":"8ffeb3ea1d44ddbc8ed5f91dcd1d3740e5d0c398b63612136a09bb9296a735fb"} Mar 08 21:30:01 crc kubenswrapper[4885]: I0308 21:30:01.768226 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" event={"ID":"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba","Type":"ContainerStarted","Data":"d0205129c28fbc2f325be58a840e914ab363a820f495c414162e025d04e83d97"} Mar 08 21:30:02 crc kubenswrapper[4885]: I0308 21:30:02.781035 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550090-hww59" event={"ID":"ab193869-7f8c-4475-8be4-393848bd54e3","Type":"ContainerStarted","Data":"7eee2ffb4dea4a4b434fae8ad567627bf150b9abb9d76f55cab57ee721350700"} Mar 08 21:30:02 crc kubenswrapper[4885]: I0308 21:30:02.825906 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550090-hww59" podStartSLOduration=1.590647484 podStartE2EDuration="2.82588407s" podCreationTimestamp="2026-03-08 21:30:00 +0000 UTC" firstStartedPulling="2026-03-08 21:30:01.127701482 +0000 UTC m=+7102.523755515" lastFinishedPulling="2026-03-08 21:30:02.362938078 +0000 UTC m=+7103.758992101" observedRunningTime="2026-03-08 21:30:02.809059444 +0000 UTC m=+7104.205113467" watchObservedRunningTime="2026-03-08 21:30:02.82588407 +0000 UTC m=+7104.221938093" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.353989 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.416990 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-config-volume\") pod \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.417072 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-secret-volume\") pod \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.417126 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6s5t\" (UniqueName: \"kubernetes.io/projected/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-kube-api-access-b6s5t\") pod \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\" (UID: \"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba\") " Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.417882 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-config-volume" (OuterVolumeSpecName: "config-volume") pod "c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" (UID: "c2e12e20-b9a4-4fc7-8101-cd76f53c70ba"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.423982 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" (UID: "c2e12e20-b9a4-4fc7-8101-cd76f53c70ba"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.424116 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-kube-api-access-b6s5t" (OuterVolumeSpecName: "kube-api-access-b6s5t") pod "c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" (UID: "c2e12e20-b9a4-4fc7-8101-cd76f53c70ba"). InnerVolumeSpecName "kube-api-access-b6s5t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.520450 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.520491 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.520511 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6s5t\" (UniqueName: \"kubernetes.io/projected/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba-kube-api-access-b6s5t\") on node \"crc\" DevicePath \"\"" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.795588 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" event={"ID":"c2e12e20-b9a4-4fc7-8101-cd76f53c70ba","Type":"ContainerDied","Data":"d0205129c28fbc2f325be58a840e914ab363a820f495c414162e025d04e83d97"} Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.796106 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0205129c28fbc2f325be58a840e914ab363a820f495c414162e025d04e83d97" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.795612 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb" Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.798178 4885 generic.go:334] "Generic (PLEG): container finished" podID="ab193869-7f8c-4475-8be4-393848bd54e3" containerID="7eee2ffb4dea4a4b434fae8ad567627bf150b9abb9d76f55cab57ee721350700" exitCode=0 Mar 08 21:30:03 crc kubenswrapper[4885]: I0308 21:30:03.798232 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550090-hww59" event={"ID":"ab193869-7f8c-4475-8be4-393848bd54e3","Type":"ContainerDied","Data":"7eee2ffb4dea4a4b434fae8ad567627bf150b9abb9d76f55cab57ee721350700"} Mar 08 21:30:04 crc kubenswrapper[4885]: I0308 21:30:04.462019 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"] Mar 08 21:30:04 crc kubenswrapper[4885]: I0308 21:30:04.476197 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550045-jnm6r"] Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.262821 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550090-hww59" Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.364685 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74tb7\" (UniqueName: \"kubernetes.io/projected/ab193869-7f8c-4475-8be4-393848bd54e3-kube-api-access-74tb7\") pod \"ab193869-7f8c-4475-8be4-393848bd54e3\" (UID: \"ab193869-7f8c-4475-8be4-393848bd54e3\") " Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.369659 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab193869-7f8c-4475-8be4-393848bd54e3-kube-api-access-74tb7" (OuterVolumeSpecName: "kube-api-access-74tb7") pod "ab193869-7f8c-4475-8be4-393848bd54e3" (UID: "ab193869-7f8c-4475-8be4-393848bd54e3"). 
InnerVolumeSpecName "kube-api-access-74tb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.370090 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.387070 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="414af8b3-3809-477a-a110-9acaf82a7a3b" path="/var/lib/kubelet/pods/414af8b3-3809-477a-a110-9acaf82a7a3b/volumes" Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.467624 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74tb7\" (UniqueName: \"kubernetes.io/projected/ab193869-7f8c-4475-8be4-393848bd54e3-kube-api-access-74tb7\") on node \"crc\" DevicePath \"\"" Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.820207 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"eb4693f4eeb79088711f27b4882bee725d38950ce75255766be3668fb258c672"} Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.822455 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550090-hww59" event={"ID":"ab193869-7f8c-4475-8be4-393848bd54e3","Type":"ContainerDied","Data":"bcc086b5f410e69108c6354eee341d85d4749c8318a849c927136f2ce32247ec"} Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.822479 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcc086b5f410e69108c6354eee341d85d4749c8318a849c927136f2ce32247ec" Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.822504 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550090-hww59" Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.893615 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550084-9r2m6"] Mar 08 21:30:05 crc kubenswrapper[4885]: I0308 21:30:05.907558 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550084-9r2m6"] Mar 08 21:30:07 crc kubenswrapper[4885]: I0308 21:30:07.412722 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d2e43f9-d4b1-4059-b714-26745e0d96ce" path="/var/lib/kubelet/pods/3d2e43f9-d4b1-4059-b714-26745e0d96ce/volumes" Mar 08 21:30:11 crc kubenswrapper[4885]: I0308 21:30:11.812202 4885 scope.go:117] "RemoveContainer" containerID="62ad3a335e07200b3e1dfc3daa3934ac465add0682b6dca882716bb449686e0a" Mar 08 21:30:11 crc kubenswrapper[4885]: I0308 21:30:11.862720 4885 scope.go:117] "RemoveContainer" containerID="b81c31bbcbb29c0b9da44d4b1b46d19caf5f3201ab351fd09c58a29971e19359" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.581811 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gsqw2"] Mar 08 21:30:38 crc kubenswrapper[4885]: E0308 21:30:38.586447 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab193869-7f8c-4475-8be4-393848bd54e3" containerName="oc" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.586470 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab193869-7f8c-4475-8be4-393848bd54e3" containerName="oc" Mar 08 21:30:38 crc kubenswrapper[4885]: E0308 21:30:38.586512 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" containerName="collect-profiles" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.586524 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" containerName="collect-profiles" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.586800 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" containerName="collect-profiles" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.586826 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab193869-7f8c-4475-8be4-393848bd54e3" containerName="oc" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.588888 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.594287 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsqw2"] Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.769486 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ndmm\" (UniqueName: \"kubernetes.io/projected/746e5174-2bf4-4698-b846-9bf402677b6f-kube-api-access-6ndmm\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.769752 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-utilities\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.769787 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-catalog-content\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.872320 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ndmm\" (UniqueName: \"kubernetes.io/projected/746e5174-2bf4-4698-b846-9bf402677b6f-kube-api-access-6ndmm\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.872379 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-utilities\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.872424 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-catalog-content\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.872938 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-utilities\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.872968 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-catalog-content\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.904755 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6ndmm\" (UniqueName: \"kubernetes.io/projected/746e5174-2bf4-4698-b846-9bf402677b6f-kube-api-access-6ndmm\") pod \"redhat-marketplace-gsqw2\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:38 crc kubenswrapper[4885]: I0308 21:30:38.933290 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:39 crc kubenswrapper[4885]: I0308 21:30:39.429396 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsqw2"] Mar 08 21:30:40 crc kubenswrapper[4885]: I0308 21:30:40.276015 4885 generic.go:334] "Generic (PLEG): container finished" podID="746e5174-2bf4-4698-b846-9bf402677b6f" containerID="ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d" exitCode=0 Mar 08 21:30:40 crc kubenswrapper[4885]: I0308 21:30:40.276078 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsqw2" event={"ID":"746e5174-2bf4-4698-b846-9bf402677b6f","Type":"ContainerDied","Data":"ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d"} Mar 08 21:30:40 crc kubenswrapper[4885]: I0308 21:30:40.276424 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsqw2" event={"ID":"746e5174-2bf4-4698-b846-9bf402677b6f","Type":"ContainerStarted","Data":"a7e029066f7d884b7eac13ca0e4540905bf51badb3b42685af6bd3d5e88e7bf2"} Mar 08 21:30:41 crc kubenswrapper[4885]: I0308 21:30:41.286338 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsqw2" event={"ID":"746e5174-2bf4-4698-b846-9bf402677b6f","Type":"ContainerStarted","Data":"f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c"} Mar 08 21:30:42 crc kubenswrapper[4885]: I0308 21:30:42.300311 4885 generic.go:334] "Generic (PLEG): container finished" podID="746e5174-2bf4-4698-b846-9bf402677b6f" containerID="f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c" exitCode=0 Mar 08 21:30:42 crc kubenswrapper[4885]: I0308 21:30:42.300795 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsqw2" event={"ID":"746e5174-2bf4-4698-b846-9bf402677b6f","Type":"ContainerDied","Data":"f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c"} Mar 08 21:30:43 crc kubenswrapper[4885]: I0308 21:30:43.315387 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsqw2" event={"ID":"746e5174-2bf4-4698-b846-9bf402677b6f","Type":"ContainerStarted","Data":"cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf"} Mar 08 21:30:43 crc kubenswrapper[4885]: I0308 21:30:43.336312 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gsqw2" podStartSLOduration=2.89831784 podStartE2EDuration="5.336287651s" podCreationTimestamp="2026-03-08 21:30:38 +0000 UTC" firstStartedPulling="2026-03-08 21:30:40.281129393 +0000 UTC m=+7141.677183456" lastFinishedPulling="2026-03-08 21:30:42.719099244 +0000 UTC m=+7144.115153267" observedRunningTime="2026-03-08 21:30:43.33288698 +0000 UTC m=+7144.728941043" watchObservedRunningTime="2026-03-08 21:30:43.336287651 +0000 UTC m=+7144.732341684" Mar 08 21:30:48 crc kubenswrapper[4885]: I0308 21:30:48.933528 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:48 crc kubenswrapper[4885]: I0308 21:30:48.934158 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:48 crc kubenswrapper[4885]: I0308 21:30:48.979548 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:49 crc kubenswrapper[4885]: I0308 21:30:49.486408 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:49 crc kubenswrapper[4885]: I0308 21:30:49.559749 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsqw2"] Mar 08 21:30:51 crc kubenswrapper[4885]: I0308 21:30:51.410267 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gsqw2" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="registry-server" containerID="cri-o://cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf" gracePeriod=2 Mar 08 21:30:51 crc kubenswrapper[4885]: I0308 21:30:51.929800 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.009155 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ndmm\" (UniqueName: \"kubernetes.io/projected/746e5174-2bf4-4698-b846-9bf402677b6f-kube-api-access-6ndmm\") pod \"746e5174-2bf4-4698-b846-9bf402677b6f\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.009198 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-catalog-content\") pod \"746e5174-2bf4-4698-b846-9bf402677b6f\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.009269 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-utilities\") pod \"746e5174-2bf4-4698-b846-9bf402677b6f\" (UID: \"746e5174-2bf4-4698-b846-9bf402677b6f\") " Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.010777 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-utilities" (OuterVolumeSpecName: "utilities") pod "746e5174-2bf4-4698-b846-9bf402677b6f" (UID: "746e5174-2bf4-4698-b846-9bf402677b6f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.018476 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/746e5174-2bf4-4698-b846-9bf402677b6f-kube-api-access-6ndmm" (OuterVolumeSpecName: "kube-api-access-6ndmm") pod "746e5174-2bf4-4698-b846-9bf402677b6f" (UID: "746e5174-2bf4-4698-b846-9bf402677b6f"). InnerVolumeSpecName "kube-api-access-6ndmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.045364 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "746e5174-2bf4-4698-b846-9bf402677b6f" (UID: "746e5174-2bf4-4698-b846-9bf402677b6f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.111237 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ndmm\" (UniqueName: \"kubernetes.io/projected/746e5174-2bf4-4698-b846-9bf402677b6f-kube-api-access-6ndmm\") on node \"crc\" DevicePath \"\"" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.111465 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.111546 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/746e5174-2bf4-4698-b846-9bf402677b6f-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.419813 4885 generic.go:334] "Generic (PLEG): container finished" podID="746e5174-2bf4-4698-b846-9bf402677b6f" containerID="cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf" exitCode=0 Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.419870 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsqw2" event={"ID":"746e5174-2bf4-4698-b846-9bf402677b6f","Type":"ContainerDied","Data":"cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf"} Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.420119 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsqw2" event={"ID":"746e5174-2bf4-4698-b846-9bf402677b6f","Type":"ContainerDied","Data":"a7e029066f7d884b7eac13ca0e4540905bf51badb3b42685af6bd3d5e88e7bf2"} Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.419908 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsqw2" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.420139 4885 scope.go:117] "RemoveContainer" containerID="cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.447665 4885 scope.go:117] "RemoveContainer" containerID="f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.480082 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsqw2"] Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.490716 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsqw2"] Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.496996 4885 scope.go:117] "RemoveContainer" containerID="ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.551826 4885 scope.go:117] "RemoveContainer" containerID="cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf" Mar 08 21:30:52 crc kubenswrapper[4885]: E0308 21:30:52.552341 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf\": container with ID starting with cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf not found: ID does not exist" containerID="cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.552445 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf"} err="failed to get container status \"cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf\": rpc error: code = NotFound desc = could not find container \"cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf\": container with ID starting with cc6f37faccaf2c283f030df578d1d4d391026a7beb9893688e584810ed977abf not found: ID does not exist" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.552478 4885 scope.go:117] "RemoveContainer" containerID="f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c" Mar 08 21:30:52 crc kubenswrapper[4885]: E0308 21:30:52.553040 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c\": container with ID starting with f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c not found: ID does not exist" containerID="f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.553080 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c"} err="failed to get container status \"f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c\": rpc error: code = NotFound desc = could not find container \"f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c\": container with ID starting with f4450a2565098a72566d71efced33939f44045b780335b436121c20f5feb126c not found: ID does not exist" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.553107 4885 scope.go:117] "RemoveContainer" 
containerID="ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d" Mar 08 21:30:52 crc kubenswrapper[4885]: E0308 21:30:52.553412 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d\": container with ID starting with ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d not found: ID does not exist" containerID="ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d" Mar 08 21:30:52 crc kubenswrapper[4885]: I0308 21:30:52.553440 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d"} err="failed to get container status \"ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d\": rpc error: code = NotFound desc = could not find container \"ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d\": container with ID starting with ec6fcdde151632e3fd40c382198b5cb78cd64e393140d09d40633e336c6b7e5d not found: ID does not exist" Mar 08 21:30:53 crc kubenswrapper[4885]: I0308 21:30:53.383127 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" path="/var/lib/kubelet/pods/746e5174-2bf4-4698-b846-9bf402677b6f/volumes" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.210360 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xwfb7"] Mar 08 21:31:09 crc kubenswrapper[4885]: E0308 21:31:09.213139 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="extract-utilities" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.213262 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="extract-utilities" Mar 08 21:31:09 crc kubenswrapper[4885]: E0308 21:31:09.213381 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="registry-server" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.213459 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="registry-server" Mar 08 21:31:09 crc kubenswrapper[4885]: E0308 21:31:09.213548 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="extract-content" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.213627 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="extract-content" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.214013 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="746e5174-2bf4-4698-b846-9bf402677b6f" containerName="registry-server" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.216275 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.252021 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xwfb7"] Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.339778 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-utilities\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.339852 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-catalog-content\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.340050 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvbzs\" (UniqueName: \"kubernetes.io/projected/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-kube-api-access-lvbzs\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.442258 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-utilities\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.442628 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-catalog-content\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.442907 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvbzs\" (UniqueName: \"kubernetes.io/projected/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-kube-api-access-lvbzs\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.443160 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-utilities\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.445524 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-catalog-content\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.463685 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lvbzs\" (UniqueName: \"kubernetes.io/projected/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-kube-api-access-lvbzs\") pod \"certified-operators-xwfb7\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:09 crc kubenswrapper[4885]: I0308 21:31:09.595282 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:10 crc kubenswrapper[4885]: I0308 21:31:10.099544 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xwfb7"] Mar 08 21:31:10 crc kubenswrapper[4885]: I0308 21:31:10.630325 4885 generic.go:334] "Generic (PLEG): container finished" podID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerID="ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1" exitCode=0 Mar 08 21:31:10 crc kubenswrapper[4885]: I0308 21:31:10.630406 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwfb7" event={"ID":"f11798c6-04c3-4e9d-a01b-998f5e3c1e93","Type":"ContainerDied","Data":"ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1"} Mar 08 21:31:10 crc kubenswrapper[4885]: I0308 21:31:10.630748 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwfb7" event={"ID":"f11798c6-04c3-4e9d-a01b-998f5e3c1e93","Type":"ContainerStarted","Data":"6d3789bc5062d4a5f67bb288294750b1cd9355abc2ea912909328e88879e17f3"} Mar 08 21:31:10 crc kubenswrapper[4885]: I0308 21:31:10.634977 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:31:11 crc kubenswrapper[4885]: I0308 21:31:11.643477 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwfb7" event={"ID":"f11798c6-04c3-4e9d-a01b-998f5e3c1e93","Type":"ContainerStarted","Data":"5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a"} Mar 08 21:31:13 crc kubenswrapper[4885]: I0308 21:31:13.664525 4885 generic.go:334] "Generic (PLEG): container finished" podID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerID="5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a" exitCode=0 Mar 08 21:31:13 crc kubenswrapper[4885]: I0308 21:31:13.664620 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwfb7" event={"ID":"f11798c6-04c3-4e9d-a01b-998f5e3c1e93","Type":"ContainerDied","Data":"5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a"} Mar 08 21:31:14 crc kubenswrapper[4885]: I0308 21:31:14.687366 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwfb7" event={"ID":"f11798c6-04c3-4e9d-a01b-998f5e3c1e93","Type":"ContainerStarted","Data":"b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298"} Mar 08 21:31:14 crc kubenswrapper[4885]: I0308 21:31:14.718006 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xwfb7" podStartSLOduration=2.16202467 podStartE2EDuration="5.717986405s" podCreationTimestamp="2026-03-08 21:31:09 +0000 UTC" firstStartedPulling="2026-03-08 21:31:10.634410921 +0000 UTC m=+7172.030464984" lastFinishedPulling="2026-03-08 21:31:14.190372666 +0000 UTC m=+7175.586426719" observedRunningTime="2026-03-08 21:31:14.70952472 +0000 UTC m=+7176.105578773" watchObservedRunningTime="2026-03-08 
21:31:14.717986405 +0000 UTC m=+7176.114040438" Mar 08 21:31:19 crc kubenswrapper[4885]: I0308 21:31:19.595847 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:19 crc kubenswrapper[4885]: I0308 21:31:19.596483 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:19 crc kubenswrapper[4885]: I0308 21:31:19.667502 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:19 crc kubenswrapper[4885]: I0308 21:31:19.819544 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:19 crc kubenswrapper[4885]: I0308 21:31:19.911514 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xwfb7"] Mar 08 21:31:21 crc kubenswrapper[4885]: I0308 21:31:21.763837 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xwfb7" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="registry-server" containerID="cri-o://b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298" gracePeriod=2 Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.324309 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.360036 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvbzs\" (UniqueName: \"kubernetes.io/projected/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-kube-api-access-lvbzs\") pod \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.360175 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-utilities\") pod \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.360210 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-catalog-content\") pod \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\" (UID: \"f11798c6-04c3-4e9d-a01b-998f5e3c1e93\") " Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.366230 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-utilities" (OuterVolumeSpecName: "utilities") pod "f11798c6-04c3-4e9d-a01b-998f5e3c1e93" (UID: "f11798c6-04c3-4e9d-a01b-998f5e3c1e93"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.373409 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-kube-api-access-lvbzs" (OuterVolumeSpecName: "kube-api-access-lvbzs") pod "f11798c6-04c3-4e9d-a01b-998f5e3c1e93" (UID: "f11798c6-04c3-4e9d-a01b-998f5e3c1e93"). InnerVolumeSpecName "kube-api-access-lvbzs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.420082 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f11798c6-04c3-4e9d-a01b-998f5e3c1e93" (UID: "f11798c6-04c3-4e9d-a01b-998f5e3c1e93"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.462675 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvbzs\" (UniqueName: \"kubernetes.io/projected/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-kube-api-access-lvbzs\") on node \"crc\" DevicePath \"\"" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.462712 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.462797 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f11798c6-04c3-4e9d-a01b-998f5e3c1e93-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.780556 4885 generic.go:334] "Generic (PLEG): container finished" podID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerID="b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298" exitCode=0 Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.780913 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwfb7" event={"ID":"f11798c6-04c3-4e9d-a01b-998f5e3c1e93","Type":"ContainerDied","Data":"b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298"} Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.780990 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xwfb7" event={"ID":"f11798c6-04c3-4e9d-a01b-998f5e3c1e93","Type":"ContainerDied","Data":"6d3789bc5062d4a5f67bb288294750b1cd9355abc2ea912909328e88879e17f3"} Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.781019 4885 scope.go:117] "RemoveContainer" containerID="b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.781221 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xwfb7" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.805176 4885 scope.go:117] "RemoveContainer" containerID="5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.854304 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xwfb7"] Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.862510 4885 scope.go:117] "RemoveContainer" containerID="ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.885505 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xwfb7"] Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.909597 4885 scope.go:117] "RemoveContainer" containerID="b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298" Mar 08 21:31:22 crc kubenswrapper[4885]: E0308 21:31:22.910004 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298\": container with ID starting with b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298 not found: ID does not exist" containerID="b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.910043 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298"} err="failed to get container status \"b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298\": rpc error: code = NotFound desc = could not find container \"b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298\": container with ID starting with b026acbbe04a21372dbef108b0334d364c1cd8b1b282473542821628bc335298 not found: ID does not exist" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.910065 4885 scope.go:117] "RemoveContainer" containerID="5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a" Mar 08 21:31:22 crc kubenswrapper[4885]: E0308 21:31:22.910363 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a\": container with ID starting with 5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a not found: ID does not exist" containerID="5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.910413 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a"} err="failed to get container status \"5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a\": rpc error: code = NotFound desc = could not find container \"5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a\": container with ID starting with 5496f3e9b628537474bb1a27fbd234023b4892bb962ef298d56c1a402cdd318a not found: ID does not exist" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.910444 4885 scope.go:117] "RemoveContainer" containerID="ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1" Mar 08 21:31:22 crc kubenswrapper[4885]: E0308 21:31:22.910726 4885 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1\": container with ID starting with ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1 not found: ID does not exist" containerID="ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1" Mar 08 21:31:22 crc kubenswrapper[4885]: I0308 21:31:22.910752 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1"} err="failed to get container status \"ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1\": rpc error: code = NotFound desc = could not find container \"ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1\": container with ID starting with ca029b358196b93ec422e6ffb6c05160962fa2a5299cc3f5510a4247261702e1 not found: ID does not exist" Mar 08 21:31:23 crc kubenswrapper[4885]: I0308 21:31:23.385865 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" path="/var/lib/kubelet/pods/f11798c6-04c3-4e9d-a01b-998f5e3c1e93/volumes" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.163132 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550092-kjpvt"] Mar 08 21:32:00 crc kubenswrapper[4885]: E0308 21:32:00.163883 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="extract-utilities" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.163894 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="extract-utilities" Mar 08 21:32:00 crc kubenswrapper[4885]: E0308 21:32:00.163938 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="extract-content" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.163944 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="extract-content" Mar 08 21:32:00 crc kubenswrapper[4885]: E0308 21:32:00.163958 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="registry-server" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.163963 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="registry-server" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.164151 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11798c6-04c3-4e9d-a01b-998f5e3c1e93" containerName="registry-server" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.164812 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.168087 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.168254 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.168509 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.183239 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550092-kjpvt"] Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.253490 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24b7n\" (UniqueName: \"kubernetes.io/projected/244b80f2-9a2b-4db4-a451-086baed68f2a-kube-api-access-24b7n\") pod \"auto-csr-approver-29550092-kjpvt\" (UID: \"244b80f2-9a2b-4db4-a451-086baed68f2a\") " pod="openshift-infra/auto-csr-approver-29550092-kjpvt" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.355944 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24b7n\" (UniqueName: \"kubernetes.io/projected/244b80f2-9a2b-4db4-a451-086baed68f2a-kube-api-access-24b7n\") pod \"auto-csr-approver-29550092-kjpvt\" (UID: \"244b80f2-9a2b-4db4-a451-086baed68f2a\") " pod="openshift-infra/auto-csr-approver-29550092-kjpvt" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.373354 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24b7n\" (UniqueName: \"kubernetes.io/projected/244b80f2-9a2b-4db4-a451-086baed68f2a-kube-api-access-24b7n\") pod \"auto-csr-approver-29550092-kjpvt\" (UID: \"244b80f2-9a2b-4db4-a451-086baed68f2a\") " pod="openshift-infra/auto-csr-approver-29550092-kjpvt" Mar 08 21:32:00 crc kubenswrapper[4885]: I0308 21:32:00.483855 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" Mar 08 21:32:01 crc kubenswrapper[4885]: I0308 21:32:01.013872 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550092-kjpvt"] Mar 08 21:32:01 crc kubenswrapper[4885]: I0308 21:32:01.316700 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" event={"ID":"244b80f2-9a2b-4db4-a451-086baed68f2a","Type":"ContainerStarted","Data":"1dfa212bca25c37e1b5db777ec7ce620194a8ed2afe9236e6a4d1b3837b2e6a4"} Mar 08 21:32:02 crc kubenswrapper[4885]: I0308 21:32:02.328026 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" event={"ID":"244b80f2-9a2b-4db4-a451-086baed68f2a","Type":"ContainerStarted","Data":"3d584615f7c68fc963e45c7150d91d00d64b9ed657e10cd4826322e66a7ec964"} Mar 08 21:32:02 crc kubenswrapper[4885]: I0308 21:32:02.358895 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" podStartSLOduration=1.427598881 podStartE2EDuration="2.358875797s" podCreationTimestamp="2026-03-08 21:32:00 +0000 UTC" firstStartedPulling="2026-03-08 21:32:01.02719945 +0000 UTC m=+7222.423253513" lastFinishedPulling="2026-03-08 21:32:01.958476366 +0000 UTC m=+7223.354530429" observedRunningTime="2026-03-08 21:32:02.347259698 +0000 UTC m=+7223.743313731" watchObservedRunningTime="2026-03-08 21:32:02.358875797 +0000 UTC m=+7223.754929830" Mar 08 21:32:03 crc kubenswrapper[4885]: I0308 21:32:03.351315 4885 generic.go:334] "Generic (PLEG): container finished" podID="244b80f2-9a2b-4db4-a451-086baed68f2a" containerID="3d584615f7c68fc963e45c7150d91d00d64b9ed657e10cd4826322e66a7ec964" exitCode=0 Mar 08 21:32:03 crc kubenswrapper[4885]: I0308 21:32:03.351454 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" event={"ID":"244b80f2-9a2b-4db4-a451-086baed68f2a","Type":"ContainerDied","Data":"3d584615f7c68fc963e45c7150d91d00d64b9ed657e10cd4826322e66a7ec964"} Mar 08 21:32:04 crc kubenswrapper[4885]: I0308 21:32:04.776373 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" Mar 08 21:32:04 crc kubenswrapper[4885]: I0308 21:32:04.868746 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24b7n\" (UniqueName: \"kubernetes.io/projected/244b80f2-9a2b-4db4-a451-086baed68f2a-kube-api-access-24b7n\") pod \"244b80f2-9a2b-4db4-a451-086baed68f2a\" (UID: \"244b80f2-9a2b-4db4-a451-086baed68f2a\") " Mar 08 21:32:04 crc kubenswrapper[4885]: I0308 21:32:04.880192 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/244b80f2-9a2b-4db4-a451-086baed68f2a-kube-api-access-24b7n" (OuterVolumeSpecName: "kube-api-access-24b7n") pod "244b80f2-9a2b-4db4-a451-086baed68f2a" (UID: "244b80f2-9a2b-4db4-a451-086baed68f2a"). InnerVolumeSpecName "kube-api-access-24b7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:32:04 crc kubenswrapper[4885]: I0308 21:32:04.971519 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24b7n\" (UniqueName: \"kubernetes.io/projected/244b80f2-9a2b-4db4-a451-086baed68f2a-kube-api-access-24b7n\") on node \"crc\" DevicePath \"\"" Mar 08 21:32:05 crc kubenswrapper[4885]: I0308 21:32:05.390408 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" event={"ID":"244b80f2-9a2b-4db4-a451-086baed68f2a","Type":"ContainerDied","Data":"1dfa212bca25c37e1b5db777ec7ce620194a8ed2afe9236e6a4d1b3837b2e6a4"} Mar 08 21:32:05 crc kubenswrapper[4885]: I0308 21:32:05.390748 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dfa212bca25c37e1b5db777ec7ce620194a8ed2afe9236e6a4d1b3837b2e6a4" Mar 08 21:32:05 crc kubenswrapper[4885]: I0308 21:32:05.390844 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550092-kjpvt" Mar 08 21:32:05 crc kubenswrapper[4885]: I0308 21:32:05.436765 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550086-vhcf6"] Mar 08 21:32:05 crc kubenswrapper[4885]: I0308 21:32:05.449801 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550086-vhcf6"] Mar 08 21:32:07 crc kubenswrapper[4885]: I0308 21:32:07.400284 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5e490c3-347d-4c4b-aa26-3f680e0bebc0" path="/var/lib/kubelet/pods/e5e490c3-347d-4c4b-aa26-3f680e0bebc0/volumes" Mar 08 21:32:12 crc kubenswrapper[4885]: I0308 21:32:12.052505 4885 scope.go:117] "RemoveContainer" containerID="a2343f387ee044683e4b5c10c184262a0a1ddcb6bceb9b6768e6cef7d9c4c637" Mar 08 21:32:30 crc kubenswrapper[4885]: I0308 21:32:30.705631 4885 generic.go:334] "Generic (PLEG): container finished" podID="51b71742-3986-42a4-a016-eeecb3a7ba16" containerID="406060aa77426fd158b89e1d5655fcca0b66ca3902f98a8c32bb3f98e32a6fac" exitCode=0 Mar 08 21:32:30 crc kubenswrapper[4885]: I0308 21:32:30.705700 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" event={"ID":"51b71742-3986-42a4-a016-eeecb3a7ba16","Type":"ContainerDied","Data":"406060aa77426fd158b89e1d5655fcca0b66ca3902f98a8c32bb3f98e32a6fac"} Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.259655 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.389716 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-bootstrap-combined-ca-bundle\") pod \"51b71742-3986-42a4-a016-eeecb3a7ba16\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.389769 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ceph\") pod \"51b71742-3986-42a4-a016-eeecb3a7ba16\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.389819 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh6cn\" (UniqueName: \"kubernetes.io/projected/51b71742-3986-42a4-a016-eeecb3a7ba16-kube-api-access-gh6cn\") pod \"51b71742-3986-42a4-a016-eeecb3a7ba16\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.389984 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-inventory\") pod \"51b71742-3986-42a4-a016-eeecb3a7ba16\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.390026 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ssh-key-openstack-cell1\") pod \"51b71742-3986-42a4-a016-eeecb3a7ba16\" (UID: \"51b71742-3986-42a4-a016-eeecb3a7ba16\") " Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.398206 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ceph" (OuterVolumeSpecName: "ceph") pod "51b71742-3986-42a4-a016-eeecb3a7ba16" (UID: "51b71742-3986-42a4-a016-eeecb3a7ba16"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.399182 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51b71742-3986-42a4-a016-eeecb3a7ba16-kube-api-access-gh6cn" (OuterVolumeSpecName: "kube-api-access-gh6cn") pod "51b71742-3986-42a4-a016-eeecb3a7ba16" (UID: "51b71742-3986-42a4-a016-eeecb3a7ba16"). InnerVolumeSpecName "kube-api-access-gh6cn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.399347 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "51b71742-3986-42a4-a016-eeecb3a7ba16" (UID: "51b71742-3986-42a4-a016-eeecb3a7ba16"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.421954 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-inventory" (OuterVolumeSpecName: "inventory") pod "51b71742-3986-42a4-a016-eeecb3a7ba16" (UID: "51b71742-3986-42a4-a016-eeecb3a7ba16"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.444093 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "51b71742-3986-42a4-a016-eeecb3a7ba16" (UID: "51b71742-3986-42a4-a016-eeecb3a7ba16"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.492780 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.492835 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh6cn\" (UniqueName: \"kubernetes.io/projected/51b71742-3986-42a4-a016-eeecb3a7ba16-kube-api-access-gh6cn\") on node \"crc\" DevicePath \"\"" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.492854 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.492872 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.492889 4885 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51b71742-3986-42a4-a016-eeecb3a7ba16-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.736470 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" event={"ID":"51b71742-3986-42a4-a016-eeecb3a7ba16","Type":"ContainerDied","Data":"402a266dd7afc4a5f19cb1569e07873933799091a2ea37f980ac79de9b2acced"} Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.736810 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="402a266dd7afc4a5f19cb1569e07873933799091a2ea37f980ac79de9b2acced" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.736566 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-bcvmz" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.818109 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.818205 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.865198 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-wl2gp"] Mar 08 21:32:32 crc kubenswrapper[4885]: E0308 21:32:32.866010 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="244b80f2-9a2b-4db4-a451-086baed68f2a" containerName="oc" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.866040 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="244b80f2-9a2b-4db4-a451-086baed68f2a" containerName="oc" Mar 08 21:32:32 crc kubenswrapper[4885]: E0308 21:32:32.866087 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51b71742-3986-42a4-a016-eeecb3a7ba16" containerName="bootstrap-openstack-openstack-cell1" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.866104 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="51b71742-3986-42a4-a016-eeecb3a7ba16" containerName="bootstrap-openstack-openstack-cell1" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.866550 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="244b80f2-9a2b-4db4-a451-086baed68f2a" containerName="oc" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.866586 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="51b71742-3986-42a4-a016-eeecb3a7ba16" containerName="bootstrap-openstack-openstack-cell1" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.867887 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.871399 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.871804 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.871868 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.872604 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-wl2gp"] Mar 08 21:32:32 crc kubenswrapper[4885]: I0308 21:32:32.872770 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.011823 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmwsg\" (UniqueName: \"kubernetes.io/projected/d2786842-7b37-4e0c-843e-9dc4467df6ad-kube-api-access-vmwsg\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.012213 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.012391 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-inventory\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.012523 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ceph\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.114674 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmwsg\" (UniqueName: \"kubernetes.io/projected/d2786842-7b37-4e0c-843e-9dc4467df6ad-kube-api-access-vmwsg\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.114795 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " 
pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.114860 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-inventory\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.114905 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ceph\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.121187 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ceph\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.121344 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ssh-key-openstack-cell1\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.126405 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-inventory\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.141101 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmwsg\" (UniqueName: \"kubernetes.io/projected/d2786842-7b37-4e0c-843e-9dc4467df6ad-kube-api-access-vmwsg\") pod \"download-cache-openstack-openstack-cell1-wl2gp\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.198068 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:32:33 crc kubenswrapper[4885]: I0308 21:32:33.869540 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-wl2gp"] Mar 08 21:32:34 crc kubenswrapper[4885]: I0308 21:32:34.761428 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" event={"ID":"d2786842-7b37-4e0c-843e-9dc4467df6ad","Type":"ContainerStarted","Data":"f102d4cb9581980afc0e918a1eb63d77a02bb60102eaf5747b80166ea030a1cd"} Mar 08 21:32:34 crc kubenswrapper[4885]: I0308 21:32:34.762031 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" event={"ID":"d2786842-7b37-4e0c-843e-9dc4467df6ad","Type":"ContainerStarted","Data":"8cd15d3c2a11100671ae9112dd774068c60e76586455679373e108e318186d7e"} Mar 08 21:32:34 crc kubenswrapper[4885]: I0308 21:32:34.783896 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" podStartSLOduration=2.250114322 podStartE2EDuration="2.783874815s" podCreationTimestamp="2026-03-08 21:32:32 +0000 UTC" firstStartedPulling="2026-03-08 21:32:33.879991096 +0000 UTC m=+7255.276045129" lastFinishedPulling="2026-03-08 21:32:34.413751559 +0000 UTC m=+7255.809805622" observedRunningTime="2026-03-08 21:32:34.780772563 +0000 UTC m=+7256.176826596" watchObservedRunningTime="2026-03-08 21:32:34.783874815 +0000 UTC m=+7256.179928848" Mar 08 21:33:02 crc kubenswrapper[4885]: I0308 21:33:02.817944 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:33:02 crc kubenswrapper[4885]: I0308 21:33:02.818512 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:33:32 crc kubenswrapper[4885]: I0308 21:33:32.818838 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:33:32 crc kubenswrapper[4885]: I0308 21:33:32.819687 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:33:32 crc kubenswrapper[4885]: I0308 21:33:32.819860 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:33:32 crc kubenswrapper[4885]: I0308 21:33:32.825310 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eb4693f4eeb79088711f27b4882bee725d38950ce75255766be3668fb258c672"} 
pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:33:32 crc kubenswrapper[4885]: I0308 21:33:32.825438 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://eb4693f4eeb79088711f27b4882bee725d38950ce75255766be3668fb258c672" gracePeriod=600 Mar 08 21:33:33 crc kubenswrapper[4885]: I0308 21:33:33.479772 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="eb4693f4eeb79088711f27b4882bee725d38950ce75255766be3668fb258c672" exitCode=0 Mar 08 21:33:33 crc kubenswrapper[4885]: I0308 21:33:33.479854 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"eb4693f4eeb79088711f27b4882bee725d38950ce75255766be3668fb258c672"} Mar 08 21:33:33 crc kubenswrapper[4885]: I0308 21:33:33.480241 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9"} Mar 08 21:33:33 crc kubenswrapper[4885]: I0308 21:33:33.480266 4885 scope.go:117] "RemoveContainer" containerID="59687b50893668877c273da1c041aa8099577b12c5413bc5577e88b7ce130937" Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.147574 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550094-mjnvj"] Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.150671 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550094-mjnvj" Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.154714 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.154831 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.154841 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.159963 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550094-mjnvj"] Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.241228 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzp7n\" (UniqueName: \"kubernetes.io/projected/c2a860a0-84ad-49b7-8596-05521c33108a-kube-api-access-xzp7n\") pod \"auto-csr-approver-29550094-mjnvj\" (UID: \"c2a860a0-84ad-49b7-8596-05521c33108a\") " pod="openshift-infra/auto-csr-approver-29550094-mjnvj" Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.343811 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzp7n\" (UniqueName: \"kubernetes.io/projected/c2a860a0-84ad-49b7-8596-05521c33108a-kube-api-access-xzp7n\") pod \"auto-csr-approver-29550094-mjnvj\" (UID: \"c2a860a0-84ad-49b7-8596-05521c33108a\") " pod="openshift-infra/auto-csr-approver-29550094-mjnvj" Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.374240 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzp7n\" (UniqueName: \"kubernetes.io/projected/c2a860a0-84ad-49b7-8596-05521c33108a-kube-api-access-xzp7n\") pod \"auto-csr-approver-29550094-mjnvj\" (UID: \"c2a860a0-84ad-49b7-8596-05521c33108a\") " pod="openshift-infra/auto-csr-approver-29550094-mjnvj" Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.484569 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550094-mjnvj" Mar 08 21:34:00 crc kubenswrapper[4885]: W0308 21:34:00.965730 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2a860a0_84ad_49b7_8596_05521c33108a.slice/crio-5c74c261032bd4fbd89427e7e6da688552739a6abdf84a63ee0998c914ed2f7a WatchSource:0}: Error finding container 5c74c261032bd4fbd89427e7e6da688552739a6abdf84a63ee0998c914ed2f7a: Status 404 returned error can't find the container with id 5c74c261032bd4fbd89427e7e6da688552739a6abdf84a63ee0998c914ed2f7a Mar 08 21:34:00 crc kubenswrapper[4885]: I0308 21:34:00.967143 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550094-mjnvj"] Mar 08 21:34:01 crc kubenswrapper[4885]: I0308 21:34:01.913198 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550094-mjnvj" event={"ID":"c2a860a0-84ad-49b7-8596-05521c33108a","Type":"ContainerStarted","Data":"5c74c261032bd4fbd89427e7e6da688552739a6abdf84a63ee0998c914ed2f7a"} Mar 08 21:34:02 crc kubenswrapper[4885]: I0308 21:34:02.929756 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550094-mjnvj" event={"ID":"c2a860a0-84ad-49b7-8596-05521c33108a","Type":"ContainerStarted","Data":"6d31a6020ea44ed51ad167034dfe4175ea1c3055421ddefd4060ab7f5195dfd9"} Mar 08 21:34:02 crc kubenswrapper[4885]: I0308 21:34:02.965645 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550094-mjnvj" podStartSLOduration=1.6333635480000002 podStartE2EDuration="2.965615244s" podCreationTimestamp="2026-03-08 21:34:00 +0000 UTC" firstStartedPulling="2026-03-08 21:34:00.972393367 +0000 UTC m=+7342.368447390" lastFinishedPulling="2026-03-08 21:34:02.304645033 +0000 UTC m=+7343.700699086" observedRunningTime="2026-03-08 21:34:02.946683569 +0000 UTC m=+7344.342737622" watchObservedRunningTime="2026-03-08 21:34:02.965615244 +0000 UTC m=+7344.361669307" Mar 08 21:34:03 crc kubenswrapper[4885]: I0308 21:34:03.949001 4885 generic.go:334] "Generic (PLEG): container finished" podID="c2a860a0-84ad-49b7-8596-05521c33108a" containerID="6d31a6020ea44ed51ad167034dfe4175ea1c3055421ddefd4060ab7f5195dfd9" exitCode=0 Mar 08 21:34:03 crc kubenswrapper[4885]: I0308 21:34:03.949310 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550094-mjnvj" event={"ID":"c2a860a0-84ad-49b7-8596-05521c33108a","Type":"ContainerDied","Data":"6d31a6020ea44ed51ad167034dfe4175ea1c3055421ddefd4060ab7f5195dfd9"} Mar 08 21:34:05 crc kubenswrapper[4885]: I0308 21:34:05.382068 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550094-mjnvj" Mar 08 21:34:05 crc kubenswrapper[4885]: I0308 21:34:05.499315 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzp7n\" (UniqueName: \"kubernetes.io/projected/c2a860a0-84ad-49b7-8596-05521c33108a-kube-api-access-xzp7n\") pod \"c2a860a0-84ad-49b7-8596-05521c33108a\" (UID: \"c2a860a0-84ad-49b7-8596-05521c33108a\") " Mar 08 21:34:05 crc kubenswrapper[4885]: I0308 21:34:05.504830 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a860a0-84ad-49b7-8596-05521c33108a-kube-api-access-xzp7n" (OuterVolumeSpecName: "kube-api-access-xzp7n") pod "c2a860a0-84ad-49b7-8596-05521c33108a" (UID: "c2a860a0-84ad-49b7-8596-05521c33108a"). InnerVolumeSpecName "kube-api-access-xzp7n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:34:05 crc kubenswrapper[4885]: I0308 21:34:05.602982 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzp7n\" (UniqueName: \"kubernetes.io/projected/c2a860a0-84ad-49b7-8596-05521c33108a-kube-api-access-xzp7n\") on node \"crc\" DevicePath \"\"" Mar 08 21:34:06 crc kubenswrapper[4885]: I0308 21:34:06.012444 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550094-mjnvj" event={"ID":"c2a860a0-84ad-49b7-8596-05521c33108a","Type":"ContainerDied","Data":"5c74c261032bd4fbd89427e7e6da688552739a6abdf84a63ee0998c914ed2f7a"} Mar 08 21:34:06 crc kubenswrapper[4885]: I0308 21:34:06.012495 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c74c261032bd4fbd89427e7e6da688552739a6abdf84a63ee0998c914ed2f7a" Mar 08 21:34:06 crc kubenswrapper[4885]: I0308 21:34:06.012524 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550094-mjnvj" Mar 08 21:34:06 crc kubenswrapper[4885]: I0308 21:34:06.048516 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550088-xzfvm"] Mar 08 21:34:06 crc kubenswrapper[4885]: I0308 21:34:06.054844 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550088-xzfvm"] Mar 08 21:34:07 crc kubenswrapper[4885]: I0308 21:34:07.381174 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec092c86-e0c8-415e-bfe7-80914fe8ce5b" path="/var/lib/kubelet/pods/ec092c86-e0c8-415e-bfe7-80914fe8ce5b/volumes" Mar 08 21:34:12 crc kubenswrapper[4885]: I0308 21:34:12.187982 4885 scope.go:117] "RemoveContainer" containerID="38151e84b4b7428445936ba0b4e7f51f9bdc2be5ab2ec1353c272510a10895bd" Mar 08 21:34:42 crc kubenswrapper[4885]: I0308 21:34:42.438301 4885 generic.go:334] "Generic (PLEG): container finished" podID="d2786842-7b37-4e0c-843e-9dc4467df6ad" containerID="f102d4cb9581980afc0e918a1eb63d77a02bb60102eaf5747b80166ea030a1cd" exitCode=0 Mar 08 21:34:42 crc kubenswrapper[4885]: I0308 21:34:42.438589 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" event={"ID":"d2786842-7b37-4e0c-843e-9dc4467df6ad","Type":"ContainerDied","Data":"f102d4cb9581980afc0e918a1eb63d77a02bb60102eaf5747b80166ea030a1cd"} Mar 08 21:34:43 crc kubenswrapper[4885]: I0308 21:34:43.921470 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.023521 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmwsg\" (UniqueName: \"kubernetes.io/projected/d2786842-7b37-4e0c-843e-9dc4467df6ad-kube-api-access-vmwsg\") pod \"d2786842-7b37-4e0c-843e-9dc4467df6ad\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.023591 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ceph\") pod \"d2786842-7b37-4e0c-843e-9dc4467df6ad\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.023635 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ssh-key-openstack-cell1\") pod \"d2786842-7b37-4e0c-843e-9dc4467df6ad\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.023672 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-inventory\") pod \"d2786842-7b37-4e0c-843e-9dc4467df6ad\" (UID: \"d2786842-7b37-4e0c-843e-9dc4467df6ad\") " Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.034209 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ceph" (OuterVolumeSpecName: "ceph") pod "d2786842-7b37-4e0c-843e-9dc4467df6ad" (UID: "d2786842-7b37-4e0c-843e-9dc4467df6ad"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.034354 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2786842-7b37-4e0c-843e-9dc4467df6ad-kube-api-access-vmwsg" (OuterVolumeSpecName: "kube-api-access-vmwsg") pod "d2786842-7b37-4e0c-843e-9dc4467df6ad" (UID: "d2786842-7b37-4e0c-843e-9dc4467df6ad"). InnerVolumeSpecName "kube-api-access-vmwsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.069527 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d2786842-7b37-4e0c-843e-9dc4467df6ad" (UID: "d2786842-7b37-4e0c-843e-9dc4467df6ad"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.074136 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-inventory" (OuterVolumeSpecName: "inventory") pod "d2786842-7b37-4e0c-843e-9dc4467df6ad" (UID: "d2786842-7b37-4e0c-843e-9dc4467df6ad"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.125961 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.125998 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.126013 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d2786842-7b37-4e0c-843e-9dc4467df6ad-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.126025 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmwsg\" (UniqueName: \"kubernetes.io/projected/d2786842-7b37-4e0c-843e-9dc4467df6ad-kube-api-access-vmwsg\") on node \"crc\" DevicePath \"\"" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.465246 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" event={"ID":"d2786842-7b37-4e0c-843e-9dc4467df6ad","Type":"ContainerDied","Data":"8cd15d3c2a11100671ae9112dd774068c60e76586455679373e108e318186d7e"} Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.465295 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-wl2gp" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.465310 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cd15d3c2a11100671ae9112dd774068c60e76586455679373e108e318186d7e" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.577069 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-s5bq4"] Mar 08 21:34:44 crc kubenswrapper[4885]: E0308 21:34:44.577789 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2a860a0-84ad-49b7-8596-05521c33108a" containerName="oc" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.577810 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a860a0-84ad-49b7-8596-05521c33108a" containerName="oc" Mar 08 21:34:44 crc kubenswrapper[4885]: E0308 21:34:44.577836 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2786842-7b37-4e0c-843e-9dc4467df6ad" containerName="download-cache-openstack-openstack-cell1" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.577845 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2786842-7b37-4e0c-843e-9dc4467df6ad" containerName="download-cache-openstack-openstack-cell1" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.578247 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2a860a0-84ad-49b7-8596-05521c33108a" containerName="oc" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.578267 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2786842-7b37-4e0c-843e-9dc4467df6ad" containerName="download-cache-openstack-openstack-cell1" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.579134 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.582937 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.583129 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.583298 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.583440 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.603531 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-s5bq4"] Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.638616 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ceph\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.638706 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-inventory\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.638864 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh2jb\" (UniqueName: \"kubernetes.io/projected/cd7ac915-62c8-4d95-96a3-899c245e685c-kube-api-access-dh2jb\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.638962 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.740528 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh2jb\" (UniqueName: \"kubernetes.io/projected/cd7ac915-62c8-4d95-96a3-899c245e685c-kube-api-access-dh2jb\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.740658 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: 
\"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.740729 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ceph\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.740823 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-inventory\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.745892 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-inventory\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.746278 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ceph\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.746633 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ssh-key-openstack-cell1\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.760749 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh2jb\" (UniqueName: \"kubernetes.io/projected/cd7ac915-62c8-4d95-96a3-899c245e685c-kube-api-access-dh2jb\") pod \"configure-network-openstack-openstack-cell1-s5bq4\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:34:44 crc kubenswrapper[4885]: I0308 21:34:44.906514 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:34:45 crc kubenswrapper[4885]: I0308 21:34:45.554859 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-s5bq4"] Mar 08 21:34:46 crc kubenswrapper[4885]: I0308 21:34:46.109085 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:34:46 crc kubenswrapper[4885]: I0308 21:34:46.524889 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" event={"ID":"cd7ac915-62c8-4d95-96a3-899c245e685c","Type":"ContainerStarted","Data":"f1906982c8fca289eb830ed971294d6cad07ce24b9b4743a626905c8cd333806"} Mar 08 21:34:46 crc kubenswrapper[4885]: I0308 21:34:46.525208 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" event={"ID":"cd7ac915-62c8-4d95-96a3-899c245e685c","Type":"ContainerStarted","Data":"e287346a42ab649f4a5a092a71c3a58750abbf25d49cc2934c8a59d2497b54ac"} Mar 08 21:34:46 crc kubenswrapper[4885]: I0308 21:34:46.557600 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" podStartSLOduration=2.017231939 podStartE2EDuration="2.557582003s" podCreationTimestamp="2026-03-08 21:34:44 +0000 UTC" firstStartedPulling="2026-03-08 21:34:45.564507174 +0000 UTC m=+7386.960561197" lastFinishedPulling="2026-03-08 21:34:46.104857198 +0000 UTC m=+7387.500911261" observedRunningTime="2026-03-08 21:34:46.556505885 +0000 UTC m=+7387.952559918" watchObservedRunningTime="2026-03-08 21:34:46.557582003 +0000 UTC m=+7387.953636026" Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.146480 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550096-4s5wc"] Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.148362 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550096-4s5wc" Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.150261 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.152453 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.152465 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.159973 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550096-4s5wc"] Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.241693 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tswn9\" (UniqueName: \"kubernetes.io/projected/44e60165-e38f-4fbe-87a1-5908598e0e38-kube-api-access-tswn9\") pod \"auto-csr-approver-29550096-4s5wc\" (UID: \"44e60165-e38f-4fbe-87a1-5908598e0e38\") " pod="openshift-infra/auto-csr-approver-29550096-4s5wc" Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.343354 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tswn9\" (UniqueName: \"kubernetes.io/projected/44e60165-e38f-4fbe-87a1-5908598e0e38-kube-api-access-tswn9\") pod \"auto-csr-approver-29550096-4s5wc\" (UID: \"44e60165-e38f-4fbe-87a1-5908598e0e38\") " pod="openshift-infra/auto-csr-approver-29550096-4s5wc" Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.364598 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tswn9\" (UniqueName: \"kubernetes.io/projected/44e60165-e38f-4fbe-87a1-5908598e0e38-kube-api-access-tswn9\") pod \"auto-csr-approver-29550096-4s5wc\" (UID: \"44e60165-e38f-4fbe-87a1-5908598e0e38\") " pod="openshift-infra/auto-csr-approver-29550096-4s5wc" Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.474095 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550096-4s5wc" Mar 08 21:36:00 crc kubenswrapper[4885]: I0308 21:36:00.992521 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550096-4s5wc"] Mar 08 21:36:01 crc kubenswrapper[4885]: I0308 21:36:01.408076 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550096-4s5wc" event={"ID":"44e60165-e38f-4fbe-87a1-5908598e0e38","Type":"ContainerStarted","Data":"3a98b6be408a0fe9e49da3722d528e5b8bf6f4d3016abf8e17fdd885370ac5e8"} Mar 08 21:36:02 crc kubenswrapper[4885]: I0308 21:36:02.421667 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550096-4s5wc" event={"ID":"44e60165-e38f-4fbe-87a1-5908598e0e38","Type":"ContainerStarted","Data":"59a022bee7d69812163e1296ed21c2217e23eb0a10b094d9fb3faabfbcba446f"} Mar 08 21:36:02 crc kubenswrapper[4885]: I0308 21:36:02.439968 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550096-4s5wc" podStartSLOduration=1.5337660130000002 podStartE2EDuration="2.439948205s" podCreationTimestamp="2026-03-08 21:36:00 +0000 UTC" firstStartedPulling="2026-03-08 21:36:00.993492682 +0000 UTC m=+7462.389546745" lastFinishedPulling="2026-03-08 21:36:01.899674904 +0000 UTC m=+7463.295728937" observedRunningTime="2026-03-08 21:36:02.43525746 +0000 UTC m=+7463.831311503" watchObservedRunningTime="2026-03-08 21:36:02.439948205 +0000 UTC m=+7463.836002238" Mar 08 21:36:02 crc kubenswrapper[4885]: I0308 21:36:02.818065 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:36:02 crc kubenswrapper[4885]: I0308 21:36:02.818162 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:36:03 crc kubenswrapper[4885]: I0308 21:36:03.436967 4885 generic.go:334] "Generic (PLEG): container finished" podID="44e60165-e38f-4fbe-87a1-5908598e0e38" containerID="59a022bee7d69812163e1296ed21c2217e23eb0a10b094d9fb3faabfbcba446f" exitCode=0 Mar 08 21:36:03 crc kubenswrapper[4885]: I0308 21:36:03.437029 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550096-4s5wc" event={"ID":"44e60165-e38f-4fbe-87a1-5908598e0e38","Type":"ContainerDied","Data":"59a022bee7d69812163e1296ed21c2217e23eb0a10b094d9fb3faabfbcba446f"} Mar 08 21:36:04 crc kubenswrapper[4885]: I0308 21:36:04.977275 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550096-4s5wc" Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.069811 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tswn9\" (UniqueName: \"kubernetes.io/projected/44e60165-e38f-4fbe-87a1-5908598e0e38-kube-api-access-tswn9\") pod \"44e60165-e38f-4fbe-87a1-5908598e0e38\" (UID: \"44e60165-e38f-4fbe-87a1-5908598e0e38\") " Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.075762 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44e60165-e38f-4fbe-87a1-5908598e0e38-kube-api-access-tswn9" (OuterVolumeSpecName: "kube-api-access-tswn9") pod "44e60165-e38f-4fbe-87a1-5908598e0e38" (UID: "44e60165-e38f-4fbe-87a1-5908598e0e38"). InnerVolumeSpecName "kube-api-access-tswn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.172469 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tswn9\" (UniqueName: \"kubernetes.io/projected/44e60165-e38f-4fbe-87a1-5908598e0e38-kube-api-access-tswn9\") on node \"crc\" DevicePath \"\"" Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.460414 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550096-4s5wc" event={"ID":"44e60165-e38f-4fbe-87a1-5908598e0e38","Type":"ContainerDied","Data":"3a98b6be408a0fe9e49da3722d528e5b8bf6f4d3016abf8e17fdd885370ac5e8"} Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.460464 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a98b6be408a0fe9e49da3722d528e5b8bf6f4d3016abf8e17fdd885370ac5e8" Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.460515 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550096-4s5wc" Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.535104 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550090-hww59"] Mar 08 21:36:05 crc kubenswrapper[4885]: I0308 21:36:05.548690 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550090-hww59"] Mar 08 21:36:06 crc kubenswrapper[4885]: I0308 21:36:06.475225 4885 generic.go:334] "Generic (PLEG): container finished" podID="cd7ac915-62c8-4d95-96a3-899c245e685c" containerID="f1906982c8fca289eb830ed971294d6cad07ce24b9b4743a626905c8cd333806" exitCode=0 Mar 08 21:36:06 crc kubenswrapper[4885]: I0308 21:36:06.475283 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" event={"ID":"cd7ac915-62c8-4d95-96a3-899c245e685c","Type":"ContainerDied","Data":"f1906982c8fca289eb830ed971294d6cad07ce24b9b4743a626905c8cd333806"} Mar 08 21:36:07 crc kubenswrapper[4885]: I0308 21:36:07.389090 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab193869-7f8c-4475-8be4-393848bd54e3" path="/var/lib/kubelet/pods/ab193869-7f8c-4475-8be4-393848bd54e3/volumes" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.065849 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.246463 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ssh-key-openstack-cell1\") pod \"cd7ac915-62c8-4d95-96a3-899c245e685c\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.246575 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-inventory\") pod \"cd7ac915-62c8-4d95-96a3-899c245e685c\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.246638 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ceph\") pod \"cd7ac915-62c8-4d95-96a3-899c245e685c\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.247390 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dh2jb\" (UniqueName: \"kubernetes.io/projected/cd7ac915-62c8-4d95-96a3-899c245e685c-kube-api-access-dh2jb\") pod \"cd7ac915-62c8-4d95-96a3-899c245e685c\" (UID: \"cd7ac915-62c8-4d95-96a3-899c245e685c\") " Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.253061 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd7ac915-62c8-4d95-96a3-899c245e685c-kube-api-access-dh2jb" (OuterVolumeSpecName: "kube-api-access-dh2jb") pod "cd7ac915-62c8-4d95-96a3-899c245e685c" (UID: "cd7ac915-62c8-4d95-96a3-899c245e685c"). InnerVolumeSpecName "kube-api-access-dh2jb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.253295 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ceph" (OuterVolumeSpecName: "ceph") pod "cd7ac915-62c8-4d95-96a3-899c245e685c" (UID: "cd7ac915-62c8-4d95-96a3-899c245e685c"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.290239 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "cd7ac915-62c8-4d95-96a3-899c245e685c" (UID: "cd7ac915-62c8-4d95-96a3-899c245e685c"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.301400 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-inventory" (OuterVolumeSpecName: "inventory") pod "cd7ac915-62c8-4d95-96a3-899c245e685c" (UID: "cd7ac915-62c8-4d95-96a3-899c245e685c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.349611 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dh2jb\" (UniqueName: \"kubernetes.io/projected/cd7ac915-62c8-4d95-96a3-899c245e685c-kube-api-access-dh2jb\") on node \"crc\" DevicePath \"\"" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.349641 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.349650 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.349659 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cd7ac915-62c8-4d95-96a3-899c245e685c-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.497455 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" event={"ID":"cd7ac915-62c8-4d95-96a3-899c245e685c","Type":"ContainerDied","Data":"e287346a42ab649f4a5a092a71c3a58750abbf25d49cc2934c8a59d2497b54ac"} Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.497496 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e287346a42ab649f4a5a092a71c3a58750abbf25d49cc2934c8a59d2497b54ac" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.497549 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-s5bq4" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.597680 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-lcz5l"] Mar 08 21:36:08 crc kubenswrapper[4885]: E0308 21:36:08.609961 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44e60165-e38f-4fbe-87a1-5908598e0e38" containerName="oc" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.609991 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="44e60165-e38f-4fbe-87a1-5908598e0e38" containerName="oc" Mar 08 21:36:08 crc kubenswrapper[4885]: E0308 21:36:08.610012 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd7ac915-62c8-4d95-96a3-899c245e685c" containerName="configure-network-openstack-openstack-cell1" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.610021 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7ac915-62c8-4d95-96a3-899c245e685c" containerName="configure-network-openstack-openstack-cell1" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.610406 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd7ac915-62c8-4d95-96a3-899c245e685c" containerName="configure-network-openstack-openstack-cell1" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.610428 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="44e60165-e38f-4fbe-87a1-5908598e0e38" containerName="oc" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.611284 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-lcz5l"] Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.611377 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.615056 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.615357 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.615490 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.615591 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.762506 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-inventory\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.762739 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.762774 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grmfz\" (UniqueName: \"kubernetes.io/projected/df77d68a-3570-49fb-958b-c358543e661f-kube-api-access-grmfz\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.763014 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ceph\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.865994 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.866050 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grmfz\" (UniqueName: \"kubernetes.io/projected/df77d68a-3570-49fb-958b-c358543e661f-kube-api-access-grmfz\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.866110 4885 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ceph\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.866217 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-inventory\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.873258 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ssh-key-openstack-cell1\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.875019 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ceph\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.891368 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-inventory\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.896089 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grmfz\" (UniqueName: \"kubernetes.io/projected/df77d68a-3570-49fb-958b-c358543e661f-kube-api-access-grmfz\") pod \"validate-network-openstack-openstack-cell1-lcz5l\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:08 crc kubenswrapper[4885]: I0308 21:36:08.953548 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:09 crc kubenswrapper[4885]: I0308 21:36:09.553328 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-lcz5l"] Mar 08 21:36:10 crc kubenswrapper[4885]: I0308 21:36:10.521503 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" event={"ID":"df77d68a-3570-49fb-958b-c358543e661f","Type":"ContainerStarted","Data":"304da361b0a033def3dd5828f9ff9b635b383deac93423f4fbd44a861033ac2e"} Mar 08 21:36:10 crc kubenswrapper[4885]: I0308 21:36:10.521959 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" event={"ID":"df77d68a-3570-49fb-958b-c358543e661f","Type":"ContainerStarted","Data":"a529fc78229bb701bf7db7c1f844ed80a6801a69c68f6af881483d9a5e7c0552"} Mar 08 21:36:10 crc kubenswrapper[4885]: I0308 21:36:10.548041 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" podStartSLOduration=2.107426309 podStartE2EDuration="2.548021221s" podCreationTimestamp="2026-03-08 21:36:08 +0000 UTC" firstStartedPulling="2026-03-08 21:36:09.555084386 +0000 UTC m=+7470.951138419" lastFinishedPulling="2026-03-08 21:36:09.995679268 +0000 UTC m=+7471.391733331" observedRunningTime="2026-03-08 21:36:10.538304212 +0000 UTC m=+7471.934358255" watchObservedRunningTime="2026-03-08 21:36:10.548021221 +0000 UTC m=+7471.944075254" Mar 08 21:36:12 crc kubenswrapper[4885]: I0308 21:36:12.361580 4885 scope.go:117] "RemoveContainer" containerID="7eee2ffb4dea4a4b434fae8ad567627bf150b9abb9d76f55cab57ee721350700" Mar 08 21:36:15 crc kubenswrapper[4885]: I0308 21:36:15.583747 4885 generic.go:334] "Generic (PLEG): container finished" podID="df77d68a-3570-49fb-958b-c358543e661f" containerID="304da361b0a033def3dd5828f9ff9b635b383deac93423f4fbd44a861033ac2e" exitCode=0 Mar 08 21:36:15 crc kubenswrapper[4885]: I0308 21:36:15.583881 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" event={"ID":"df77d68a-3570-49fb-958b-c358543e661f","Type":"ContainerDied","Data":"304da361b0a033def3dd5828f9ff9b635b383deac93423f4fbd44a861033ac2e"} Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.110734 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.167907 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ceph\") pod \"df77d68a-3570-49fb-958b-c358543e661f\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.168016 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ssh-key-openstack-cell1\") pod \"df77d68a-3570-49fb-958b-c358543e661f\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.168039 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grmfz\" (UniqueName: \"kubernetes.io/projected/df77d68a-3570-49fb-958b-c358543e661f-kube-api-access-grmfz\") pod \"df77d68a-3570-49fb-958b-c358543e661f\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.168142 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-inventory\") pod \"df77d68a-3570-49fb-958b-c358543e661f\" (UID: \"df77d68a-3570-49fb-958b-c358543e661f\") " Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.174059 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df77d68a-3570-49fb-958b-c358543e661f-kube-api-access-grmfz" (OuterVolumeSpecName: "kube-api-access-grmfz") pod "df77d68a-3570-49fb-958b-c358543e661f" (UID: "df77d68a-3570-49fb-958b-c358543e661f"). InnerVolumeSpecName "kube-api-access-grmfz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.175155 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ceph" (OuterVolumeSpecName: "ceph") pod "df77d68a-3570-49fb-958b-c358543e661f" (UID: "df77d68a-3570-49fb-958b-c358543e661f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.203735 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "df77d68a-3570-49fb-958b-c358543e661f" (UID: "df77d68a-3570-49fb-958b-c358543e661f"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.206283 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-inventory" (OuterVolumeSpecName: "inventory") pod "df77d68a-3570-49fb-958b-c358543e661f" (UID: "df77d68a-3570-49fb-958b-c358543e661f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.274081 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.274132 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.274173 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grmfz\" (UniqueName: \"kubernetes.io/projected/df77d68a-3570-49fb-958b-c358543e661f-kube-api-access-grmfz\") on node \"crc\" DevicePath \"\"" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.274185 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/df77d68a-3570-49fb-958b-c358543e661f-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.601517 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" event={"ID":"df77d68a-3570-49fb-958b-c358543e661f","Type":"ContainerDied","Data":"a529fc78229bb701bf7db7c1f844ed80a6801a69c68f6af881483d9a5e7c0552"} Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.601899 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a529fc78229bb701bf7db7c1f844ed80a6801a69c68f6af881483d9a5e7c0552" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.601599 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-lcz5l" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.692165 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-mj45h"] Mar 08 21:36:17 crc kubenswrapper[4885]: E0308 21:36:17.692997 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df77d68a-3570-49fb-958b-c358543e661f" containerName="validate-network-openstack-openstack-cell1" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.693030 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="df77d68a-3570-49fb-958b-c358543e661f" containerName="validate-network-openstack-openstack-cell1" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.693393 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="df77d68a-3570-49fb-958b-c358543e661f" containerName="validate-network-openstack-openstack-cell1" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.694660 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.696827 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.697507 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.703071 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-mj45h"] Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.704719 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.704758 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.783943 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-inventory\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.784133 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jlv6\" (UniqueName: \"kubernetes.io/projected/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-kube-api-access-6jlv6\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.784255 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.784364 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ceph\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.886070 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jlv6\" (UniqueName: \"kubernetes.io/projected/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-kube-api-access-6jlv6\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.886253 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " 
pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.886375 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ceph\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.886506 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-inventory\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.896039 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ceph\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.896283 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ssh-key-openstack-cell1\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.896457 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-inventory\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:17 crc kubenswrapper[4885]: I0308 21:36:17.907098 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jlv6\" (UniqueName: \"kubernetes.io/projected/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-kube-api-access-6jlv6\") pod \"install-os-openstack-openstack-cell1-mj45h\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:18 crc kubenswrapper[4885]: I0308 21:36:18.018600 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:36:18 crc kubenswrapper[4885]: I0308 21:36:18.626589 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-mj45h"] Mar 08 21:36:18 crc kubenswrapper[4885]: I0308 21:36:18.639062 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:36:19 crc kubenswrapper[4885]: I0308 21:36:19.631793 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-mj45h" event={"ID":"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4","Type":"ContainerStarted","Data":"240748009c7ed362d0f083a1ca76726a574cb7394b24e3c3af6ec8092554a1a3"} Mar 08 21:36:19 crc kubenswrapper[4885]: I0308 21:36:19.632523 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-mj45h" event={"ID":"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4","Type":"ContainerStarted","Data":"1a4b98e79973daa361c60c2a7c32c23c681788d67729efc27ee5b7535067e98d"} Mar 08 21:36:19 crc kubenswrapper[4885]: I0308 21:36:19.659079 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-mj45h" podStartSLOduration=2.239729067 podStartE2EDuration="2.659064112s" podCreationTimestamp="2026-03-08 21:36:17 +0000 UTC" firstStartedPulling="2026-03-08 21:36:18.638501179 +0000 UTC m=+7480.034555242" lastFinishedPulling="2026-03-08 21:36:19.057836264 +0000 UTC m=+7480.453890287" observedRunningTime="2026-03-08 21:36:19.658063655 +0000 UTC m=+7481.054117708" watchObservedRunningTime="2026-03-08 21:36:19.659064112 +0000 UTC m=+7481.055118125" Mar 08 21:36:32 crc kubenswrapper[4885]: I0308 21:36:32.819024 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:36:32 crc kubenswrapper[4885]: I0308 21:36:32.819689 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:37:02 crc kubenswrapper[4885]: I0308 21:37:02.818337 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:37:02 crc kubenswrapper[4885]: I0308 21:37:02.820608 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:37:02 crc kubenswrapper[4885]: I0308 21:37:02.820756 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:37:02 crc kubenswrapper[4885]: I0308 21:37:02.821901 4885 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:37:02 crc kubenswrapper[4885]: I0308 21:37:02.822145 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" gracePeriod=600 Mar 08 21:37:02 crc kubenswrapper[4885]: E0308 21:37:02.953513 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:37:03 crc kubenswrapper[4885]: I0308 21:37:03.216478 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" exitCode=0 Mar 08 21:37:03 crc kubenswrapper[4885]: I0308 21:37:03.216528 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9"} Mar 08 21:37:03 crc kubenswrapper[4885]: I0308 21:37:03.216566 4885 scope.go:117] "RemoveContainer" containerID="eb4693f4eeb79088711f27b4882bee725d38950ce75255766be3668fb258c672" Mar 08 21:37:03 crc kubenswrapper[4885]: I0308 21:37:03.217288 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:37:03 crc kubenswrapper[4885]: E0308 21:37:03.217616 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:37:05 crc kubenswrapper[4885]: I0308 21:37:05.254844 4885 generic.go:334] "Generic (PLEG): container finished" podID="dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" containerID="240748009c7ed362d0f083a1ca76726a574cb7394b24e3c3af6ec8092554a1a3" exitCode=0 Mar 08 21:37:05 crc kubenswrapper[4885]: I0308 21:37:05.255159 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-mj45h" event={"ID":"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4","Type":"ContainerDied","Data":"240748009c7ed362d0f083a1ca76726a574cb7394b24e3c3af6ec8092554a1a3"} Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.766653 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.865738 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jlv6\" (UniqueName: \"kubernetes.io/projected/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-kube-api-access-6jlv6\") pod \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.865873 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ceph\") pod \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.865956 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ssh-key-openstack-cell1\") pod \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.866065 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-inventory\") pod \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\" (UID: \"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4\") " Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.872008 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ceph" (OuterVolumeSpecName: "ceph") pod "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" (UID: "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.876193 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-kube-api-access-6jlv6" (OuterVolumeSpecName: "kube-api-access-6jlv6") pod "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" (UID: "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4"). InnerVolumeSpecName "kube-api-access-6jlv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.908614 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" (UID: "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.914660 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-inventory" (OuterVolumeSpecName: "inventory") pod "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" (UID: "dcf02d39-6fe8-40ae-bd31-b7d1a38103b4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.968195 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.968240 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jlv6\" (UniqueName: \"kubernetes.io/projected/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-kube-api-access-6jlv6\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.968262 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:06 crc kubenswrapper[4885]: I0308 21:37:06.968280 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/dcf02d39-6fe8-40ae-bd31-b7d1a38103b4-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.279286 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-mj45h" event={"ID":"dcf02d39-6fe8-40ae-bd31-b7d1a38103b4","Type":"ContainerDied","Data":"1a4b98e79973daa361c60c2a7c32c23c681788d67729efc27ee5b7535067e98d"} Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.279610 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a4b98e79973daa361c60c2a7c32c23c681788d67729efc27ee5b7535067e98d" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.279405 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-mj45h" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.382840 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-9wz6s"] Mar 08 21:37:07 crc kubenswrapper[4885]: E0308 21:37:07.383276 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" containerName="install-os-openstack-openstack-cell1" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.383298 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" containerName="install-os-openstack-openstack-cell1" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.383656 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcf02d39-6fe8-40ae-bd31-b7d1a38103b4" containerName="install-os-openstack-openstack-cell1" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.384768 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.387671 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.387857 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.388122 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.388299 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.390166 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-9wz6s"] Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.477571 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ceph\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.477668 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xt2l\" (UniqueName: \"kubernetes.io/projected/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-kube-api-access-2xt2l\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.477698 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.477878 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-inventory\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.581563 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-inventory\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.581839 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ceph\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc 
kubenswrapper[4885]: I0308 21:37:07.582002 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xt2l\" (UniqueName: \"kubernetes.io/projected/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-kube-api-access-2xt2l\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.582064 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.588563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ssh-key-openstack-cell1\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.588822 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-inventory\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.590152 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ceph\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.608964 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xt2l\" (UniqueName: \"kubernetes.io/projected/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-kube-api-access-2xt2l\") pod \"configure-os-openstack-openstack-cell1-9wz6s\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:07 crc kubenswrapper[4885]: I0308 21:37:07.709814 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:08 crc kubenswrapper[4885]: I0308 21:37:08.340264 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-9wz6s"] Mar 08 21:37:08 crc kubenswrapper[4885]: W0308 21:37:08.350048 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeaa3ffc5_e09f_48b4_96b2_e2454bfe6251.slice/crio-38a03ebb26b5e6f459e6e7ab1ca821ebb2a492439f5922819dbcd0846139fa16 WatchSource:0}: Error finding container 38a03ebb26b5e6f459e6e7ab1ca821ebb2a492439f5922819dbcd0846139fa16: Status 404 returned error can't find the container with id 38a03ebb26b5e6f459e6e7ab1ca821ebb2a492439f5922819dbcd0846139fa16 Mar 08 21:37:09 crc kubenswrapper[4885]: I0308 21:37:09.304544 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" event={"ID":"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251","Type":"ContainerStarted","Data":"2fc93de39b9fb91c25b50ddd230fffb16d182eadf38c5d9957013bd4c9846bf8"} Mar 08 21:37:09 crc kubenswrapper[4885]: I0308 21:37:09.305198 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" event={"ID":"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251","Type":"ContainerStarted","Data":"38a03ebb26b5e6f459e6e7ab1ca821ebb2a492439f5922819dbcd0846139fa16"} Mar 08 21:37:09 crc kubenswrapper[4885]: I0308 21:37:09.340100 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" podStartSLOduration=1.8077163619999999 podStartE2EDuration="2.340081852s" podCreationTimestamp="2026-03-08 21:37:07 +0000 UTC" firstStartedPulling="2026-03-08 21:37:08.356348012 +0000 UTC m=+7529.752402035" lastFinishedPulling="2026-03-08 21:37:08.888713482 +0000 UTC m=+7530.284767525" observedRunningTime="2026-03-08 21:37:09.334578065 +0000 UTC m=+7530.730632108" watchObservedRunningTime="2026-03-08 21:37:09.340081852 +0000 UTC m=+7530.736135875" Mar 08 21:37:18 crc kubenswrapper[4885]: I0308 21:37:18.369026 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:37:18 crc kubenswrapper[4885]: E0308 21:37:18.370231 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:37:33 crc kubenswrapper[4885]: I0308 21:37:33.369527 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:37:33 crc kubenswrapper[4885]: E0308 21:37:33.370814 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:37:48 crc kubenswrapper[4885]: I0308 21:37:48.371153 4885 scope.go:117] 
"RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:37:48 crc kubenswrapper[4885]: E0308 21:37:48.373634 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:37:56 crc kubenswrapper[4885]: I0308 21:37:56.908309 4885 generic.go:334] "Generic (PLEG): container finished" podID="eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" containerID="2fc93de39b9fb91c25b50ddd230fffb16d182eadf38c5d9957013bd4c9846bf8" exitCode=0 Mar 08 21:37:56 crc kubenswrapper[4885]: I0308 21:37:56.908423 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" event={"ID":"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251","Type":"ContainerDied","Data":"2fc93de39b9fb91c25b50ddd230fffb16d182eadf38c5d9957013bd4c9846bf8"} Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.516728 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.555611 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ssh-key-openstack-cell1\") pod \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.555668 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ceph\") pod \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.555937 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-inventory\") pod \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.555972 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xt2l\" (UniqueName: \"kubernetes.io/projected/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-kube-api-access-2xt2l\") pod \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\" (UID: \"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251\") " Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.561987 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ceph" (OuterVolumeSpecName: "ceph") pod "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" (UID: "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.566162 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-kube-api-access-2xt2l" (OuterVolumeSpecName: "kube-api-access-2xt2l") pod "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" (UID: "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251"). InnerVolumeSpecName "kube-api-access-2xt2l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.593387 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-inventory" (OuterVolumeSpecName: "inventory") pod "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" (UID: "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.610127 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" (UID: "eaa3ffc5-e09f-48b4-96b2-e2454bfe6251"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.659288 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.659324 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2xt2l\" (UniqueName: \"kubernetes.io/projected/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-kube-api-access-2xt2l\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.659339 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.659355 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eaa3ffc5-e09f-48b4-96b2-e2454bfe6251-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.952145 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" event={"ID":"eaa3ffc5-e09f-48b4-96b2-e2454bfe6251","Type":"ContainerDied","Data":"38a03ebb26b5e6f459e6e7ab1ca821ebb2a492439f5922819dbcd0846139fa16"} Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.952408 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38a03ebb26b5e6f459e6e7ab1ca821ebb2a492439f5922819dbcd0846139fa16" Mar 08 21:37:58 crc kubenswrapper[4885]: I0308 21:37:58.952538 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-9wz6s" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.063708 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-khtfv"] Mar 08 21:37:59 crc kubenswrapper[4885]: E0308 21:37:59.064241 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" containerName="configure-os-openstack-openstack-cell1" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.064259 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" containerName="configure-os-openstack-openstack-cell1" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.064457 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaa3ffc5-e09f-48b4-96b2-e2454bfe6251" containerName="configure-os-openstack-openstack-cell1" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.066496 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.070407 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.070656 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.075301 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.075608 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.079724 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-khtfv"] Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.167521 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-inventory-0\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.167570 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj6h7\" (UniqueName: \"kubernetes.io/projected/bef3d518-c413-4129-b022-dffb097239b2-kube-api-access-qj6h7\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.167599 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.167856 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ceph\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: 
\"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.269481 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ceph\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.269650 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-inventory-0\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.269678 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj6h7\" (UniqueName: \"kubernetes.io/projected/bef3d518-c413-4129-b022-dffb097239b2-kube-api-access-qj6h7\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.269716 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.275533 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.279143 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ceph\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.279312 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-inventory-0\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.285792 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj6h7\" (UniqueName: \"kubernetes.io/projected/bef3d518-c413-4129-b022-dffb097239b2-kube-api-access-qj6h7\") pod \"ssh-known-hosts-openstack-khtfv\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.383868 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:37:59 crc kubenswrapper[4885]: I0308 21:37:59.998410 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-khtfv"] Mar 08 21:38:00 crc kubenswrapper[4885]: W0308 21:38:00.010045 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbef3d518_c413_4129_b022_dffb097239b2.slice/crio-4fa4901239fe0e57e4aa7c79357533b49db6f3b44573dee11b31455903ea65ea WatchSource:0}: Error finding container 4fa4901239fe0e57e4aa7c79357533b49db6f3b44573dee11b31455903ea65ea: Status 404 returned error can't find the container with id 4fa4901239fe0e57e4aa7c79357533b49db6f3b44573dee11b31455903ea65ea Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.132890 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550098-47g92"] Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.134768 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550098-47g92" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.137105 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.137417 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.137503 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.148528 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550098-47g92"] Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.195838 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j9qs\" (UniqueName: \"kubernetes.io/projected/67c51bcd-c065-4fa7-8318-0d0704836166-kube-api-access-4j9qs\") pod \"auto-csr-approver-29550098-47g92\" (UID: \"67c51bcd-c065-4fa7-8318-0d0704836166\") " pod="openshift-infra/auto-csr-approver-29550098-47g92" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.299262 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j9qs\" (UniqueName: \"kubernetes.io/projected/67c51bcd-c065-4fa7-8318-0d0704836166-kube-api-access-4j9qs\") pod \"auto-csr-approver-29550098-47g92\" (UID: \"67c51bcd-c065-4fa7-8318-0d0704836166\") " pod="openshift-infra/auto-csr-approver-29550098-47g92" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.320145 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j9qs\" (UniqueName: \"kubernetes.io/projected/67c51bcd-c065-4fa7-8318-0d0704836166-kube-api-access-4j9qs\") pod \"auto-csr-approver-29550098-47g92\" (UID: \"67c51bcd-c065-4fa7-8318-0d0704836166\") " pod="openshift-infra/auto-csr-approver-29550098-47g92" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.368398 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:38:00 crc kubenswrapper[4885]: E0308 21:38:00.368667 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.457480 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550098-47g92" Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.997252 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-khtfv" event={"ID":"bef3d518-c413-4129-b022-dffb097239b2","Type":"ContainerStarted","Data":"afa5cae2f58904151725f1848fae62debdefd513cc156b71cc3e5d0e9576488b"} Mar 08 21:38:00 crc kubenswrapper[4885]: I0308 21:38:00.997916 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-khtfv" event={"ID":"bef3d518-c413-4129-b022-dffb097239b2","Type":"ContainerStarted","Data":"4fa4901239fe0e57e4aa7c79357533b49db6f3b44573dee11b31455903ea65ea"} Mar 08 21:38:01 crc kubenswrapper[4885]: I0308 21:38:01.022434 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-khtfv" podStartSLOduration=1.533455183 podStartE2EDuration="2.022414216s" podCreationTimestamp="2026-03-08 21:37:59 +0000 UTC" firstStartedPulling="2026-03-08 21:38:00.014163531 +0000 UTC m=+7581.410217564" lastFinishedPulling="2026-03-08 21:38:00.503122534 +0000 UTC m=+7581.899176597" observedRunningTime="2026-03-08 21:38:01.018755068 +0000 UTC m=+7582.414809101" watchObservedRunningTime="2026-03-08 21:38:01.022414216 +0000 UTC m=+7582.418468249" Mar 08 21:38:01 crc kubenswrapper[4885]: W0308 21:38:01.046543 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67c51bcd_c065_4fa7_8318_0d0704836166.slice/crio-1737b733db2d3d7457590dacd470eeb7f3f50a374831f5a96f49f8ba2b522742 WatchSource:0}: Error finding container 1737b733db2d3d7457590dacd470eeb7f3f50a374831f5a96f49f8ba2b522742: Status 404 returned error can't find the container with id 1737b733db2d3d7457590dacd470eeb7f3f50a374831f5a96f49f8ba2b522742 Mar 08 21:38:01 crc kubenswrapper[4885]: I0308 21:38:01.049645 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550098-47g92"] Mar 08 21:38:02 crc kubenswrapper[4885]: I0308 21:38:02.008195 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550098-47g92" event={"ID":"67c51bcd-c065-4fa7-8318-0d0704836166","Type":"ContainerStarted","Data":"1737b733db2d3d7457590dacd470eeb7f3f50a374831f5a96f49f8ba2b522742"} Mar 08 21:38:03 crc kubenswrapper[4885]: I0308 21:38:03.020968 4885 generic.go:334] "Generic (PLEG): container finished" podID="67c51bcd-c065-4fa7-8318-0d0704836166" containerID="2fc2dee49966150d464450c5304d2011d968fb7949c03e2bf89d92f9c82630c7" exitCode=0 Mar 08 21:38:03 crc kubenswrapper[4885]: I0308 21:38:03.021112 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550098-47g92" event={"ID":"67c51bcd-c065-4fa7-8318-0d0704836166","Type":"ContainerDied","Data":"2fc2dee49966150d464450c5304d2011d968fb7949c03e2bf89d92f9c82630c7"} Mar 08 21:38:04 crc kubenswrapper[4885]: I0308 21:38:04.454133 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550098-47g92" Mar 08 21:38:04 crc kubenswrapper[4885]: I0308 21:38:04.510762 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4j9qs\" (UniqueName: \"kubernetes.io/projected/67c51bcd-c065-4fa7-8318-0d0704836166-kube-api-access-4j9qs\") pod \"67c51bcd-c065-4fa7-8318-0d0704836166\" (UID: \"67c51bcd-c065-4fa7-8318-0d0704836166\") " Mar 08 21:38:04 crc kubenswrapper[4885]: I0308 21:38:04.517203 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67c51bcd-c065-4fa7-8318-0d0704836166-kube-api-access-4j9qs" (OuterVolumeSpecName: "kube-api-access-4j9qs") pod "67c51bcd-c065-4fa7-8318-0d0704836166" (UID: "67c51bcd-c065-4fa7-8318-0d0704836166"). InnerVolumeSpecName "kube-api-access-4j9qs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:38:04 crc kubenswrapper[4885]: I0308 21:38:04.613737 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4j9qs\" (UniqueName: \"kubernetes.io/projected/67c51bcd-c065-4fa7-8318-0d0704836166-kube-api-access-4j9qs\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:05 crc kubenswrapper[4885]: I0308 21:38:05.044481 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550098-47g92" event={"ID":"67c51bcd-c065-4fa7-8318-0d0704836166","Type":"ContainerDied","Data":"1737b733db2d3d7457590dacd470eeb7f3f50a374831f5a96f49f8ba2b522742"} Mar 08 21:38:05 crc kubenswrapper[4885]: I0308 21:38:05.044563 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1737b733db2d3d7457590dacd470eeb7f3f50a374831f5a96f49f8ba2b522742" Mar 08 21:38:05 crc kubenswrapper[4885]: I0308 21:38:05.044570 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550098-47g92" Mar 08 21:38:05 crc kubenswrapper[4885]: I0308 21:38:05.552425 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550092-kjpvt"] Mar 08 21:38:05 crc kubenswrapper[4885]: I0308 21:38:05.562301 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550092-kjpvt"] Mar 08 21:38:07 crc kubenswrapper[4885]: I0308 21:38:07.397820 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="244b80f2-9a2b-4db4-a451-086baed68f2a" path="/var/lib/kubelet/pods/244b80f2-9a2b-4db4-a451-086baed68f2a/volumes" Mar 08 21:38:10 crc kubenswrapper[4885]: I0308 21:38:10.098126 4885 generic.go:334] "Generic (PLEG): container finished" podID="bef3d518-c413-4129-b022-dffb097239b2" containerID="afa5cae2f58904151725f1848fae62debdefd513cc156b71cc3e5d0e9576488b" exitCode=0 Mar 08 21:38:10 crc kubenswrapper[4885]: I0308 21:38:10.098224 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-khtfv" event={"ID":"bef3d518-c413-4129-b022-dffb097239b2","Type":"ContainerDied","Data":"afa5cae2f58904151725f1848fae62debdefd513cc156b71cc3e5d0e9576488b"} Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.629139 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.686067 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-inventory-0\") pod \"bef3d518-c413-4129-b022-dffb097239b2\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.686268 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ssh-key-openstack-cell1\") pod \"bef3d518-c413-4129-b022-dffb097239b2\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.686497 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj6h7\" (UniqueName: \"kubernetes.io/projected/bef3d518-c413-4129-b022-dffb097239b2-kube-api-access-qj6h7\") pod \"bef3d518-c413-4129-b022-dffb097239b2\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.686558 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ceph\") pod \"bef3d518-c413-4129-b022-dffb097239b2\" (UID: \"bef3d518-c413-4129-b022-dffb097239b2\") " Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.692364 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ceph" (OuterVolumeSpecName: "ceph") pod "bef3d518-c413-4129-b022-dffb097239b2" (UID: "bef3d518-c413-4129-b022-dffb097239b2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.706191 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bef3d518-c413-4129-b022-dffb097239b2-kube-api-access-qj6h7" (OuterVolumeSpecName: "kube-api-access-qj6h7") pod "bef3d518-c413-4129-b022-dffb097239b2" (UID: "bef3d518-c413-4129-b022-dffb097239b2"). InnerVolumeSpecName "kube-api-access-qj6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.721725 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "bef3d518-c413-4129-b022-dffb097239b2" (UID: "bef3d518-c413-4129-b022-dffb097239b2"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.733428 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "bef3d518-c413-4129-b022-dffb097239b2" (UID: "bef3d518-c413-4129-b022-dffb097239b2"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.792547 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj6h7\" (UniqueName: \"kubernetes.io/projected/bef3d518-c413-4129-b022-dffb097239b2-kube-api-access-qj6h7\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.792588 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.792602 4885 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:11 crc kubenswrapper[4885]: I0308 21:38:11.792615 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/bef3d518-c413-4129-b022-dffb097239b2-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.127724 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-khtfv" event={"ID":"bef3d518-c413-4129-b022-dffb097239b2","Type":"ContainerDied","Data":"4fa4901239fe0e57e4aa7c79357533b49db6f3b44573dee11b31455903ea65ea"} Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.128065 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fa4901239fe0e57e4aa7c79357533b49db6f3b44573dee11b31455903ea65ea" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.127843 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-khtfv" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.208582 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fg8lj"] Mar 08 21:38:12 crc kubenswrapper[4885]: E0308 21:38:12.209055 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67c51bcd-c065-4fa7-8318-0d0704836166" containerName="oc" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.209073 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="67c51bcd-c065-4fa7-8318-0d0704836166" containerName="oc" Mar 08 21:38:12 crc kubenswrapper[4885]: E0308 21:38:12.209115 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bef3d518-c413-4129-b022-dffb097239b2" containerName="ssh-known-hosts-openstack" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.209123 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef3d518-c413-4129-b022-dffb097239b2" containerName="ssh-known-hosts-openstack" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.209316 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bef3d518-c413-4129-b022-dffb097239b2" containerName="ssh-known-hosts-openstack" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.209352 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="67c51bcd-c065-4fa7-8318-0d0704836166" containerName="oc" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.210096 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.213823 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.213828 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.214485 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.215307 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.234869 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fg8lj"] Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.368199 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:38:12 crc kubenswrapper[4885]: E0308 21:38:12.368485 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.406451 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ceph\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.406560 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.406761 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztjs9\" (UniqueName: \"kubernetes.io/projected/53bb70ab-feea-49a2-9850-fc72a2e0f650-kube-api-access-ztjs9\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.407079 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-inventory\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.509070 4885 scope.go:117] "RemoveContainer" containerID="3d584615f7c68fc963e45c7150d91d00d64b9ed657e10cd4826322e66a7ec964" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 
21:38:12.510133 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ceph\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.510247 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.512432 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztjs9\" (UniqueName: \"kubernetes.io/projected/53bb70ab-feea-49a2-9850-fc72a2e0f650-kube-api-access-ztjs9\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.513083 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-inventory\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.516613 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ceph\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.517153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ssh-key-openstack-cell1\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.523399 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-inventory\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.541323 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztjs9\" (UniqueName: \"kubernetes.io/projected/53bb70ab-feea-49a2-9850-fc72a2e0f650-kube-api-access-ztjs9\") pod \"run-os-openstack-openstack-cell1-fg8lj\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:12 crc kubenswrapper[4885]: I0308 21:38:12.835461 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:13 crc kubenswrapper[4885]: I0308 21:38:13.498980 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fg8lj"] Mar 08 21:38:14 crc kubenswrapper[4885]: I0308 21:38:14.163662 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" event={"ID":"53bb70ab-feea-49a2-9850-fc72a2e0f650","Type":"ContainerStarted","Data":"1d18d479d41ae3fffb9ebe77aea5f15426176538191699b2053934f75f6d3be6"} Mar 08 21:38:15 crc kubenswrapper[4885]: I0308 21:38:15.178179 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" event={"ID":"53bb70ab-feea-49a2-9850-fc72a2e0f650","Type":"ContainerStarted","Data":"5cd8a7bb3282f36a39f24e29a6895e785eda2c152071f1591e21df14927a2b59"} Mar 08 21:38:15 crc kubenswrapper[4885]: I0308 21:38:15.204476 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" podStartSLOduration=2.674378032 podStartE2EDuration="3.2044426s" podCreationTimestamp="2026-03-08 21:38:12 +0000 UTC" firstStartedPulling="2026-03-08 21:38:13.506219642 +0000 UTC m=+7594.902273685" lastFinishedPulling="2026-03-08 21:38:14.03628424 +0000 UTC m=+7595.432338253" observedRunningTime="2026-03-08 21:38:15.199999091 +0000 UTC m=+7596.596053124" watchObservedRunningTime="2026-03-08 21:38:15.2044426 +0000 UTC m=+7596.600496673" Mar 08 21:38:22 crc kubenswrapper[4885]: I0308 21:38:22.306714 4885 generic.go:334] "Generic (PLEG): container finished" podID="53bb70ab-feea-49a2-9850-fc72a2e0f650" containerID="5cd8a7bb3282f36a39f24e29a6895e785eda2c152071f1591e21df14927a2b59" exitCode=0 Mar 08 21:38:22 crc kubenswrapper[4885]: I0308 21:38:22.307051 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" event={"ID":"53bb70ab-feea-49a2-9850-fc72a2e0f650","Type":"ContainerDied","Data":"5cd8a7bb3282f36a39f24e29a6895e785eda2c152071f1591e21df14927a2b59"} Mar 08 21:38:23 crc kubenswrapper[4885]: I0308 21:38:23.937224 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:23 crc kubenswrapper[4885]: I0308 21:38:23.995136 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-inventory\") pod \"53bb70ab-feea-49a2-9850-fc72a2e0f650\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " Mar 08 21:38:23 crc kubenswrapper[4885]: I0308 21:38:23.995365 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztjs9\" (UniqueName: \"kubernetes.io/projected/53bb70ab-feea-49a2-9850-fc72a2e0f650-kube-api-access-ztjs9\") pod \"53bb70ab-feea-49a2-9850-fc72a2e0f650\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " Mar 08 21:38:23 crc kubenswrapper[4885]: I0308 21:38:23.995681 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ssh-key-openstack-cell1\") pod \"53bb70ab-feea-49a2-9850-fc72a2e0f650\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " Mar 08 21:38:23 crc kubenswrapper[4885]: I0308 21:38:23.995723 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ceph\") pod \"53bb70ab-feea-49a2-9850-fc72a2e0f650\" (UID: \"53bb70ab-feea-49a2-9850-fc72a2e0f650\") " Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.004435 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ceph" (OuterVolumeSpecName: "ceph") pod "53bb70ab-feea-49a2-9850-fc72a2e0f650" (UID: "53bb70ab-feea-49a2-9850-fc72a2e0f650"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.008702 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53bb70ab-feea-49a2-9850-fc72a2e0f650-kube-api-access-ztjs9" (OuterVolumeSpecName: "kube-api-access-ztjs9") pod "53bb70ab-feea-49a2-9850-fc72a2e0f650" (UID: "53bb70ab-feea-49a2-9850-fc72a2e0f650"). InnerVolumeSpecName "kube-api-access-ztjs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.034528 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "53bb70ab-feea-49a2-9850-fc72a2e0f650" (UID: "53bb70ab-feea-49a2-9850-fc72a2e0f650"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.034690 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-inventory" (OuterVolumeSpecName: "inventory") pod "53bb70ab-feea-49a2-9850-fc72a2e0f650" (UID: "53bb70ab-feea-49a2-9850-fc72a2e0f650"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.098551 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.098587 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.098597 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/53bb70ab-feea-49a2-9850-fc72a2e0f650-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.098608 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztjs9\" (UniqueName: \"kubernetes.io/projected/53bb70ab-feea-49a2-9850-fc72a2e0f650-kube-api-access-ztjs9\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.335699 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" event={"ID":"53bb70ab-feea-49a2-9850-fc72a2e0f650","Type":"ContainerDied","Data":"1d18d479d41ae3fffb9ebe77aea5f15426176538191699b2053934f75f6d3be6"} Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.336052 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d18d479d41ae3fffb9ebe77aea5f15426176538191699b2053934f75f6d3be6" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.335793 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fg8lj" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.431854 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-k564w"] Mar 08 21:38:24 crc kubenswrapper[4885]: E0308 21:38:24.432887 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53bb70ab-feea-49a2-9850-fc72a2e0f650" containerName="run-os-openstack-openstack-cell1" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.432965 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="53bb70ab-feea-49a2-9850-fc72a2e0f650" containerName="run-os-openstack-openstack-cell1" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.433502 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="53bb70ab-feea-49a2-9850-fc72a2e0f650" containerName="run-os-openstack-openstack-cell1" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.435277 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.440615 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.444663 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.444686 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.448446 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-k564w"] Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.449540 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.508382 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ceph\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.508459 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-inventory\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.508501 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr2rl\" (UniqueName: \"kubernetes.io/projected/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-kube-api-access-vr2rl\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.508529 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.610807 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr2rl\" (UniqueName: \"kubernetes.io/projected/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-kube-api-access-vr2rl\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.610863 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 
21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.611154 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ceph\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.611246 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-inventory\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.616499 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ceph\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.619888 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ssh-key-openstack-cell1\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.625462 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-inventory\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.646546 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr2rl\" (UniqueName: \"kubernetes.io/projected/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-kube-api-access-vr2rl\") pod \"reboot-os-openstack-openstack-cell1-k564w\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:24 crc kubenswrapper[4885]: I0308 21:38:24.764494 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:25 crc kubenswrapper[4885]: I0308 21:38:25.348792 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-k564w"] Mar 08 21:38:25 crc kubenswrapper[4885]: W0308 21:38:25.355253 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf062e16_c6d1_4d3c_b0aa_ca00e9740bcb.slice/crio-969391d4ee2c0ef7d476a520a4efb215b45a35a16cc2f8c015679c2f571278d3 WatchSource:0}: Error finding container 969391d4ee2c0ef7d476a520a4efb215b45a35a16cc2f8c015679c2f571278d3: Status 404 returned error can't find the container with id 969391d4ee2c0ef7d476a520a4efb215b45a35a16cc2f8c015679c2f571278d3 Mar 08 21:38:25 crc kubenswrapper[4885]: I0308 21:38:25.369679 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:38:25 crc kubenswrapper[4885]: E0308 21:38:25.370098 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:38:26 crc kubenswrapper[4885]: I0308 21:38:26.364393 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" event={"ID":"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb","Type":"ContainerStarted","Data":"2867e8b1c59eb05f63d8e88e194178c1995075af4091bf7e0f00769e0787cb27"} Mar 08 21:38:26 crc kubenswrapper[4885]: I0308 21:38:26.365127 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" event={"ID":"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb","Type":"ContainerStarted","Data":"969391d4ee2c0ef7d476a520a4efb215b45a35a16cc2f8c015679c2f571278d3"} Mar 08 21:38:26 crc kubenswrapper[4885]: I0308 21:38:26.393770 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" podStartSLOduration=1.8649856919999999 podStartE2EDuration="2.393750676s" podCreationTimestamp="2026-03-08 21:38:24 +0000 UTC" firstStartedPulling="2026-03-08 21:38:25.360722021 +0000 UTC m=+7606.756776034" lastFinishedPulling="2026-03-08 21:38:25.889486995 +0000 UTC m=+7607.285541018" observedRunningTime="2026-03-08 21:38:26.386712138 +0000 UTC m=+7607.782766211" watchObservedRunningTime="2026-03-08 21:38:26.393750676 +0000 UTC m=+7607.789804699" Mar 08 21:38:37 crc kubenswrapper[4885]: I0308 21:38:37.369465 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:38:37 crc kubenswrapper[4885]: E0308 21:38:37.370353 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:38:42 crc kubenswrapper[4885]: I0308 21:38:42.562548 4885 generic.go:334] "Generic (PLEG): 
container finished" podID="cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" containerID="2867e8b1c59eb05f63d8e88e194178c1995075af4091bf7e0f00769e0787cb27" exitCode=0 Mar 08 21:38:42 crc kubenswrapper[4885]: I0308 21:38:42.562648 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" event={"ID":"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb","Type":"ContainerDied","Data":"2867e8b1c59eb05f63d8e88e194178c1995075af4091bf7e0f00769e0787cb27"} Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.262718 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.293742 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ceph\") pod \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.293851 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-inventory\") pod \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.293904 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ssh-key-openstack-cell1\") pod \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.294182 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr2rl\" (UniqueName: \"kubernetes.io/projected/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-kube-api-access-vr2rl\") pod \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\" (UID: \"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb\") " Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.300249 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ceph" (OuterVolumeSpecName: "ceph") pod "cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" (UID: "cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.300734 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-kube-api-access-vr2rl" (OuterVolumeSpecName: "kube-api-access-vr2rl") pod "cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" (UID: "cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb"). InnerVolumeSpecName "kube-api-access-vr2rl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.325438 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" (UID: "cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.338110 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-inventory" (OuterVolumeSpecName: "inventory") pod "cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" (UID: "cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.397389 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.397429 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.397442 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr2rl\" (UniqueName: \"kubernetes.io/projected/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-kube-api-access-vr2rl\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.397454 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.591136 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" event={"ID":"cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb","Type":"ContainerDied","Data":"969391d4ee2c0ef7d476a520a4efb215b45a35a16cc2f8c015679c2f571278d3"} Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.591179 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="969391d4ee2c0ef7d476a520a4efb215b45a35a16cc2f8c015679c2f571278d3" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.591186 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-k564w" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.708300 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-8t465"] Mar 08 21:38:44 crc kubenswrapper[4885]: E0308 21:38:44.709196 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" containerName="reboot-os-openstack-openstack-cell1" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.709218 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" containerName="reboot-os-openstack-openstack-cell1" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.709483 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb" containerName="reboot-os-openstack-openstack-cell1" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.710536 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.712823 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.713153 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.713308 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.713466 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.722793 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-8t465"] Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.811641 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bscmr\" (UniqueName: \"kubernetes.io/projected/125b54e2-cc1e-4a7f-83b6-1474e89bad11-kube-api-access-bscmr\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.811685 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.811711 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.812392 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.812517 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.812552 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.812586 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.812640 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ceph\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.812748 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.812964 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.813051 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.813104 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-inventory\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914227 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914274 4885 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914303 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914338 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ceph\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914381 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914433 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914463 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914482 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-inventory\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914535 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bscmr\" (UniqueName: \"kubernetes.io/projected/125b54e2-cc1e-4a7f-83b6-1474e89bad11-kube-api-access-bscmr\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914555 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914574 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.914596 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.920441 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.921458 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.921455 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ceph\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.922282 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.922789 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.922860 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.923240 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-inventory\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.924353 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.924614 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.925996 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.930900 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ssh-key-openstack-cell1\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:44 crc kubenswrapper[4885]: I0308 21:38:44.938977 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bscmr\" (UniqueName: \"kubernetes.io/projected/125b54e2-cc1e-4a7f-83b6-1474e89bad11-kube-api-access-bscmr\") pod \"install-certs-openstack-openstack-cell1-8t465\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:45 crc kubenswrapper[4885]: I0308 21:38:45.032535 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:38:45 crc kubenswrapper[4885]: I0308 21:38:45.041467 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:38:45 crc kubenswrapper[4885]: I0308 21:38:45.647539 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-8t465"] Mar 08 21:38:46 crc kubenswrapper[4885]: I0308 21:38:46.075448 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:38:46 crc kubenswrapper[4885]: I0308 21:38:46.621034 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8t465" event={"ID":"125b54e2-cc1e-4a7f-83b6-1474e89bad11","Type":"ContainerStarted","Data":"b19a08d9ecc12b68e8b0b0593cef1a99e9fb0b0ad20005abcd3260a1bc1db5a8"} Mar 08 21:38:46 crc kubenswrapper[4885]: I0308 21:38:46.621466 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8t465" event={"ID":"125b54e2-cc1e-4a7f-83b6-1474e89bad11","Type":"ContainerStarted","Data":"79c3cc2d0faf9b3d13a3d7ce0804ff729b53636e7a0b97cc80b2e52d48a4b2a6"} Mar 08 21:38:46 crc kubenswrapper[4885]: I0308 21:38:46.665472 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-8t465" podStartSLOduration=2.2464177579999998 podStartE2EDuration="2.665443826s" podCreationTimestamp="2026-03-08 21:38:44 +0000 UTC" firstStartedPulling="2026-03-08 21:38:45.65297546 +0000 UTC m=+7627.049029483" lastFinishedPulling="2026-03-08 21:38:46.072001478 +0000 UTC m=+7627.468055551" observedRunningTime="2026-03-08 21:38:46.652273895 +0000 UTC m=+7628.048327938" watchObservedRunningTime="2026-03-08 21:38:46.665443826 +0000 UTC m=+7628.061497879" Mar 08 21:38:50 crc kubenswrapper[4885]: I0308 21:38:50.368180 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:38:50 crc kubenswrapper[4885]: E0308 21:38:50.369126 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:39:01 crc kubenswrapper[4885]: I0308 21:39:01.368872 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:39:01 crc kubenswrapper[4885]: E0308 21:39:01.369811 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.322994 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7qwl7"] Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.326889 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.335693 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qwl7"] Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.463075 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-catalog-content\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.463515 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nwhf\" (UniqueName: \"kubernetes.io/projected/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-kube-api-access-8nwhf\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.463684 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-utilities\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.566086 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-catalog-content\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.566174 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nwhf\" (UniqueName: \"kubernetes.io/projected/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-kube-api-access-8nwhf\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.566237 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-utilities\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.566627 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-catalog-content\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.566680 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-utilities\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.586129 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8nwhf\" (UniqueName: \"kubernetes.io/projected/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-kube-api-access-8nwhf\") pod \"redhat-operators-7qwl7\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.689799 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.910475 4885 generic.go:334] "Generic (PLEG): container finished" podID="125b54e2-cc1e-4a7f-83b6-1474e89bad11" containerID="b19a08d9ecc12b68e8b0b0593cef1a99e9fb0b0ad20005abcd3260a1bc1db5a8" exitCode=0 Mar 08 21:39:06 crc kubenswrapper[4885]: I0308 21:39:06.910855 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8t465" event={"ID":"125b54e2-cc1e-4a7f-83b6-1474e89bad11","Type":"ContainerDied","Data":"b19a08d9ecc12b68e8b0b0593cef1a99e9fb0b0ad20005abcd3260a1bc1db5a8"} Mar 08 21:39:07 crc kubenswrapper[4885]: I0308 21:39:07.205949 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7qwl7"] Mar 08 21:39:07 crc kubenswrapper[4885]: I0308 21:39:07.921747 4885 generic.go:334] "Generic (PLEG): container finished" podID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerID="a423eca068764bdfe8800013c4a2fb15c30071d300eaa2bf147699572dd7d03e" exitCode=0 Mar 08 21:39:07 crc kubenswrapper[4885]: I0308 21:39:07.921820 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qwl7" event={"ID":"ebd3b0b8-3fd7-4386-b558-e083d8665e9e","Type":"ContainerDied","Data":"a423eca068764bdfe8800013c4a2fb15c30071d300eaa2bf147699572dd7d03e"} Mar 08 21:39:07 crc kubenswrapper[4885]: I0308 21:39:07.922450 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qwl7" event={"ID":"ebd3b0b8-3fd7-4386-b558-e083d8665e9e","Type":"ContainerStarted","Data":"00691bce0bb761767b57e77ad9015e8de45354bfaf62e2946b1fde8b97c48cf8"} Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.497603 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.538775 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-metadata-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.538822 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ssh-key-openstack-cell1\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.538856 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-telemetry-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.538889 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-libvirt-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.539060 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bscmr\" (UniqueName: \"kubernetes.io/projected/125b54e2-cc1e-4a7f-83b6-1474e89bad11-kube-api-access-bscmr\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.539129 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-sriov-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.539183 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ovn-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.539203 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-bootstrap-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.539233 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-nova-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.539253 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-inventory\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.539284 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ceph\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.539309 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-dhcp-combined-ca-bundle\") pod \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\" (UID: \"125b54e2-cc1e-4a7f-83b6-1474e89bad11\") " Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.544215 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.545753 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.552344 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.555604 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.556526 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ceph" (OuterVolumeSpecName: "ceph") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.556562 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/125b54e2-cc1e-4a7f-83b6-1474e89bad11-kube-api-access-bscmr" (OuterVolumeSpecName: "kube-api-access-bscmr") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "kube-api-access-bscmr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.559215 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.559744 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.560236 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.563502 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.603467 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-inventory" (OuterVolumeSpecName: "inventory") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643403 4885 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643442 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643451 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643463 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643475 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643485 4885 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643493 4885 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643503 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bscmr\" (UniqueName: \"kubernetes.io/projected/125b54e2-cc1e-4a7f-83b6-1474e89bad11-kube-api-access-bscmr\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643511 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643521 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.643530 4885 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.659865 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "125b54e2-cc1e-4a7f-83b6-1474e89bad11" (UID: "125b54e2-cc1e-4a7f-83b6-1474e89bad11"). 
InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.745486 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/125b54e2-cc1e-4a7f-83b6-1474e89bad11-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.938776 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qwl7" event={"ID":"ebd3b0b8-3fd7-4386-b558-e083d8665e9e","Type":"ContainerStarted","Data":"9e9db438534feaa447a73410606a84e2839386c0c4482c114e68bcc17de2fb97"} Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.941788 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-8t465" event={"ID":"125b54e2-cc1e-4a7f-83b6-1474e89bad11","Type":"ContainerDied","Data":"79c3cc2d0faf9b3d13a3d7ce0804ff729b53636e7a0b97cc80b2e52d48a4b2a6"} Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.941814 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79c3cc2d0faf9b3d13a3d7ce0804ff729b53636e7a0b97cc80b2e52d48a4b2a6" Mar 08 21:39:08 crc kubenswrapper[4885]: I0308 21:39:08.941854 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-8t465" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.042039 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-jr59q"] Mar 08 21:39:09 crc kubenswrapper[4885]: E0308 21:39:09.042682 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="125b54e2-cc1e-4a7f-83b6-1474e89bad11" containerName="install-certs-openstack-openstack-cell1" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.042783 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="125b54e2-cc1e-4a7f-83b6-1474e89bad11" containerName="install-certs-openstack-openstack-cell1" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.043081 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="125b54e2-cc1e-4a7f-83b6-1474e89bad11" containerName="install-certs-openstack-openstack-cell1" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.043863 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.053698 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.053937 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.054465 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.054583 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.062503 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-jr59q"] Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.156890 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ceph\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.157010 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.157071 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-inventory\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.157109 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-796ks\" (UniqueName: \"kubernetes.io/projected/9ef426ef-0010-4b6f-8b94-b45e726c2f02-kube-api-access-796ks\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.259705 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ceph\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.259827 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc 
kubenswrapper[4885]: I0308 21:39:09.259961 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-inventory\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.260027 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-796ks\" (UniqueName: \"kubernetes.io/projected/9ef426ef-0010-4b6f-8b94-b45e726c2f02-kube-api-access-796ks\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.263751 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ceph\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.264703 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ssh-key-openstack-cell1\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.265395 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-inventory\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.275690 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-796ks\" (UniqueName: \"kubernetes.io/projected/9ef426ef-0010-4b6f-8b94-b45e726c2f02-kube-api-access-796ks\") pod \"ceph-client-openstack-openstack-cell1-jr59q\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:09 crc kubenswrapper[4885]: I0308 21:39:09.366138 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:11 crc kubenswrapper[4885]: I0308 21:39:09.992664 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-jr59q"] Mar 08 21:39:11 crc kubenswrapper[4885]: I0308 21:39:10.966293 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" event={"ID":"9ef426ef-0010-4b6f-8b94-b45e726c2f02","Type":"ContainerStarted","Data":"549241e500a6e1c6bb8dd5a138d7b1eeefe0ecc38aa23303a9bfb803f38670ca"} Mar 08 21:39:11 crc kubenswrapper[4885]: I0308 21:39:11.997362 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" event={"ID":"9ef426ef-0010-4b6f-8b94-b45e726c2f02","Type":"ContainerStarted","Data":"81c147a1ae1e07d5d3e12c387e25ea5cc1f30efc8f47f3b7b07230af9ffa1c96"} Mar 08 21:39:12 crc kubenswrapper[4885]: I0308 21:39:12.026668 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" podStartSLOduration=1.7864100939999998 podStartE2EDuration="3.026645767s" podCreationTimestamp="2026-03-08 21:39:09 +0000 UTC" firstStartedPulling="2026-03-08 21:39:09.997262864 +0000 UTC m=+7651.393316917" lastFinishedPulling="2026-03-08 21:39:11.237498557 +0000 UTC m=+7652.633552590" observedRunningTime="2026-03-08 21:39:12.016784614 +0000 UTC m=+7653.412838627" watchObservedRunningTime="2026-03-08 21:39:12.026645767 +0000 UTC m=+7653.422699790" Mar 08 21:39:14 crc kubenswrapper[4885]: I0308 21:39:14.368773 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:39:14 crc kubenswrapper[4885]: E0308 21:39:14.369707 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:39:17 crc kubenswrapper[4885]: I0308 21:39:17.103740 4885 generic.go:334] "Generic (PLEG): container finished" podID="9ef426ef-0010-4b6f-8b94-b45e726c2f02" containerID="81c147a1ae1e07d5d3e12c387e25ea5cc1f30efc8f47f3b7b07230af9ffa1c96" exitCode=0 Mar 08 21:39:17 crc kubenswrapper[4885]: I0308 21:39:17.103871 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" event={"ID":"9ef426ef-0010-4b6f-8b94-b45e726c2f02","Type":"ContainerDied","Data":"81c147a1ae1e07d5d3e12c387e25ea5cc1f30efc8f47f3b7b07230af9ffa1c96"} Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.601555 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.739948 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-inventory\") pod \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.740171 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ceph\") pod \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.740303 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ssh-key-openstack-cell1\") pod \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.740345 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-796ks\" (UniqueName: \"kubernetes.io/projected/9ef426ef-0010-4b6f-8b94-b45e726c2f02-kube-api-access-796ks\") pod \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\" (UID: \"9ef426ef-0010-4b6f-8b94-b45e726c2f02\") " Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.746903 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef426ef-0010-4b6f-8b94-b45e726c2f02-kube-api-access-796ks" (OuterVolumeSpecName: "kube-api-access-796ks") pod "9ef426ef-0010-4b6f-8b94-b45e726c2f02" (UID: "9ef426ef-0010-4b6f-8b94-b45e726c2f02"). InnerVolumeSpecName "kube-api-access-796ks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.747995 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ceph" (OuterVolumeSpecName: "ceph") pod "9ef426ef-0010-4b6f-8b94-b45e726c2f02" (UID: "9ef426ef-0010-4b6f-8b94-b45e726c2f02"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.777344 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-inventory" (OuterVolumeSpecName: "inventory") pod "9ef426ef-0010-4b6f-8b94-b45e726c2f02" (UID: "9ef426ef-0010-4b6f-8b94-b45e726c2f02"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.778624 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "9ef426ef-0010-4b6f-8b94-b45e726c2f02" (UID: "9ef426ef-0010-4b6f-8b94-b45e726c2f02"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.842622 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.842659 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.842671 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/9ef426ef-0010-4b6f-8b94-b45e726c2f02-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:18 crc kubenswrapper[4885]: I0308 21:39:18.842687 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-796ks\" (UniqueName: \"kubernetes.io/projected/9ef426ef-0010-4b6f-8b94-b45e726c2f02-kube-api-access-796ks\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.121942 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" event={"ID":"9ef426ef-0010-4b6f-8b94-b45e726c2f02","Type":"ContainerDied","Data":"549241e500a6e1c6bb8dd5a138d7b1eeefe0ecc38aa23303a9bfb803f38670ca"} Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.122276 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="549241e500a6e1c6bb8dd5a138d7b1eeefe0ecc38aa23303a9bfb803f38670ca" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.121983 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-jr59q" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.204996 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-mz9gg"] Mar 08 21:39:19 crc kubenswrapper[4885]: E0308 21:39:19.205433 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef426ef-0010-4b6f-8b94-b45e726c2f02" containerName="ceph-client-openstack-openstack-cell1" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.205451 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef426ef-0010-4b6f-8b94-b45e726c2f02" containerName="ceph-client-openstack-openstack-cell1" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.205658 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef426ef-0010-4b6f-8b94-b45e726c2f02" containerName="ceph-client-openstack-openstack-cell1" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.206366 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.208942 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.209009 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.209128 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.209761 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.215286 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.221197 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-mz9gg"] Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.350775 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4f5n\" (UniqueName: \"kubernetes.io/projected/ba3efa94-310a-4c53-ac95-2444759b8574-kube-api-access-b4f5n\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.350842 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ceph\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.350891 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.350911 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.351004 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ba3efa94-310a-4c53-ac95-2444759b8574-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.351047 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-inventory\") pod 
\"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.452903 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-inventory\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.453062 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4f5n\" (UniqueName: \"kubernetes.io/projected/ba3efa94-310a-4c53-ac95-2444759b8574-kube-api-access-b4f5n\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.453127 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ceph\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.453192 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.453219 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.453680 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ba3efa94-310a-4c53-ac95-2444759b8574-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.454913 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ba3efa94-310a-4c53-ac95-2444759b8574-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.457600 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ssh-key-openstack-cell1\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.460165 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.461145 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ceph\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.461876 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-inventory\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.474573 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4f5n\" (UniqueName: \"kubernetes.io/projected/ba3efa94-310a-4c53-ac95-2444759b8574-kube-api-access-b4f5n\") pod \"ovn-openstack-openstack-cell1-mz9gg\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:19 crc kubenswrapper[4885]: I0308 21:39:19.529721 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:39:20 crc kubenswrapper[4885]: I0308 21:39:20.149873 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-mz9gg"] Mar 08 21:39:21 crc kubenswrapper[4885]: I0308 21:39:21.143202 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" event={"ID":"ba3efa94-310a-4c53-ac95-2444759b8574","Type":"ContainerStarted","Data":"1a75e0057e0ad508c06200fa47793eaf63479725ebb9ae99a8b16d6f40fbf84d"} Mar 08 21:39:21 crc kubenswrapper[4885]: I0308 21:39:21.143649 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" event={"ID":"ba3efa94-310a-4c53-ac95-2444759b8574","Type":"ContainerStarted","Data":"f4748d2e3bbfc369443a264393dc5de79561afca8381bfe0809066f999cc7b2e"} Mar 08 21:39:21 crc kubenswrapper[4885]: I0308 21:39:21.147535 4885 generic.go:334] "Generic (PLEG): container finished" podID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerID="9e9db438534feaa447a73410606a84e2839386c0c4482c114e68bcc17de2fb97" exitCode=0 Mar 08 21:39:21 crc kubenswrapper[4885]: I0308 21:39:21.147574 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qwl7" event={"ID":"ebd3b0b8-3fd7-4386-b558-e083d8665e9e","Type":"ContainerDied","Data":"9e9db438534feaa447a73410606a84e2839386c0c4482c114e68bcc17de2fb97"} Mar 08 21:39:21 crc kubenswrapper[4885]: I0308 21:39:21.204429 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" podStartSLOduration=1.7192958470000002 podStartE2EDuration="2.204407006s" podCreationTimestamp="2026-03-08 21:39:19 +0000 UTC" firstStartedPulling="2026-03-08 21:39:20.1646049 +0000 UTC m=+7661.560658933" lastFinishedPulling="2026-03-08 21:39:20.649716069 +0000 UTC m=+7662.045770092" observedRunningTime="2026-03-08 21:39:21.171531618 +0000 UTC 
m=+7662.567585641" watchObservedRunningTime="2026-03-08 21:39:21.204407006 +0000 UTC m=+7662.600461039" Mar 08 21:39:22 crc kubenswrapper[4885]: I0308 21:39:22.163858 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qwl7" event={"ID":"ebd3b0b8-3fd7-4386-b558-e083d8665e9e","Type":"ContainerStarted","Data":"ec0aa2a4bbb0d043b313bc957b82a674549664f5372584f054d108aebd0c5d14"} Mar 08 21:39:22 crc kubenswrapper[4885]: I0308 21:39:22.196783 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7qwl7" podStartSLOduration=2.46678899 podStartE2EDuration="16.196760755s" podCreationTimestamp="2026-03-08 21:39:06 +0000 UTC" firstStartedPulling="2026-03-08 21:39:07.923793626 +0000 UTC m=+7649.319847669" lastFinishedPulling="2026-03-08 21:39:21.653765371 +0000 UTC m=+7663.049819434" observedRunningTime="2026-03-08 21:39:22.184734135 +0000 UTC m=+7663.580788198" watchObservedRunningTime="2026-03-08 21:39:22.196760755 +0000 UTC m=+7663.592814788" Mar 08 21:39:26 crc kubenswrapper[4885]: I0308 21:39:26.368588 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:39:26 crc kubenswrapper[4885]: E0308 21:39:26.369489 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:39:26 crc kubenswrapper[4885]: I0308 21:39:26.690169 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:26 crc kubenswrapper[4885]: I0308 21:39:26.690257 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:27 crc kubenswrapper[4885]: I0308 21:39:27.764605 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7qwl7" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="registry-server" probeResult="failure" output=< Mar 08 21:39:27 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 21:39:27 crc kubenswrapper[4885]: > Mar 08 21:39:37 crc kubenswrapper[4885]: I0308 21:39:37.743726 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7qwl7" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="registry-server" probeResult="failure" output=< Mar 08 21:39:37 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 21:39:37 crc kubenswrapper[4885]: > Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.368363 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:39:41 crc kubenswrapper[4885]: E0308 21:39:41.369077 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.638426 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6bspm"] Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.642258 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.676971 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6bspm"] Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.680204 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plh2g\" (UniqueName: \"kubernetes.io/projected/76335e5d-7b7b-457d-a69c-90318e5cbbb4-kube-api-access-plh2g\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.680340 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-utilities\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.680428 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-catalog-content\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.781520 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plh2g\" (UniqueName: \"kubernetes.io/projected/76335e5d-7b7b-457d-a69c-90318e5cbbb4-kube-api-access-plh2g\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.781825 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-utilities\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.781859 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-catalog-content\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.782463 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-catalog-content\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.782755 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-utilities\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.804142 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plh2g\" (UniqueName: \"kubernetes.io/projected/76335e5d-7b7b-457d-a69c-90318e5cbbb4-kube-api-access-plh2g\") pod \"community-operators-6bspm\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:41 crc kubenswrapper[4885]: I0308 21:39:41.974115 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:42 crc kubenswrapper[4885]: I0308 21:39:42.567726 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6bspm"] Mar 08 21:39:43 crc kubenswrapper[4885]: I0308 21:39:43.410999 4885 generic.go:334] "Generic (PLEG): container finished" podID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerID="6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254" exitCode=0 Mar 08 21:39:43 crc kubenswrapper[4885]: I0308 21:39:43.411720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bspm" event={"ID":"76335e5d-7b7b-457d-a69c-90318e5cbbb4","Type":"ContainerDied","Data":"6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254"} Mar 08 21:39:43 crc kubenswrapper[4885]: I0308 21:39:43.411805 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bspm" event={"ID":"76335e5d-7b7b-457d-a69c-90318e5cbbb4","Type":"ContainerStarted","Data":"ea86a8b7754eeae84af3d80c0bdfbba0a325e85525e457e4c2467add29827e9c"} Mar 08 21:39:44 crc kubenswrapper[4885]: I0308 21:39:44.437966 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bspm" event={"ID":"76335e5d-7b7b-457d-a69c-90318e5cbbb4","Type":"ContainerStarted","Data":"19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77"} Mar 08 21:39:46 crc kubenswrapper[4885]: I0308 21:39:46.465666 4885 generic.go:334] "Generic (PLEG): container finished" podID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerID="19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77" exitCode=0 Mar 08 21:39:46 crc kubenswrapper[4885]: I0308 21:39:46.465782 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bspm" event={"ID":"76335e5d-7b7b-457d-a69c-90318e5cbbb4","Type":"ContainerDied","Data":"19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77"} Mar 08 21:39:46 crc kubenswrapper[4885]: I0308 21:39:46.782352 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:46 crc kubenswrapper[4885]: I0308 21:39:46.874756 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:47 crc kubenswrapper[4885]: I0308 21:39:47.479450 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bspm" 
event={"ID":"76335e5d-7b7b-457d-a69c-90318e5cbbb4","Type":"ContainerStarted","Data":"d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d"} Mar 08 21:39:47 crc kubenswrapper[4885]: I0308 21:39:47.508803 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6bspm" podStartSLOduration=3.013178101 podStartE2EDuration="6.508784734s" podCreationTimestamp="2026-03-08 21:39:41 +0000 UTC" firstStartedPulling="2026-03-08 21:39:43.41927684 +0000 UTC m=+7684.815330893" lastFinishedPulling="2026-03-08 21:39:46.914883463 +0000 UTC m=+7688.310937526" observedRunningTime="2026-03-08 21:39:47.500234546 +0000 UTC m=+7688.896288569" watchObservedRunningTime="2026-03-08 21:39:47.508784734 +0000 UTC m=+7688.904838757" Mar 08 21:39:49 crc kubenswrapper[4885]: I0308 21:39:49.215036 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qwl7"] Mar 08 21:39:49 crc kubenswrapper[4885]: I0308 21:39:49.215510 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7qwl7" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="registry-server" containerID="cri-o://ec0aa2a4bbb0d043b313bc957b82a674549664f5372584f054d108aebd0c5d14" gracePeriod=2 Mar 08 21:39:49 crc kubenswrapper[4885]: I0308 21:39:49.510035 4885 generic.go:334] "Generic (PLEG): container finished" podID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerID="ec0aa2a4bbb0d043b313bc957b82a674549664f5372584f054d108aebd0c5d14" exitCode=0 Mar 08 21:39:49 crc kubenswrapper[4885]: I0308 21:39:49.510200 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qwl7" event={"ID":"ebd3b0b8-3fd7-4386-b558-e083d8665e9e","Type":"ContainerDied","Data":"ec0aa2a4bbb0d043b313bc957b82a674549664f5372584f054d108aebd0c5d14"} Mar 08 21:39:49 crc kubenswrapper[4885]: I0308 21:39:49.928220 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.023354 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-utilities\") pod \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.023531 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nwhf\" (UniqueName: \"kubernetes.io/projected/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-kube-api-access-8nwhf\") pod \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.023732 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-catalog-content\") pod \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\" (UID: \"ebd3b0b8-3fd7-4386-b558-e083d8665e9e\") " Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.024048 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-utilities" (OuterVolumeSpecName: "utilities") pod "ebd3b0b8-3fd7-4386-b558-e083d8665e9e" (UID: "ebd3b0b8-3fd7-4386-b558-e083d8665e9e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.024325 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.029639 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-kube-api-access-8nwhf" (OuterVolumeSpecName: "kube-api-access-8nwhf") pod "ebd3b0b8-3fd7-4386-b558-e083d8665e9e" (UID: "ebd3b0b8-3fd7-4386-b558-e083d8665e9e"). InnerVolumeSpecName "kube-api-access-8nwhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.126582 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nwhf\" (UniqueName: \"kubernetes.io/projected/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-kube-api-access-8nwhf\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.139442 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ebd3b0b8-3fd7-4386-b558-e083d8665e9e" (UID: "ebd3b0b8-3fd7-4386-b558-e083d8665e9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.228525 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebd3b0b8-3fd7-4386-b558-e083d8665e9e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.524045 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7qwl7" event={"ID":"ebd3b0b8-3fd7-4386-b558-e083d8665e9e","Type":"ContainerDied","Data":"00691bce0bb761767b57e77ad9015e8de45354bfaf62e2946b1fde8b97c48cf8"} Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.524422 4885 scope.go:117] "RemoveContainer" containerID="ec0aa2a4bbb0d043b313bc957b82a674549664f5372584f054d108aebd0c5d14" Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.524718 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7qwl7" Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.584370 4885 scope.go:117] "RemoveContainer" containerID="9e9db438534feaa447a73410606a84e2839386c0c4482c114e68bcc17de2fb97" Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.588787 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7qwl7"] Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.604808 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7qwl7"] Mar 08 21:39:50 crc kubenswrapper[4885]: I0308 21:39:50.621874 4885 scope.go:117] "RemoveContainer" containerID="a423eca068764bdfe8800013c4a2fb15c30071d300eaa2bf147699572dd7d03e" Mar 08 21:39:51 crc kubenswrapper[4885]: I0308 21:39:51.382270 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" path="/var/lib/kubelet/pods/ebd3b0b8-3fd7-4386-b558-e083d8665e9e/volumes" Mar 08 21:39:51 crc kubenswrapper[4885]: I0308 21:39:51.974480 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:51 crc kubenswrapper[4885]: I0308 21:39:51.974539 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:52 crc kubenswrapper[4885]: I0308 21:39:52.045687 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:52 crc kubenswrapper[4885]: I0308 21:39:52.648634 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:53 crc kubenswrapper[4885]: I0308 21:39:53.368385 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:39:53 crc kubenswrapper[4885]: E0308 21:39:53.368656 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:39:53 crc kubenswrapper[4885]: I0308 21:39:53.422847 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6bspm"] Mar 08 21:39:54 crc kubenswrapper[4885]: I0308 21:39:54.570065 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6bspm" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerName="registry-server" containerID="cri-o://d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d" gracePeriod=2 Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.169659 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.264240 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plh2g\" (UniqueName: \"kubernetes.io/projected/76335e5d-7b7b-457d-a69c-90318e5cbbb4-kube-api-access-plh2g\") pod \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.264304 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-catalog-content\") pod \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.264532 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-utilities\") pod \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\" (UID: \"76335e5d-7b7b-457d-a69c-90318e5cbbb4\") " Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.266149 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-utilities" (OuterVolumeSpecName: "utilities") pod "76335e5d-7b7b-457d-a69c-90318e5cbbb4" (UID: "76335e5d-7b7b-457d-a69c-90318e5cbbb4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.276227 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76335e5d-7b7b-457d-a69c-90318e5cbbb4-kube-api-access-plh2g" (OuterVolumeSpecName: "kube-api-access-plh2g") pod "76335e5d-7b7b-457d-a69c-90318e5cbbb4" (UID: "76335e5d-7b7b-457d-a69c-90318e5cbbb4"). InnerVolumeSpecName "kube-api-access-plh2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.347779 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76335e5d-7b7b-457d-a69c-90318e5cbbb4" (UID: "76335e5d-7b7b-457d-a69c-90318e5cbbb4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.366307 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.366362 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plh2g\" (UniqueName: \"kubernetes.io/projected/76335e5d-7b7b-457d-a69c-90318e5cbbb4-kube-api-access-plh2g\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.366378 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76335e5d-7b7b-457d-a69c-90318e5cbbb4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.582740 4885 generic.go:334] "Generic (PLEG): container finished" podID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerID="d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d" exitCode=0 Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.583137 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bspm" event={"ID":"76335e5d-7b7b-457d-a69c-90318e5cbbb4","Type":"ContainerDied","Data":"d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d"} Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.583179 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6bspm" event={"ID":"76335e5d-7b7b-457d-a69c-90318e5cbbb4","Type":"ContainerDied","Data":"ea86a8b7754eeae84af3d80c0bdfbba0a325e85525e457e4c2467add29827e9c"} Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.583209 4885 scope.go:117] "RemoveContainer" containerID="d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.583423 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6bspm" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.608237 4885 scope.go:117] "RemoveContainer" containerID="19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.626449 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6bspm"] Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.640910 4885 scope.go:117] "RemoveContainer" containerID="6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.644292 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6bspm"] Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.703319 4885 scope.go:117] "RemoveContainer" containerID="d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d" Mar 08 21:39:55 crc kubenswrapper[4885]: E0308 21:39:55.704423 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d\": container with ID starting with d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d not found: ID does not exist" containerID="d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.704489 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d"} err="failed to get container status \"d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d\": rpc error: code = NotFound desc = could not find container \"d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d\": container with ID starting with d7a11ae7aeb732b51e9fabf763c6ebea4f469ce7fce25d663225b8ddcd361f0d not found: ID does not exist" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.704529 4885 scope.go:117] "RemoveContainer" containerID="19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77" Mar 08 21:39:55 crc kubenswrapper[4885]: E0308 21:39:55.704862 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77\": container with ID starting with 19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77 not found: ID does not exist" containerID="19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.704899 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77"} err="failed to get container status \"19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77\": rpc error: code = NotFound desc = could not find container \"19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77\": container with ID starting with 19a2802cb26cf64ffda2b48ed0e6d03c21889c75e8fc23475b6cb85ce2238a77 not found: ID does not exist" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.704944 4885 scope.go:117] "RemoveContainer" containerID="6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254" Mar 08 21:39:55 crc kubenswrapper[4885]: E0308 21:39:55.705241 4885 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254\": container with ID starting with 6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254 not found: ID does not exist" containerID="6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254" Mar 08 21:39:55 crc kubenswrapper[4885]: I0308 21:39:55.705266 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254"} err="failed to get container status \"6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254\": rpc error: code = NotFound desc = could not find container \"6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254\": container with ID starting with 6e6e038f1bbfe19d1702494c9c02bfdf4a225fa844ac4386450cb15e976e0254 not found: ID does not exist" Mar 08 21:39:57 crc kubenswrapper[4885]: I0308 21:39:57.386522 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" path="/var/lib/kubelet/pods/76335e5d-7b7b-457d-a69c-90318e5cbbb4/volumes" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.155071 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550100-8r628"] Mar 08 21:40:00 crc kubenswrapper[4885]: E0308 21:40:00.155883 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerName="registry-server" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.155899 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerName="registry-server" Mar 08 21:40:00 crc kubenswrapper[4885]: E0308 21:40:00.155914 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="extract-content" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.155938 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="extract-content" Mar 08 21:40:00 crc kubenswrapper[4885]: E0308 21:40:00.155959 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="registry-server" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.155968 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="registry-server" Mar 08 21:40:00 crc kubenswrapper[4885]: E0308 21:40:00.155979 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="extract-utilities" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.155987 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="extract-utilities" Mar 08 21:40:00 crc kubenswrapper[4885]: E0308 21:40:00.155996 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerName="extract-content" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.156003 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerName="extract-content" Mar 08 21:40:00 crc kubenswrapper[4885]: E0308 21:40:00.156029 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" 
containerName="extract-utilities" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.156037 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerName="extract-utilities" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.156319 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebd3b0b8-3fd7-4386-b558-e083d8665e9e" containerName="registry-server" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.156357 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="76335e5d-7b7b-457d-a69c-90318e5cbbb4" containerName="registry-server" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.157247 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550100-8r628" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.159853 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.160163 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.160364 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.168743 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550100-8r628"] Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.182103 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlffd\" (UniqueName: \"kubernetes.io/projected/70f4584f-eeba-4b88-b31d-79a39f062bd3-kube-api-access-vlffd\") pod \"auto-csr-approver-29550100-8r628\" (UID: \"70f4584f-eeba-4b88-b31d-79a39f062bd3\") " pod="openshift-infra/auto-csr-approver-29550100-8r628" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.284629 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlffd\" (UniqueName: \"kubernetes.io/projected/70f4584f-eeba-4b88-b31d-79a39f062bd3-kube-api-access-vlffd\") pod \"auto-csr-approver-29550100-8r628\" (UID: \"70f4584f-eeba-4b88-b31d-79a39f062bd3\") " pod="openshift-infra/auto-csr-approver-29550100-8r628" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.304949 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlffd\" (UniqueName: \"kubernetes.io/projected/70f4584f-eeba-4b88-b31d-79a39f062bd3-kube-api-access-vlffd\") pod \"auto-csr-approver-29550100-8r628\" (UID: \"70f4584f-eeba-4b88-b31d-79a39f062bd3\") " pod="openshift-infra/auto-csr-approver-29550100-8r628" Mar 08 21:40:00 crc kubenswrapper[4885]: I0308 21:40:00.528403 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550100-8r628" Mar 08 21:40:01 crc kubenswrapper[4885]: I0308 21:40:01.019169 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550100-8r628"] Mar 08 21:40:01 crc kubenswrapper[4885]: I0308 21:40:01.669471 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550100-8r628" event={"ID":"70f4584f-eeba-4b88-b31d-79a39f062bd3","Type":"ContainerStarted","Data":"4a8818f60510fa842b65ccbd2ec3a3a10b4c232cb4ef875cfbab6d8dda0fafaf"} Mar 08 21:40:02 crc kubenswrapper[4885]: I0308 21:40:02.690999 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550100-8r628" event={"ID":"70f4584f-eeba-4b88-b31d-79a39f062bd3","Type":"ContainerStarted","Data":"51c3dd5d7c040092d94f90c0958c32d896513ce729f2d08313181e0c6cc74c41"} Mar 08 21:40:02 crc kubenswrapper[4885]: I0308 21:40:02.714506 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550100-8r628" podStartSLOduration=1.506346207 podStartE2EDuration="2.714485693s" podCreationTimestamp="2026-03-08 21:40:00 +0000 UTC" firstStartedPulling="2026-03-08 21:40:01.025012898 +0000 UTC m=+7702.421066921" lastFinishedPulling="2026-03-08 21:40:02.233152384 +0000 UTC m=+7703.629206407" observedRunningTime="2026-03-08 21:40:02.70238994 +0000 UTC m=+7704.098443993" watchObservedRunningTime="2026-03-08 21:40:02.714485693 +0000 UTC m=+7704.110539726" Mar 08 21:40:03 crc kubenswrapper[4885]: I0308 21:40:03.705154 4885 generic.go:334] "Generic (PLEG): container finished" podID="70f4584f-eeba-4b88-b31d-79a39f062bd3" containerID="51c3dd5d7c040092d94f90c0958c32d896513ce729f2d08313181e0c6cc74c41" exitCode=0 Mar 08 21:40:03 crc kubenswrapper[4885]: I0308 21:40:03.705228 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550100-8r628" event={"ID":"70f4584f-eeba-4b88-b31d-79a39f062bd3","Type":"ContainerDied","Data":"51c3dd5d7c040092d94f90c0958c32d896513ce729f2d08313181e0c6cc74c41"} Mar 08 21:40:04 crc kubenswrapper[4885]: I0308 21:40:04.369006 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:40:04 crc kubenswrapper[4885]: E0308 21:40:04.369735 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.133944 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550100-8r628" Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.214249 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlffd\" (UniqueName: \"kubernetes.io/projected/70f4584f-eeba-4b88-b31d-79a39f062bd3-kube-api-access-vlffd\") pod \"70f4584f-eeba-4b88-b31d-79a39f062bd3\" (UID: \"70f4584f-eeba-4b88-b31d-79a39f062bd3\") " Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.219303 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f4584f-eeba-4b88-b31d-79a39f062bd3-kube-api-access-vlffd" (OuterVolumeSpecName: "kube-api-access-vlffd") pod "70f4584f-eeba-4b88-b31d-79a39f062bd3" (UID: "70f4584f-eeba-4b88-b31d-79a39f062bd3"). InnerVolumeSpecName "kube-api-access-vlffd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.317396 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlffd\" (UniqueName: \"kubernetes.io/projected/70f4584f-eeba-4b88-b31d-79a39f062bd3-kube-api-access-vlffd\") on node \"crc\" DevicePath \"\"" Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.728771 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550100-8r628" event={"ID":"70f4584f-eeba-4b88-b31d-79a39f062bd3","Type":"ContainerDied","Data":"4a8818f60510fa842b65ccbd2ec3a3a10b4c232cb4ef875cfbab6d8dda0fafaf"} Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.729150 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a8818f60510fa842b65ccbd2ec3a3a10b4c232cb4ef875cfbab6d8dda0fafaf" Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.728868 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550100-8r628" Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.788079 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550094-mjnvj"] Mar 08 21:40:05 crc kubenswrapper[4885]: I0308 21:40:05.798063 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550094-mjnvj"] Mar 08 21:40:07 crc kubenswrapper[4885]: I0308 21:40:07.385027 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a860a0-84ad-49b7-8596-05521c33108a" path="/var/lib/kubelet/pods/c2a860a0-84ad-49b7-8596-05521c33108a/volumes" Mar 08 21:40:12 crc kubenswrapper[4885]: I0308 21:40:12.782749 4885 scope.go:117] "RemoveContainer" containerID="6d31a6020ea44ed51ad167034dfe4175ea1c3055421ddefd4060ab7f5195dfd9" Mar 08 21:40:16 crc kubenswrapper[4885]: I0308 21:40:16.370579 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:40:16 crc kubenswrapper[4885]: E0308 21:40:16.371414 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:40:28 crc kubenswrapper[4885]: I0308 21:40:28.368863 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:40:28 crc kubenswrapper[4885]: E0308 21:40:28.369811 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:40:29 crc kubenswrapper[4885]: I0308 21:40:29.161692 4885 generic.go:334] "Generic (PLEG): container finished" podID="ba3efa94-310a-4c53-ac95-2444759b8574" containerID="1a75e0057e0ad508c06200fa47793eaf63479725ebb9ae99a8b16d6f40fbf84d" exitCode=0 Mar 08 21:40:29 crc kubenswrapper[4885]: I0308 21:40:29.161846 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" event={"ID":"ba3efa94-310a-4c53-ac95-2444759b8574","Type":"ContainerDied","Data":"1a75e0057e0ad508c06200fa47793eaf63479725ebb9ae99a8b16d6f40fbf84d"} Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.616793 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.711183 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ba3efa94-310a-4c53-ac95-2444759b8574-ovncontroller-config-0\") pod \"ba3efa94-310a-4c53-ac95-2444759b8574\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.711312 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ssh-key-openstack-cell1\") pod \"ba3efa94-310a-4c53-ac95-2444759b8574\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.711388 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-inventory\") pod \"ba3efa94-310a-4c53-ac95-2444759b8574\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.711408 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ovn-combined-ca-bundle\") pod \"ba3efa94-310a-4c53-ac95-2444759b8574\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.711477 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ceph\") pod \"ba3efa94-310a-4c53-ac95-2444759b8574\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.711512 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4f5n\" (UniqueName: \"kubernetes.io/projected/ba3efa94-310a-4c53-ac95-2444759b8574-kube-api-access-b4f5n\") pod \"ba3efa94-310a-4c53-ac95-2444759b8574\" (UID: \"ba3efa94-310a-4c53-ac95-2444759b8574\") " Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.716888 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3efa94-310a-4c53-ac95-2444759b8574-kube-api-access-b4f5n" (OuterVolumeSpecName: "kube-api-access-b4f5n") pod "ba3efa94-310a-4c53-ac95-2444759b8574" (UID: "ba3efa94-310a-4c53-ac95-2444759b8574"). InnerVolumeSpecName "kube-api-access-b4f5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.717083 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ceph" (OuterVolumeSpecName: "ceph") pod "ba3efa94-310a-4c53-ac95-2444759b8574" (UID: "ba3efa94-310a-4c53-ac95-2444759b8574"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.721054 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ba3efa94-310a-4c53-ac95-2444759b8574" (UID: "ba3efa94-310a-4c53-ac95-2444759b8574"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.738186 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3efa94-310a-4c53-ac95-2444759b8574-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "ba3efa94-310a-4c53-ac95-2444759b8574" (UID: "ba3efa94-310a-4c53-ac95-2444759b8574"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.742792 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-inventory" (OuterVolumeSpecName: "inventory") pod "ba3efa94-310a-4c53-ac95-2444759b8574" (UID: "ba3efa94-310a-4c53-ac95-2444759b8574"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.746349 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "ba3efa94-310a-4c53-ac95-2444759b8574" (UID: "ba3efa94-310a-4c53-ac95-2444759b8574"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.813545 4885 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/ba3efa94-310a-4c53-ac95-2444759b8574-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.813586 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.813600 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.813615 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.813626 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/ba3efa94-310a-4c53-ac95-2444759b8574-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:40:30 crc kubenswrapper[4885]: I0308 21:40:30.813637 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4f5n\" (UniqueName: \"kubernetes.io/projected/ba3efa94-310a-4c53-ac95-2444759b8574-kube-api-access-b4f5n\") on node \"crc\" DevicePath \"\"" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.184101 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" event={"ID":"ba3efa94-310a-4c53-ac95-2444759b8574","Type":"ContainerDied","Data":"f4748d2e3bbfc369443a264393dc5de79561afca8381bfe0809066f999cc7b2e"} Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.184159 4885 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="f4748d2e3bbfc369443a264393dc5de79561afca8381bfe0809066f999cc7b2e" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.184222 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-mz9gg" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.345452 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-hwhq9"] Mar 08 21:40:31 crc kubenswrapper[4885]: E0308 21:40:31.346171 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3efa94-310a-4c53-ac95-2444759b8574" containerName="ovn-openstack-openstack-cell1" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.346255 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3efa94-310a-4c53-ac95-2444759b8574" containerName="ovn-openstack-openstack-cell1" Mar 08 21:40:31 crc kubenswrapper[4885]: E0308 21:40:31.346367 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f4584f-eeba-4b88-b31d-79a39f062bd3" containerName="oc" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.346432 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f4584f-eeba-4b88-b31d-79a39f062bd3" containerName="oc" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.346737 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3efa94-310a-4c53-ac95-2444759b8574" containerName="ovn-openstack-openstack-cell1" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.346824 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f4584f-eeba-4b88-b31d-79a39f062bd3" containerName="oc" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.347742 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.350500 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.350789 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.351367 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.352866 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.353207 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.357062 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.360479 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-hwhq9"] Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.429199 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc 
kubenswrapper[4885]: I0308 21:40:31.429240 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.429276 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.429309 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9mhx\" (UniqueName: \"kubernetes.io/projected/12740b7f-a6a2-45e2-a288-fbb880a2c72b-kube-api-access-t9mhx\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.429451 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.429639 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.429989 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.532537 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.532605 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9mhx\" (UniqueName: 
\"kubernetes.io/projected/12740b7f-a6a2-45e2-a288-fbb880a2c72b-kube-api-access-t9mhx\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.532658 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.532713 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.532806 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.532895 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.532916 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.537176 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.538883 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.539516 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.540885 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.541395 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.541576 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ssh-key-openstack-cell1\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.555141 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9mhx\" (UniqueName: \"kubernetes.io/projected/12740b7f-a6a2-45e2-a288-fbb880a2c72b-kube-api-access-t9mhx\") pod \"neutron-metadata-openstack-openstack-cell1-hwhq9\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:31 crc kubenswrapper[4885]: I0308 21:40:31.677178 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:40:32 crc kubenswrapper[4885]: I0308 21:40:32.353274 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-hwhq9"] Mar 08 21:40:33 crc kubenswrapper[4885]: I0308 21:40:33.204032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" event={"ID":"12740b7f-a6a2-45e2-a288-fbb880a2c72b","Type":"ContainerStarted","Data":"fcaf2ec05dbabc2bcbe53d0ade85f799708a197bedd504536ecc8c4e202e5c29"} Mar 08 21:40:34 crc kubenswrapper[4885]: I0308 21:40:34.219675 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" event={"ID":"12740b7f-a6a2-45e2-a288-fbb880a2c72b","Type":"ContainerStarted","Data":"46937056048ab7bfd833e96e09493fd380296e1d284b835fc82f2a3fb07daf4f"} Mar 08 21:40:34 crc kubenswrapper[4885]: I0308 21:40:34.256055 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" podStartSLOduration=2.764488997 podStartE2EDuration="3.256031279s" podCreationTimestamp="2026-03-08 21:40:31 +0000 UTC" firstStartedPulling="2026-03-08 21:40:32.358092772 +0000 UTC m=+7733.754146795" lastFinishedPulling="2026-03-08 21:40:32.849635054 +0000 UTC m=+7734.245689077" observedRunningTime="2026-03-08 21:40:34.249669509 +0000 UTC m=+7735.645723562" watchObservedRunningTime="2026-03-08 21:40:34.256031279 +0000 UTC m=+7735.652085332" Mar 08 21:40:43 crc kubenswrapper[4885]: I0308 21:40:43.369353 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:40:43 crc kubenswrapper[4885]: E0308 21:40:43.370403 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:40:58 crc kubenswrapper[4885]: I0308 21:40:58.369015 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:40:58 crc kubenswrapper[4885]: E0308 21:40:58.370044 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:41:09 crc kubenswrapper[4885]: I0308 21:41:09.406103 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:41:09 crc kubenswrapper[4885]: E0308 21:41:09.410345 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:41:23 crc kubenswrapper[4885]: I0308 21:41:23.369775 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:41:23 crc kubenswrapper[4885]: E0308 21:41:23.371011 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:41:26 crc kubenswrapper[4885]: I0308 21:41:26.872985 4885 generic.go:334] "Generic (PLEG): container finished" podID="12740b7f-a6a2-45e2-a288-fbb880a2c72b" containerID="46937056048ab7bfd833e96e09493fd380296e1d284b835fc82f2a3fb07daf4f" exitCode=0 Mar 08 21:41:26 crc kubenswrapper[4885]: I0308 21:41:26.873022 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" event={"ID":"12740b7f-a6a2-45e2-a288-fbb880a2c72b","Type":"ContainerDied","Data":"46937056048ab7bfd833e96e09493fd380296e1d284b835fc82f2a3fb07daf4f"} Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.428497 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.551171 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ceph\") pod \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.551328 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-metadata-combined-ca-bundle\") pod \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.551465 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9mhx\" (UniqueName: \"kubernetes.io/projected/12740b7f-a6a2-45e2-a288-fbb880a2c72b-kube-api-access-t9mhx\") pod \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.551498 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-ovn-metadata-agent-neutron-config-0\") pod \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.551541 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-nova-metadata-neutron-config-0\") pod \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.551599 4885 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-inventory\") pod \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.551625 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ssh-key-openstack-cell1\") pod \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\" (UID: \"12740b7f-a6a2-45e2-a288-fbb880a2c72b\") " Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.560645 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "12740b7f-a6a2-45e2-a288-fbb880a2c72b" (UID: "12740b7f-a6a2-45e2-a288-fbb880a2c72b"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.560713 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ceph" (OuterVolumeSpecName: "ceph") pod "12740b7f-a6a2-45e2-a288-fbb880a2c72b" (UID: "12740b7f-a6a2-45e2-a288-fbb880a2c72b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.563331 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12740b7f-a6a2-45e2-a288-fbb880a2c72b-kube-api-access-t9mhx" (OuterVolumeSpecName: "kube-api-access-t9mhx") pod "12740b7f-a6a2-45e2-a288-fbb880a2c72b" (UID: "12740b7f-a6a2-45e2-a288-fbb880a2c72b"). InnerVolumeSpecName "kube-api-access-t9mhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.582347 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "12740b7f-a6a2-45e2-a288-fbb880a2c72b" (UID: "12740b7f-a6a2-45e2-a288-fbb880a2c72b"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.603420 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-inventory" (OuterVolumeSpecName: "inventory") pod "12740b7f-a6a2-45e2-a288-fbb880a2c72b" (UID: "12740b7f-a6a2-45e2-a288-fbb880a2c72b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.612337 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "12740b7f-a6a2-45e2-a288-fbb880a2c72b" (UID: "12740b7f-a6a2-45e2-a288-fbb880a2c72b"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.614441 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "12740b7f-a6a2-45e2-a288-fbb880a2c72b" (UID: "12740b7f-a6a2-45e2-a288-fbb880a2c72b"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.654179 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9mhx\" (UniqueName: \"kubernetes.io/projected/12740b7f-a6a2-45e2-a288-fbb880a2c72b-kube-api-access-t9mhx\") on node \"crc\" DevicePath \"\"" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.654217 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.654236 4885 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.654250 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.654263 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.654274 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.654285 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12740b7f-a6a2-45e2-a288-fbb880a2c72b-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.902651 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" event={"ID":"12740b7f-a6a2-45e2-a288-fbb880a2c72b","Type":"ContainerDied","Data":"fcaf2ec05dbabc2bcbe53d0ade85f799708a197bedd504536ecc8c4e202e5c29"} Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.902701 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcaf2ec05dbabc2bcbe53d0ade85f799708a197bedd504536ecc8c4e202e5c29" Mar 08 21:41:28 crc kubenswrapper[4885]: I0308 21:41:28.903104 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-hwhq9" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.048214 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-rd94l"] Mar 08 21:41:29 crc kubenswrapper[4885]: E0308 21:41:29.048812 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12740b7f-a6a2-45e2-a288-fbb880a2c72b" containerName="neutron-metadata-openstack-openstack-cell1" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.048845 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="12740b7f-a6a2-45e2-a288-fbb880a2c72b" containerName="neutron-metadata-openstack-openstack-cell1" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.049201 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="12740b7f-a6a2-45e2-a288-fbb880a2c72b" containerName="neutron-metadata-openstack-openstack-cell1" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.050215 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.053390 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.053622 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.053834 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.054658 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.055625 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.060650 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-rd94l"] Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.183475 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.183536 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ceph\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.183588 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.184153 4885 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dz7c\" (UniqueName: \"kubernetes.io/projected/992d3500-f892-42c6-805f-ae9c96793d0f-kube-api-access-2dz7c\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.184247 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.184267 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-inventory\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.286748 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ceph\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.286947 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.287166 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dz7c\" (UniqueName: \"kubernetes.io/projected/992d3500-f892-42c6-805f-ae9c96793d0f-kube-api-access-2dz7c\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.287224 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.287262 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-inventory\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.287410 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ssh-key-openstack-cell1\") pod 
\"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.292900 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ssh-key-openstack-cell1\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.293344 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.293893 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-inventory\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.294823 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ceph\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.294884 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.319208 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dz7c\" (UniqueName: \"kubernetes.io/projected/992d3500-f892-42c6-805f-ae9c96793d0f-kube-api-access-2dz7c\") pod \"libvirt-openstack-openstack-cell1-rd94l\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:29 crc kubenswrapper[4885]: I0308 21:41:29.373168 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:41:30 crc kubenswrapper[4885]: I0308 21:41:30.046523 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:41:30 crc kubenswrapper[4885]: I0308 21:41:30.050112 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-rd94l"] Mar 08 21:41:30 crc kubenswrapper[4885]: I0308 21:41:30.944539 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" event={"ID":"992d3500-f892-42c6-805f-ae9c96793d0f","Type":"ContainerStarted","Data":"135b71ffe2c5954d98583a529a1ec42b83ce4fb4d31e613f230e8d7339fd376b"} Mar 08 21:41:30 crc kubenswrapper[4885]: I0308 21:41:30.945054 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" event={"ID":"992d3500-f892-42c6-805f-ae9c96793d0f","Type":"ContainerStarted","Data":"2ee7a4abb905c5d1b376a7b80993ad31e44b7c26d66ceb2dc9a19d787251b55b"} Mar 08 21:41:30 crc kubenswrapper[4885]: I0308 21:41:30.990798 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" podStartSLOduration=1.54456602 podStartE2EDuration="1.990769752s" podCreationTimestamp="2026-03-08 21:41:29 +0000 UTC" firstStartedPulling="2026-03-08 21:41:30.046220797 +0000 UTC m=+7791.442274820" lastFinishedPulling="2026-03-08 21:41:30.492424489 +0000 UTC m=+7791.888478552" observedRunningTime="2026-03-08 21:41:30.971400415 +0000 UTC m=+7792.367454478" watchObservedRunningTime="2026-03-08 21:41:30.990769752 +0000 UTC m=+7792.386823805" Mar 08 21:41:35 crc kubenswrapper[4885]: I0308 21:41:35.368133 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:41:35 crc kubenswrapper[4885]: E0308 21:41:35.369207 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:41:46 crc kubenswrapper[4885]: I0308 21:41:46.369330 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:41:46 crc kubenswrapper[4885]: E0308 21:41:46.370515 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.155891 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550102-v4g2f"] Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.158161 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550102-v4g2f" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.176355 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550102-v4g2f"] Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.187871 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.187974 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.188187 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.203708 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n6mf\" (UniqueName: \"kubernetes.io/projected/ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2-kube-api-access-9n6mf\") pod \"auto-csr-approver-29550102-v4g2f\" (UID: \"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2\") " pod="openshift-infra/auto-csr-approver-29550102-v4g2f" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.305759 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n6mf\" (UniqueName: \"kubernetes.io/projected/ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2-kube-api-access-9n6mf\") pod \"auto-csr-approver-29550102-v4g2f\" (UID: \"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2\") " pod="openshift-infra/auto-csr-approver-29550102-v4g2f" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.335158 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n6mf\" (UniqueName: \"kubernetes.io/projected/ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2-kube-api-access-9n6mf\") pod \"auto-csr-approver-29550102-v4g2f\" (UID: \"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2\") " pod="openshift-infra/auto-csr-approver-29550102-v4g2f" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.368214 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:42:00 crc kubenswrapper[4885]: E0308 21:42:00.368591 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:42:00 crc kubenswrapper[4885]: I0308 21:42:00.510665 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550102-v4g2f" Mar 08 21:42:01 crc kubenswrapper[4885]: I0308 21:42:01.067393 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550102-v4g2f"] Mar 08 21:42:01 crc kubenswrapper[4885]: I0308 21:42:01.393511 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550102-v4g2f" event={"ID":"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2","Type":"ContainerStarted","Data":"bb21f52acf1e88f6ef71055d97c4e3a5f049f05fa8b3d22063698f1482b4cc18"} Mar 08 21:42:03 crc kubenswrapper[4885]: I0308 21:42:03.422130 4885 generic.go:334] "Generic (PLEG): container finished" podID="ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2" containerID="353dc00bbe28654011d7f403f11d20bd383c705d36f6523ce418dd57c34d32f5" exitCode=0 Mar 08 21:42:03 crc kubenswrapper[4885]: I0308 21:42:03.422190 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550102-v4g2f" event={"ID":"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2","Type":"ContainerDied","Data":"353dc00bbe28654011d7f403f11d20bd383c705d36f6523ce418dd57c34d32f5"} Mar 08 21:42:04 crc kubenswrapper[4885]: I0308 21:42:04.888794 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550102-v4g2f" Mar 08 21:42:05 crc kubenswrapper[4885]: I0308 21:42:05.033712 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n6mf\" (UniqueName: \"kubernetes.io/projected/ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2-kube-api-access-9n6mf\") pod \"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2\" (UID: \"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2\") " Mar 08 21:42:05 crc kubenswrapper[4885]: I0308 21:42:05.041165 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2-kube-api-access-9n6mf" (OuterVolumeSpecName: "kube-api-access-9n6mf") pod "ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2" (UID: "ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2"). InnerVolumeSpecName "kube-api-access-9n6mf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:42:05 crc kubenswrapper[4885]: I0308 21:42:05.137258 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n6mf\" (UniqueName: \"kubernetes.io/projected/ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2-kube-api-access-9n6mf\") on node \"crc\" DevicePath \"\"" Mar 08 21:42:05 crc kubenswrapper[4885]: I0308 21:42:05.449586 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550102-v4g2f" event={"ID":"ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2","Type":"ContainerDied","Data":"bb21f52acf1e88f6ef71055d97c4e3a5f049f05fa8b3d22063698f1482b4cc18"} Mar 08 21:42:05 crc kubenswrapper[4885]: I0308 21:42:05.449911 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb21f52acf1e88f6ef71055d97c4e3a5f049f05fa8b3d22063698f1482b4cc18" Mar 08 21:42:05 crc kubenswrapper[4885]: I0308 21:42:05.449680 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550102-v4g2f" Mar 08 21:42:05 crc kubenswrapper[4885]: I0308 21:42:05.987999 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550096-4s5wc"] Mar 08 21:42:06 crc kubenswrapper[4885]: I0308 21:42:06.001713 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550096-4s5wc"] Mar 08 21:42:07 crc kubenswrapper[4885]: I0308 21:42:07.384444 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44e60165-e38f-4fbe-87a1-5908598e0e38" path="/var/lib/kubelet/pods/44e60165-e38f-4fbe-87a1-5908598e0e38/volumes" Mar 08 21:42:12 crc kubenswrapper[4885]: I0308 21:42:12.987684 4885 scope.go:117] "RemoveContainer" containerID="59a022bee7d69812163e1296ed21c2217e23eb0a10b094d9fb3faabfbcba446f" Mar 08 21:42:15 crc kubenswrapper[4885]: I0308 21:42:15.368522 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:42:16 crc kubenswrapper[4885]: I0308 21:42:16.596483 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"5e834ea73f0f8d2a810f3e37feb7181558037ef9392084f1f1fea4e210684a34"} Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.281992 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-js2l5"] Mar 08 21:42:33 crc kubenswrapper[4885]: E0308 21:42:33.283081 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2" containerName="oc" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.283097 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2" containerName="oc" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.283353 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2" containerName="oc" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.290852 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.327123 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-js2l5"] Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.412096 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jl6v\" (UniqueName: \"kubernetes.io/projected/72aa9440-1700-4412-a459-8a62c4ee863b-kube-api-access-8jl6v\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.412161 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-catalog-content\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.412189 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-utilities\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.514349 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-catalog-content\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.514400 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-utilities\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.514583 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jl6v\" (UniqueName: \"kubernetes.io/projected/72aa9440-1700-4412-a459-8a62c4ee863b-kube-api-access-8jl6v\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.514997 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-utilities\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.515140 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-catalog-content\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.551202 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8jl6v\" (UniqueName: \"kubernetes.io/projected/72aa9440-1700-4412-a459-8a62c4ee863b-kube-api-access-8jl6v\") pod \"certified-operators-js2l5\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:33 crc kubenswrapper[4885]: I0308 21:42:33.622441 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:34 crc kubenswrapper[4885]: I0308 21:42:34.155051 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-js2l5"] Mar 08 21:42:34 crc kubenswrapper[4885]: I0308 21:42:34.834486 4885 generic.go:334] "Generic (PLEG): container finished" podID="72aa9440-1700-4412-a459-8a62c4ee863b" containerID="11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab" exitCode=0 Mar 08 21:42:34 crc kubenswrapper[4885]: I0308 21:42:34.834868 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js2l5" event={"ID":"72aa9440-1700-4412-a459-8a62c4ee863b","Type":"ContainerDied","Data":"11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab"} Mar 08 21:42:34 crc kubenswrapper[4885]: I0308 21:42:34.834905 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js2l5" event={"ID":"72aa9440-1700-4412-a459-8a62c4ee863b","Type":"ContainerStarted","Data":"5ffb02efe981c8b3a4fd909c1a9ac7dac5ae00bcc0dc55e21f8153d606172eb7"} Mar 08 21:42:36 crc kubenswrapper[4885]: I0308 21:42:36.868866 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js2l5" event={"ID":"72aa9440-1700-4412-a459-8a62c4ee863b","Type":"ContainerStarted","Data":"02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079"} Mar 08 21:42:37 crc kubenswrapper[4885]: I0308 21:42:37.884117 4885 generic.go:334] "Generic (PLEG): container finished" podID="72aa9440-1700-4412-a459-8a62c4ee863b" containerID="02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079" exitCode=0 Mar 08 21:42:37 crc kubenswrapper[4885]: I0308 21:42:37.884211 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js2l5" event={"ID":"72aa9440-1700-4412-a459-8a62c4ee863b","Type":"ContainerDied","Data":"02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079"} Mar 08 21:42:38 crc kubenswrapper[4885]: I0308 21:42:38.901997 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js2l5" event={"ID":"72aa9440-1700-4412-a459-8a62c4ee863b","Type":"ContainerStarted","Data":"4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e"} Mar 08 21:42:38 crc kubenswrapper[4885]: I0308 21:42:38.933724 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-js2l5" podStartSLOduration=2.362670671 podStartE2EDuration="5.933710615s" podCreationTimestamp="2026-03-08 21:42:33 +0000 UTC" firstStartedPulling="2026-03-08 21:42:34.839144426 +0000 UTC m=+7856.235198489" lastFinishedPulling="2026-03-08 21:42:38.41018441 +0000 UTC m=+7859.806238433" observedRunningTime="2026-03-08 21:42:38.930601722 +0000 UTC m=+7860.326655775" watchObservedRunningTime="2026-03-08 21:42:38.933710615 +0000 UTC m=+7860.329764638" Mar 08 21:42:43 crc kubenswrapper[4885]: I0308 21:42:43.623124 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:43 crc kubenswrapper[4885]: I0308 21:42:43.623907 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:43 crc kubenswrapper[4885]: I0308 21:42:43.721151 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:44 crc kubenswrapper[4885]: I0308 21:42:44.072750 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:44 crc kubenswrapper[4885]: I0308 21:42:44.155626 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-js2l5"] Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.046291 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-js2l5" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="registry-server" containerID="cri-o://4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e" gracePeriod=2 Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.604467 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.787186 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jl6v\" (UniqueName: \"kubernetes.io/projected/72aa9440-1700-4412-a459-8a62c4ee863b-kube-api-access-8jl6v\") pod \"72aa9440-1700-4412-a459-8a62c4ee863b\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.787516 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-catalog-content\") pod \"72aa9440-1700-4412-a459-8a62c4ee863b\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.787612 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-utilities\") pod \"72aa9440-1700-4412-a459-8a62c4ee863b\" (UID: \"72aa9440-1700-4412-a459-8a62c4ee863b\") " Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.788741 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-utilities" (OuterVolumeSpecName: "utilities") pod "72aa9440-1700-4412-a459-8a62c4ee863b" (UID: "72aa9440-1700-4412-a459-8a62c4ee863b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.799718 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72aa9440-1700-4412-a459-8a62c4ee863b-kube-api-access-8jl6v" (OuterVolumeSpecName: "kube-api-access-8jl6v") pod "72aa9440-1700-4412-a459-8a62c4ee863b" (UID: "72aa9440-1700-4412-a459-8a62c4ee863b"). InnerVolumeSpecName "kube-api-access-8jl6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.858229 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72aa9440-1700-4412-a459-8a62c4ee863b" (UID: "72aa9440-1700-4412-a459-8a62c4ee863b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.890237 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jl6v\" (UniqueName: \"kubernetes.io/projected/72aa9440-1700-4412-a459-8a62c4ee863b-kube-api-access-8jl6v\") on node \"crc\" DevicePath \"\"" Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.890277 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:42:46 crc kubenswrapper[4885]: I0308 21:42:46.890286 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72aa9440-1700-4412-a459-8a62c4ee863b-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.059391 4885 generic.go:334] "Generic (PLEG): container finished" podID="72aa9440-1700-4412-a459-8a62c4ee863b" containerID="4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e" exitCode=0 Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.059446 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js2l5" event={"ID":"72aa9440-1700-4412-a459-8a62c4ee863b","Type":"ContainerDied","Data":"4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e"} Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.059462 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-js2l5" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.059482 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js2l5" event={"ID":"72aa9440-1700-4412-a459-8a62c4ee863b","Type":"ContainerDied","Data":"5ffb02efe981c8b3a4fd909c1a9ac7dac5ae00bcc0dc55e21f8153d606172eb7"} Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.059500 4885 scope.go:117] "RemoveContainer" containerID="4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.085881 4885 scope.go:117] "RemoveContainer" containerID="02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.105014 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-js2l5"] Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.118372 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-js2l5"] Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.130546 4885 scope.go:117] "RemoveContainer" containerID="11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.169566 4885 scope.go:117] "RemoveContainer" containerID="4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e" Mar 08 21:42:47 crc kubenswrapper[4885]: E0308 21:42:47.170245 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e\": container with ID starting with 4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e not found: ID does not exist" containerID="4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.170303 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e"} err="failed to get container status \"4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e\": rpc error: code = NotFound desc = could not find container \"4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e\": container with ID starting with 4e46c48b110ecfca885c755c950dd6d265fce267bc479eeb0e97cde26302063e not found: ID does not exist" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.170336 4885 scope.go:117] "RemoveContainer" containerID="02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079" Mar 08 21:42:47 crc kubenswrapper[4885]: E0308 21:42:47.170805 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079\": container with ID starting with 02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079 not found: ID does not exist" containerID="02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.170841 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079"} err="failed to get container status \"02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079\": rpc error: code = NotFound desc = could not find 
container \"02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079\": container with ID starting with 02aad4f6086812ec51cb3f4c38cb19cdc0c87d6b9c9e269f4012ff3970edb079 not found: ID does not exist" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.170865 4885 scope.go:117] "RemoveContainer" containerID="11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab" Mar 08 21:42:47 crc kubenswrapper[4885]: E0308 21:42:47.171156 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab\": container with ID starting with 11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab not found: ID does not exist" containerID="11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.171187 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab"} err="failed to get container status \"11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab\": rpc error: code = NotFound desc = could not find container \"11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab\": container with ID starting with 11f94bdfcaa2656f36a63b2854e5f1bfe7f8f8886582514463be8cf65ae8c3ab not found: ID does not exist" Mar 08 21:42:47 crc kubenswrapper[4885]: I0308 21:42:47.382640 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" path="/var/lib/kubelet/pods/72aa9440-1700-4412-a459-8a62c4ee863b/volumes" Mar 08 21:43:02 crc kubenswrapper[4885]: I0308 21:43:02.866423 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fs7zb"] Mar 08 21:43:02 crc kubenswrapper[4885]: E0308 21:43:02.867528 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="extract-utilities" Mar 08 21:43:02 crc kubenswrapper[4885]: I0308 21:43:02.867546 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="extract-utilities" Mar 08 21:43:02 crc kubenswrapper[4885]: E0308 21:43:02.867567 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="registry-server" Mar 08 21:43:02 crc kubenswrapper[4885]: I0308 21:43:02.867575 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="registry-server" Mar 08 21:43:02 crc kubenswrapper[4885]: E0308 21:43:02.867590 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="extract-content" Mar 08 21:43:02 crc kubenswrapper[4885]: I0308 21:43:02.867599 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="extract-content" Mar 08 21:43:02 crc kubenswrapper[4885]: I0308 21:43:02.867899 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="72aa9440-1700-4412-a459-8a62c4ee863b" containerName="registry-server" Mar 08 21:43:02 crc kubenswrapper[4885]: I0308 21:43:02.869853 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:02 crc kubenswrapper[4885]: I0308 21:43:02.894050 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fs7zb"] Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.061274 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-utilities\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.061607 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlwtq\" (UniqueName: \"kubernetes.io/projected/f64d2024-4751-4d9e-8565-bd06234f5388-kube-api-access-tlwtq\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.061647 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-catalog-content\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.163975 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-utilities\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.164045 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlwtq\" (UniqueName: \"kubernetes.io/projected/f64d2024-4751-4d9e-8565-bd06234f5388-kube-api-access-tlwtq\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.164096 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-catalog-content\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.164498 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-utilities\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.164713 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-catalog-content\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.185180 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-tlwtq\" (UniqueName: \"kubernetes.io/projected/f64d2024-4751-4d9e-8565-bd06234f5388-kube-api-access-tlwtq\") pod \"redhat-marketplace-fs7zb\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.202531 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:03 crc kubenswrapper[4885]: I0308 21:43:03.690691 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fs7zb"] Mar 08 21:43:04 crc kubenswrapper[4885]: I0308 21:43:04.295867 4885 generic.go:334] "Generic (PLEG): container finished" podID="f64d2024-4751-4d9e-8565-bd06234f5388" containerID="f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90" exitCode=0 Mar 08 21:43:04 crc kubenswrapper[4885]: I0308 21:43:04.296176 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs7zb" event={"ID":"f64d2024-4751-4d9e-8565-bd06234f5388","Type":"ContainerDied","Data":"f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90"} Mar 08 21:43:04 crc kubenswrapper[4885]: I0308 21:43:04.296207 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs7zb" event={"ID":"f64d2024-4751-4d9e-8565-bd06234f5388","Type":"ContainerStarted","Data":"0b7182a340904e31bab48ebea5ffe43ac333be6ef60964d49770c8a77207a9b7"} Mar 08 21:43:05 crc kubenswrapper[4885]: I0308 21:43:05.312990 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs7zb" event={"ID":"f64d2024-4751-4d9e-8565-bd06234f5388","Type":"ContainerStarted","Data":"4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec"} Mar 08 21:43:06 crc kubenswrapper[4885]: I0308 21:43:06.325711 4885 generic.go:334] "Generic (PLEG): container finished" podID="f64d2024-4751-4d9e-8565-bd06234f5388" containerID="4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec" exitCode=0 Mar 08 21:43:06 crc kubenswrapper[4885]: I0308 21:43:06.325761 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs7zb" event={"ID":"f64d2024-4751-4d9e-8565-bd06234f5388","Type":"ContainerDied","Data":"4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec"} Mar 08 21:43:07 crc kubenswrapper[4885]: I0308 21:43:07.351555 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs7zb" event={"ID":"f64d2024-4751-4d9e-8565-bd06234f5388","Type":"ContainerStarted","Data":"70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e"} Mar 08 21:43:07 crc kubenswrapper[4885]: I0308 21:43:07.380540 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fs7zb" podStartSLOduration=2.9091557200000002 podStartE2EDuration="5.380521391s" podCreationTimestamp="2026-03-08 21:43:02 +0000 UTC" firstStartedPulling="2026-03-08 21:43:04.298425579 +0000 UTC m=+7885.694479622" lastFinishedPulling="2026-03-08 21:43:06.76979124 +0000 UTC m=+7888.165845293" observedRunningTime="2026-03-08 21:43:07.373802321 +0000 UTC m=+7888.769856344" watchObservedRunningTime="2026-03-08 21:43:07.380521391 +0000 UTC m=+7888.776575414" Mar 08 21:43:13 crc kubenswrapper[4885]: I0308 21:43:13.203174 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:13 crc kubenswrapper[4885]: I0308 21:43:13.203773 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:13 crc kubenswrapper[4885]: I0308 21:43:13.280500 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:13 crc kubenswrapper[4885]: I0308 21:43:13.505899 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:13 crc kubenswrapper[4885]: I0308 21:43:13.585104 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fs7zb"] Mar 08 21:43:15 crc kubenswrapper[4885]: I0308 21:43:15.451817 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fs7zb" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="registry-server" containerID="cri-o://70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e" gracePeriod=2 Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.162979 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.302137 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlwtq\" (UniqueName: \"kubernetes.io/projected/f64d2024-4751-4d9e-8565-bd06234f5388-kube-api-access-tlwtq\") pod \"f64d2024-4751-4d9e-8565-bd06234f5388\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.302373 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-utilities\") pod \"f64d2024-4751-4d9e-8565-bd06234f5388\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.302622 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-catalog-content\") pod \"f64d2024-4751-4d9e-8565-bd06234f5388\" (UID: \"f64d2024-4751-4d9e-8565-bd06234f5388\") " Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.303695 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-utilities" (OuterVolumeSpecName: "utilities") pod "f64d2024-4751-4d9e-8565-bd06234f5388" (UID: "f64d2024-4751-4d9e-8565-bd06234f5388"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.311276 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f64d2024-4751-4d9e-8565-bd06234f5388-kube-api-access-tlwtq" (OuterVolumeSpecName: "kube-api-access-tlwtq") pod "f64d2024-4751-4d9e-8565-bd06234f5388" (UID: "f64d2024-4751-4d9e-8565-bd06234f5388"). InnerVolumeSpecName "kube-api-access-tlwtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.312479 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlwtq\" (UniqueName: \"kubernetes.io/projected/f64d2024-4751-4d9e-8565-bd06234f5388-kube-api-access-tlwtq\") on node \"crc\" DevicePath \"\"" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.312560 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.340295 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f64d2024-4751-4d9e-8565-bd06234f5388" (UID: "f64d2024-4751-4d9e-8565-bd06234f5388"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.413937 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f64d2024-4751-4d9e-8565-bd06234f5388-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.464748 4885 generic.go:334] "Generic (PLEG): container finished" podID="f64d2024-4751-4d9e-8565-bd06234f5388" containerID="70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e" exitCode=0 Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.464799 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs7zb" event={"ID":"f64d2024-4751-4d9e-8565-bd06234f5388","Type":"ContainerDied","Data":"70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e"} Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.464828 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fs7zb" event={"ID":"f64d2024-4751-4d9e-8565-bd06234f5388","Type":"ContainerDied","Data":"0b7182a340904e31bab48ebea5ffe43ac333be6ef60964d49770c8a77207a9b7"} Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.464847 4885 scope.go:117] "RemoveContainer" containerID="70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.465080 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fs7zb" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.509127 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fs7zb"] Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.511700 4885 scope.go:117] "RemoveContainer" containerID="4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.519746 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fs7zb"] Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.540209 4885 scope.go:117] "RemoveContainer" containerID="f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.601434 4885 scope.go:117] "RemoveContainer" containerID="70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e" Mar 08 21:43:16 crc kubenswrapper[4885]: E0308 21:43:16.602244 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e\": container with ID starting with 70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e not found: ID does not exist" containerID="70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.602370 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e"} err="failed to get container status \"70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e\": rpc error: code = NotFound desc = could not find container \"70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e\": container with ID starting with 70fc3f767115b84c571a1b7df8819922497540a7684ce43b352222c0ee374b7e not found: ID does not exist" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.602487 4885 scope.go:117] "RemoveContainer" containerID="4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec" Mar 08 21:43:16 crc kubenswrapper[4885]: E0308 21:43:16.603195 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec\": container with ID starting with 4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec not found: ID does not exist" containerID="4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.603272 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec"} err="failed to get container status \"4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec\": rpc error: code = NotFound desc = could not find container \"4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec\": container with ID starting with 4104e1ca6a8561b68a300d51a65c70ce99a2c19fbaea43e29343c7a5091846ec not found: ID does not exist" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.603322 4885 scope.go:117] "RemoveContainer" containerID="f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90" Mar 08 21:43:16 crc kubenswrapper[4885]: E0308 21:43:16.603852 4885 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90\": container with ID starting with f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90 not found: ID does not exist" containerID="f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90" Mar 08 21:43:16 crc kubenswrapper[4885]: I0308 21:43:16.603890 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90"} err="failed to get container status \"f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90\": rpc error: code = NotFound desc = could not find container \"f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90\": container with ID starting with f620d4e44d37721a145064b69d8be6e4c8796319d3a6a3d0c13a985f09fdea90 not found: ID does not exist" Mar 08 21:43:17 crc kubenswrapper[4885]: I0308 21:43:17.379221 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" path="/var/lib/kubelet/pods/f64d2024-4751-4d9e-8565-bd06234f5388/volumes" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.158905 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550104-7bjtf"] Mar 08 21:44:00 crc kubenswrapper[4885]: E0308 21:44:00.160247 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="registry-server" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.160273 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="registry-server" Mar 08 21:44:00 crc kubenswrapper[4885]: E0308 21:44:00.160339 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="extract-utilities" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.160353 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="extract-utilities" Mar 08 21:44:00 crc kubenswrapper[4885]: E0308 21:44:00.160395 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="extract-content" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.160409 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="extract-content" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.160840 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f64d2024-4751-4d9e-8565-bd06234f5388" containerName="registry-server" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.162296 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550104-7bjtf" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.166692 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.166946 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.173504 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.176495 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550104-7bjtf"] Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.295411 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5r9l\" (UniqueName: \"kubernetes.io/projected/9887f7af-0796-4481-aa3d-5f4996f9ed47-kube-api-access-s5r9l\") pod \"auto-csr-approver-29550104-7bjtf\" (UID: \"9887f7af-0796-4481-aa3d-5f4996f9ed47\") " pod="openshift-infra/auto-csr-approver-29550104-7bjtf" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.397722 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5r9l\" (UniqueName: \"kubernetes.io/projected/9887f7af-0796-4481-aa3d-5f4996f9ed47-kube-api-access-s5r9l\") pod \"auto-csr-approver-29550104-7bjtf\" (UID: \"9887f7af-0796-4481-aa3d-5f4996f9ed47\") " pod="openshift-infra/auto-csr-approver-29550104-7bjtf" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.417244 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5r9l\" (UniqueName: \"kubernetes.io/projected/9887f7af-0796-4481-aa3d-5f4996f9ed47-kube-api-access-s5r9l\") pod \"auto-csr-approver-29550104-7bjtf\" (UID: \"9887f7af-0796-4481-aa3d-5f4996f9ed47\") " pod="openshift-infra/auto-csr-approver-29550104-7bjtf" Mar 08 21:44:00 crc kubenswrapper[4885]: I0308 21:44:00.485580 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550104-7bjtf" Mar 08 21:44:01 crc kubenswrapper[4885]: I0308 21:44:01.000014 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550104-7bjtf"] Mar 08 21:44:01 crc kubenswrapper[4885]: I0308 21:44:01.067232 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550104-7bjtf" event={"ID":"9887f7af-0796-4481-aa3d-5f4996f9ed47","Type":"ContainerStarted","Data":"ee0809ff828598e522807f85caad1a38502c44983ae469e801fb1077a072ad13"} Mar 08 21:44:03 crc kubenswrapper[4885]: I0308 21:44:03.087727 4885 generic.go:334] "Generic (PLEG): container finished" podID="9887f7af-0796-4481-aa3d-5f4996f9ed47" containerID="dc8414267440eda43954aa07f3b3a3139275d86a35c8b1f3e17198abfe4f8b5c" exitCode=0 Mar 08 21:44:03 crc kubenswrapper[4885]: I0308 21:44:03.087856 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550104-7bjtf" event={"ID":"9887f7af-0796-4481-aa3d-5f4996f9ed47","Type":"ContainerDied","Data":"dc8414267440eda43954aa07f3b3a3139275d86a35c8b1f3e17198abfe4f8b5c"} Mar 08 21:44:04 crc kubenswrapper[4885]: I0308 21:44:04.573971 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550104-7bjtf" Mar 08 21:44:04 crc kubenswrapper[4885]: I0308 21:44:04.587265 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5r9l\" (UniqueName: \"kubernetes.io/projected/9887f7af-0796-4481-aa3d-5f4996f9ed47-kube-api-access-s5r9l\") pod \"9887f7af-0796-4481-aa3d-5f4996f9ed47\" (UID: \"9887f7af-0796-4481-aa3d-5f4996f9ed47\") " Mar 08 21:44:04 crc kubenswrapper[4885]: I0308 21:44:04.603343 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9887f7af-0796-4481-aa3d-5f4996f9ed47-kube-api-access-s5r9l" (OuterVolumeSpecName: "kube-api-access-s5r9l") pod "9887f7af-0796-4481-aa3d-5f4996f9ed47" (UID: "9887f7af-0796-4481-aa3d-5f4996f9ed47"). InnerVolumeSpecName "kube-api-access-s5r9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:44:04 crc kubenswrapper[4885]: I0308 21:44:04.692123 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5r9l\" (UniqueName: \"kubernetes.io/projected/9887f7af-0796-4481-aa3d-5f4996f9ed47-kube-api-access-s5r9l\") on node \"crc\" DevicePath \"\"" Mar 08 21:44:05 crc kubenswrapper[4885]: I0308 21:44:05.115045 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550104-7bjtf" event={"ID":"9887f7af-0796-4481-aa3d-5f4996f9ed47","Type":"ContainerDied","Data":"ee0809ff828598e522807f85caad1a38502c44983ae469e801fb1077a072ad13"} Mar 08 21:44:05 crc kubenswrapper[4885]: I0308 21:44:05.115121 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee0809ff828598e522807f85caad1a38502c44983ae469e801fb1077a072ad13" Mar 08 21:44:05 crc kubenswrapper[4885]: I0308 21:44:05.115199 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550104-7bjtf" Mar 08 21:44:05 crc kubenswrapper[4885]: I0308 21:44:05.685160 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550098-47g92"] Mar 08 21:44:05 crc kubenswrapper[4885]: I0308 21:44:05.696028 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550098-47g92"] Mar 08 21:44:07 crc kubenswrapper[4885]: I0308 21:44:07.413381 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67c51bcd-c065-4fa7-8318-0d0704836166" path="/var/lib/kubelet/pods/67c51bcd-c065-4fa7-8318-0d0704836166/volumes" Mar 08 21:44:13 crc kubenswrapper[4885]: I0308 21:44:13.149342 4885 scope.go:117] "RemoveContainer" containerID="2fc2dee49966150d464450c5304d2011d968fb7949c03e2bf89d92f9c82630c7" Mar 08 21:44:32 crc kubenswrapper[4885]: I0308 21:44:32.817968 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:44:32 crc kubenswrapper[4885]: I0308 21:44:32.819045 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.164337 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6"] Mar 08 21:45:00 crc kubenswrapper[4885]: E0308 21:45:00.165436 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9887f7af-0796-4481-aa3d-5f4996f9ed47" containerName="oc" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.165453 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9887f7af-0796-4481-aa3d-5f4996f9ed47" containerName="oc" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.165707 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9887f7af-0796-4481-aa3d-5f4996f9ed47" containerName="oc" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.166598 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.170601 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.170857 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.176065 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6"] Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.250751 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwp26\" (UniqueName: \"kubernetes.io/projected/1e47a3ad-b255-4044-b68b-42a99706339d-kube-api-access-gwp26\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.251264 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e47a3ad-b255-4044-b68b-42a99706339d-config-volume\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.251377 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e47a3ad-b255-4044-b68b-42a99706339d-secret-volume\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.353598 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwp26\" (UniqueName: \"kubernetes.io/projected/1e47a3ad-b255-4044-b68b-42a99706339d-kube-api-access-gwp26\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.353648 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e47a3ad-b255-4044-b68b-42a99706339d-config-volume\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.353705 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e47a3ad-b255-4044-b68b-42a99706339d-secret-volume\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.354696 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e47a3ad-b255-4044-b68b-42a99706339d-config-volume\") pod 
\"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.360796 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e47a3ad-b255-4044-b68b-42a99706339d-secret-volume\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.377202 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwp26\" (UniqueName: \"kubernetes.io/projected/1e47a3ad-b255-4044-b68b-42a99706339d-kube-api-access-gwp26\") pod \"collect-profiles-29550105-8tjd6\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:00 crc kubenswrapper[4885]: I0308 21:45:00.496381 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:01 crc kubenswrapper[4885]: I0308 21:45:01.006899 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6"] Mar 08 21:45:01 crc kubenswrapper[4885]: W0308 21:45:01.011226 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e47a3ad_b255_4044_b68b_42a99706339d.slice/crio-362995c739d2874fc4e2b8639fed3ed6179cbd1a45288b99d7cd14a62ceb1a14 WatchSource:0}: Error finding container 362995c739d2874fc4e2b8639fed3ed6179cbd1a45288b99d7cd14a62ceb1a14: Status 404 returned error can't find the container with id 362995c739d2874fc4e2b8639fed3ed6179cbd1a45288b99d7cd14a62ceb1a14 Mar 08 21:45:01 crc kubenswrapper[4885]: I0308 21:45:01.877724 4885 generic.go:334] "Generic (PLEG): container finished" podID="1e47a3ad-b255-4044-b68b-42a99706339d" containerID="7169885beaf1c90ab8d33a9c81aa17dfc1d7ecd0abd67f01755ae2b597cfaa85" exitCode=0 Mar 08 21:45:01 crc kubenswrapper[4885]: I0308 21:45:01.877950 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" event={"ID":"1e47a3ad-b255-4044-b68b-42a99706339d","Type":"ContainerDied","Data":"7169885beaf1c90ab8d33a9c81aa17dfc1d7ecd0abd67f01755ae2b597cfaa85"} Mar 08 21:45:01 crc kubenswrapper[4885]: I0308 21:45:01.878162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" event={"ID":"1e47a3ad-b255-4044-b68b-42a99706339d","Type":"ContainerStarted","Data":"362995c739d2874fc4e2b8639fed3ed6179cbd1a45288b99d7cd14a62ceb1a14"} Mar 08 21:45:02 crc kubenswrapper[4885]: I0308 21:45:02.818589 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:45:02 crc kubenswrapper[4885]: I0308 21:45:02.818887 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.345422 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.521451 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e47a3ad-b255-4044-b68b-42a99706339d-secret-volume\") pod \"1e47a3ad-b255-4044-b68b-42a99706339d\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.521541 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwp26\" (UniqueName: \"kubernetes.io/projected/1e47a3ad-b255-4044-b68b-42a99706339d-kube-api-access-gwp26\") pod \"1e47a3ad-b255-4044-b68b-42a99706339d\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.521580 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e47a3ad-b255-4044-b68b-42a99706339d-config-volume\") pod \"1e47a3ad-b255-4044-b68b-42a99706339d\" (UID: \"1e47a3ad-b255-4044-b68b-42a99706339d\") " Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.522709 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e47a3ad-b255-4044-b68b-42a99706339d-config-volume" (OuterVolumeSpecName: "config-volume") pod "1e47a3ad-b255-4044-b68b-42a99706339d" (UID: "1e47a3ad-b255-4044-b68b-42a99706339d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.529638 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e47a3ad-b255-4044-b68b-42a99706339d-kube-api-access-gwp26" (OuterVolumeSpecName: "kube-api-access-gwp26") pod "1e47a3ad-b255-4044-b68b-42a99706339d" (UID: "1e47a3ad-b255-4044-b68b-42a99706339d"). InnerVolumeSpecName "kube-api-access-gwp26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.531598 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e47a3ad-b255-4044-b68b-42a99706339d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1e47a3ad-b255-4044-b68b-42a99706339d" (UID: "1e47a3ad-b255-4044-b68b-42a99706339d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.625516 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1e47a3ad-b255-4044-b68b-42a99706339d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.625574 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwp26\" (UniqueName: \"kubernetes.io/projected/1e47a3ad-b255-4044-b68b-42a99706339d-kube-api-access-gwp26\") on node \"crc\" DevicePath \"\"" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.625595 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e47a3ad-b255-4044-b68b-42a99706339d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.906602 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" event={"ID":"1e47a3ad-b255-4044-b68b-42a99706339d","Type":"ContainerDied","Data":"362995c739d2874fc4e2b8639fed3ed6179cbd1a45288b99d7cd14a62ceb1a14"} Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.907079 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="362995c739d2874fc4e2b8639fed3ed6179cbd1a45288b99d7cd14a62ceb1a14" Mar 08 21:45:03 crc kubenswrapper[4885]: I0308 21:45:03.906705 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550105-8tjd6" Mar 08 21:45:04 crc kubenswrapper[4885]: I0308 21:45:04.462782 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"] Mar 08 21:45:04 crc kubenswrapper[4885]: I0308 21:45:04.475857 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550060-4d829"] Mar 08 21:45:05 crc kubenswrapper[4885]: I0308 21:45:05.390340 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62c3bcc1-5dd6-411d-8030-a152617aa0a3" path="/var/lib/kubelet/pods/62c3bcc1-5dd6-411d-8030-a152617aa0a3/volumes" Mar 08 21:45:13 crc kubenswrapper[4885]: I0308 21:45:13.284161 4885 scope.go:117] "RemoveContainer" containerID="e07b444034fa8d1cd5c5dd9ad29413db942d04d6d0dccd4d4f03d228986183ea" Mar 08 21:45:32 crc kubenswrapper[4885]: I0308 21:45:32.818733 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:45:32 crc kubenswrapper[4885]: I0308 21:45:32.819334 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:45:32 crc kubenswrapper[4885]: I0308 21:45:32.819407 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:45:32 crc kubenswrapper[4885]: I0308 21:45:32.820653 4885 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5e834ea73f0f8d2a810f3e37feb7181558037ef9392084f1f1fea4e210684a34"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:45:32 crc kubenswrapper[4885]: I0308 21:45:32.820750 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://5e834ea73f0f8d2a810f3e37feb7181558037ef9392084f1f1fea4e210684a34" gracePeriod=600 Mar 08 21:45:33 crc kubenswrapper[4885]: I0308 21:45:33.259492 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="5e834ea73f0f8d2a810f3e37feb7181558037ef9392084f1f1fea4e210684a34" exitCode=0 Mar 08 21:45:33 crc kubenswrapper[4885]: I0308 21:45:33.259575 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"5e834ea73f0f8d2a810f3e37feb7181558037ef9392084f1f1fea4e210684a34"} Mar 08 21:45:33 crc kubenswrapper[4885]: I0308 21:45:33.260206 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732"} Mar 08 21:45:33 crc kubenswrapper[4885]: I0308 21:45:33.260345 4885 scope.go:117] "RemoveContainer" containerID="8d16a688d646d8459412934f47e88a8aa0113b747ae366583498877782c74fd9" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.198233 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550106-t45l9"] Mar 08 21:46:00 crc kubenswrapper[4885]: E0308 21:46:00.199684 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e47a3ad-b255-4044-b68b-42a99706339d" containerName="collect-profiles" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.199708 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e47a3ad-b255-4044-b68b-42a99706339d" containerName="collect-profiles" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.200174 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e47a3ad-b255-4044-b68b-42a99706339d" containerName="collect-profiles" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.201532 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550106-t45l9" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.204519 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.204599 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.204715 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.215984 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550106-t45l9"] Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.278177 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqs4m\" (UniqueName: \"kubernetes.io/projected/7e0c5067-1184-4a75-a80b-35b1d03f2a47-kube-api-access-gqs4m\") pod \"auto-csr-approver-29550106-t45l9\" (UID: \"7e0c5067-1184-4a75-a80b-35b1d03f2a47\") " pod="openshift-infra/auto-csr-approver-29550106-t45l9" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.380089 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqs4m\" (UniqueName: \"kubernetes.io/projected/7e0c5067-1184-4a75-a80b-35b1d03f2a47-kube-api-access-gqs4m\") pod \"auto-csr-approver-29550106-t45l9\" (UID: \"7e0c5067-1184-4a75-a80b-35b1d03f2a47\") " pod="openshift-infra/auto-csr-approver-29550106-t45l9" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.399667 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqs4m\" (UniqueName: \"kubernetes.io/projected/7e0c5067-1184-4a75-a80b-35b1d03f2a47-kube-api-access-gqs4m\") pod \"auto-csr-approver-29550106-t45l9\" (UID: \"7e0c5067-1184-4a75-a80b-35b1d03f2a47\") " pod="openshift-infra/auto-csr-approver-29550106-t45l9" Mar 08 21:46:00 crc kubenswrapper[4885]: I0308 21:46:00.540167 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550106-t45l9" Mar 08 21:46:01 crc kubenswrapper[4885]: I0308 21:46:01.086134 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550106-t45l9"] Mar 08 21:46:01 crc kubenswrapper[4885]: W0308 21:46:01.094138 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e0c5067_1184_4a75_a80b_35b1d03f2a47.slice/crio-94dedf7fe8110058a0893f00b962c87725b7b729e1acd5480005c03497dec56a WatchSource:0}: Error finding container 94dedf7fe8110058a0893f00b962c87725b7b729e1acd5480005c03497dec56a: Status 404 returned error can't find the container with id 94dedf7fe8110058a0893f00b962c87725b7b729e1acd5480005c03497dec56a Mar 08 21:46:01 crc kubenswrapper[4885]: I0308 21:46:01.631177 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550106-t45l9" event={"ID":"7e0c5067-1184-4a75-a80b-35b1d03f2a47","Type":"ContainerStarted","Data":"94dedf7fe8110058a0893f00b962c87725b7b729e1acd5480005c03497dec56a"} Mar 08 21:46:03 crc kubenswrapper[4885]: I0308 21:46:03.658125 4885 generic.go:334] "Generic (PLEG): container finished" podID="7e0c5067-1184-4a75-a80b-35b1d03f2a47" containerID="ac6daccc4f232068706f6a472830b30704b48c1d2e179f38f98cf3dcf6cbd7b0" exitCode=0 Mar 08 21:46:03 crc kubenswrapper[4885]: I0308 21:46:03.658207 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550106-t45l9" event={"ID":"7e0c5067-1184-4a75-a80b-35b1d03f2a47","Type":"ContainerDied","Data":"ac6daccc4f232068706f6a472830b30704b48c1d2e179f38f98cf3dcf6cbd7b0"} Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.176286 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550106-t45l9" Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.336361 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqs4m\" (UniqueName: \"kubernetes.io/projected/7e0c5067-1184-4a75-a80b-35b1d03f2a47-kube-api-access-gqs4m\") pod \"7e0c5067-1184-4a75-a80b-35b1d03f2a47\" (UID: \"7e0c5067-1184-4a75-a80b-35b1d03f2a47\") " Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.342337 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e0c5067-1184-4a75-a80b-35b1d03f2a47-kube-api-access-gqs4m" (OuterVolumeSpecName: "kube-api-access-gqs4m") pod "7e0c5067-1184-4a75-a80b-35b1d03f2a47" (UID: "7e0c5067-1184-4a75-a80b-35b1d03f2a47"). InnerVolumeSpecName "kube-api-access-gqs4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.439515 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqs4m\" (UniqueName: \"kubernetes.io/projected/7e0c5067-1184-4a75-a80b-35b1d03f2a47-kube-api-access-gqs4m\") on node \"crc\" DevicePath \"\"" Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.692670 4885 generic.go:334] "Generic (PLEG): container finished" podID="992d3500-f892-42c6-805f-ae9c96793d0f" containerID="135b71ffe2c5954d98583a529a1ec42b83ce4fb4d31e613f230e8d7339fd376b" exitCode=0 Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.692809 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" event={"ID":"992d3500-f892-42c6-805f-ae9c96793d0f","Type":"ContainerDied","Data":"135b71ffe2c5954d98583a529a1ec42b83ce4fb4d31e613f230e8d7339fd376b"} Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.697132 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550106-t45l9" event={"ID":"7e0c5067-1184-4a75-a80b-35b1d03f2a47","Type":"ContainerDied","Data":"94dedf7fe8110058a0893f00b962c87725b7b729e1acd5480005c03497dec56a"} Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.697182 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94dedf7fe8110058a0893f00b962c87725b7b729e1acd5480005c03497dec56a" Mar 08 21:46:05 crc kubenswrapper[4885]: I0308 21:46:05.697190 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550106-t45l9" Mar 08 21:46:06 crc kubenswrapper[4885]: I0308 21:46:06.280915 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550100-8r628"] Mar 08 21:46:06 crc kubenswrapper[4885]: I0308 21:46:06.296324 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550100-8r628"] Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.231208 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.382411 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-combined-ca-bundle\") pod \"992d3500-f892-42c6-805f-ae9c96793d0f\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.382492 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dz7c\" (UniqueName: \"kubernetes.io/projected/992d3500-f892-42c6-805f-ae9c96793d0f-kube-api-access-2dz7c\") pod \"992d3500-f892-42c6-805f-ae9c96793d0f\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.382594 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ceph\") pod \"992d3500-f892-42c6-805f-ae9c96793d0f\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.382665 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-secret-0\") pod \"992d3500-f892-42c6-805f-ae9c96793d0f\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.382688 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ssh-key-openstack-cell1\") pod \"992d3500-f892-42c6-805f-ae9c96793d0f\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.382727 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-inventory\") pod \"992d3500-f892-42c6-805f-ae9c96793d0f\" (UID: \"992d3500-f892-42c6-805f-ae9c96793d0f\") " Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.382764 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f4584f-eeba-4b88-b31d-79a39f062bd3" path="/var/lib/kubelet/pods/70f4584f-eeba-4b88-b31d-79a39f062bd3/volumes" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.389413 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ceph" (OuterVolumeSpecName: "ceph") pod "992d3500-f892-42c6-805f-ae9c96793d0f" (UID: "992d3500-f892-42c6-805f-ae9c96793d0f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.389836 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/992d3500-f892-42c6-805f-ae9c96793d0f-kube-api-access-2dz7c" (OuterVolumeSpecName: "kube-api-access-2dz7c") pod "992d3500-f892-42c6-805f-ae9c96793d0f" (UID: "992d3500-f892-42c6-805f-ae9c96793d0f"). InnerVolumeSpecName "kube-api-access-2dz7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.401239 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "992d3500-f892-42c6-805f-ae9c96793d0f" (UID: "992d3500-f892-42c6-805f-ae9c96793d0f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.427525 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "992d3500-f892-42c6-805f-ae9c96793d0f" (UID: "992d3500-f892-42c6-805f-ae9c96793d0f"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.431998 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "992d3500-f892-42c6-805f-ae9c96793d0f" (UID: "992d3500-f892-42c6-805f-ae9c96793d0f"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.438031 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-inventory" (OuterVolumeSpecName: "inventory") pod "992d3500-f892-42c6-805f-ae9c96793d0f" (UID: "992d3500-f892-42c6-805f-ae9c96793d0f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.486552 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.486597 4885 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.486615 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.486630 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.486643 4885 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/992d3500-f892-42c6-805f-ae9c96793d0f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.486656 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dz7c\" (UniqueName: \"kubernetes.io/projected/992d3500-f892-42c6-805f-ae9c96793d0f-kube-api-access-2dz7c\") on node \"crc\" DevicePath \"\"" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.724620 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.724604 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-rd94l" event={"ID":"992d3500-f892-42c6-805f-ae9c96793d0f","Type":"ContainerDied","Data":"2ee7a4abb905c5d1b376a7b80993ad31e44b7c26d66ceb2dc9a19d787251b55b"} Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.724889 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ee7a4abb905c5d1b376a7b80993ad31e44b7c26d66ceb2dc9a19d787251b55b" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.832019 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-thkw7"] Mar 08 21:46:07 crc kubenswrapper[4885]: E0308 21:46:07.832479 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0c5067-1184-4a75-a80b-35b1d03f2a47" containerName="oc" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.832500 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0c5067-1184-4a75-a80b-35b1d03f2a47" containerName="oc" Mar 08 21:46:07 crc kubenswrapper[4885]: E0308 21:46:07.832519 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="992d3500-f892-42c6-805f-ae9c96793d0f" containerName="libvirt-openstack-openstack-cell1" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.832526 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="992d3500-f892-42c6-805f-ae9c96793d0f" containerName="libvirt-openstack-openstack-cell1" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.832729 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0c5067-1184-4a75-a80b-35b1d03f2a47" containerName="oc" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.832765 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="992d3500-f892-42c6-805f-ae9c96793d0f" containerName="libvirt-openstack-openstack-cell1" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.833519 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.835514 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.835639 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.837095 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.837140 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.837388 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.837724 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.841838 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.851782 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-thkw7"] Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.894808 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.894850 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8bgm\" (UniqueName: \"kubernetes.io/projected/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-kube-api-access-j8bgm\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895058 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895141 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895187 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ssh-key-openstack-cell1\") 
pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895209 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895230 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895313 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895358 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895389 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895445 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ceph\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895482 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:07 crc kubenswrapper[4885]: I0308 21:46:07.895506 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000109 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000206 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000249 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000279 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000350 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000404 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000441 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000487 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ceph\") pod 
\"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000536 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000573 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000613 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000638 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8bgm\" (UniqueName: \"kubernetes.io/projected/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-kube-api-access-j8bgm\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.000681 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.003672 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.003739 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.006743 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.007311 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.007458 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.010311 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.010403 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.011301 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.011461 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ceph\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.014685 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.022898 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-inventory\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.023359 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.026163 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8bgm\" (UniqueName: \"kubernetes.io/projected/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-kube-api-access-j8bgm\") pod \"nova-cell1-openstack-openstack-cell1-thkw7\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.158855 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:46:08 crc kubenswrapper[4885]: I0308 21:46:08.736138 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-thkw7"] Mar 08 21:46:08 crc kubenswrapper[4885]: W0308 21:46:08.741061 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaecb4202_1208_4ba5_8515_2ecf99c8c7d1.slice/crio-fff640cd9dc49ab765e2a27fe434f07099b8eab2a80637829232f43855d5b797 WatchSource:0}: Error finding container fff640cd9dc49ab765e2a27fe434f07099b8eab2a80637829232f43855d5b797: Status 404 returned error can't find the container with id fff640cd9dc49ab765e2a27fe434f07099b8eab2a80637829232f43855d5b797 Mar 08 21:46:09 crc kubenswrapper[4885]: I0308 21:46:09.749540 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" event={"ID":"aecb4202-1208-4ba5-8515-2ecf99c8c7d1","Type":"ContainerStarted","Data":"fcb12117529fd7a6a205069e64cae7ffc810c030e4a281bfe35c2567f1161039"} Mar 08 21:46:09 crc kubenswrapper[4885]: I0308 21:46:09.749840 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" event={"ID":"aecb4202-1208-4ba5-8515-2ecf99c8c7d1","Type":"ContainerStarted","Data":"fff640cd9dc49ab765e2a27fe434f07099b8eab2a80637829232f43855d5b797"} Mar 08 21:46:09 crc kubenswrapper[4885]: I0308 21:46:09.785503 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" podStartSLOduration=2.208969537 podStartE2EDuration="2.785482945s" podCreationTimestamp="2026-03-08 21:46:07 +0000 UTC" firstStartedPulling="2026-03-08 21:46:08.744682173 +0000 UTC m=+8070.140736206" lastFinishedPulling="2026-03-08 21:46:09.321195581 +0000 UTC m=+8070.717249614" observedRunningTime="2026-03-08 21:46:09.776465755 +0000 UTC m=+8071.172519788" watchObservedRunningTime="2026-03-08 21:46:09.785482945 +0000 UTC m=+8071.181536988" Mar 08 21:46:13 crc kubenswrapper[4885]: I0308 21:46:13.384713 4885 scope.go:117] "RemoveContainer" containerID="51c3dd5d7c040092d94f90c0958c32d896513ce729f2d08313181e0c6cc74c41" Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.148463 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550108-z4l8k"] Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.150141 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550108-z4l8k" Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.152345 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.152863 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.152892 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.174710 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550108-z4l8k"] Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.315781 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxttp\" (UniqueName: \"kubernetes.io/projected/ea33e917-a39b-4c83-b80a-9562ddbc2459-kube-api-access-hxttp\") pod \"auto-csr-approver-29550108-z4l8k\" (UID: \"ea33e917-a39b-4c83-b80a-9562ddbc2459\") " pod="openshift-infra/auto-csr-approver-29550108-z4l8k" Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.418135 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxttp\" (UniqueName: \"kubernetes.io/projected/ea33e917-a39b-4c83-b80a-9562ddbc2459-kube-api-access-hxttp\") pod \"auto-csr-approver-29550108-z4l8k\" (UID: \"ea33e917-a39b-4c83-b80a-9562ddbc2459\") " pod="openshift-infra/auto-csr-approver-29550108-z4l8k" Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.449224 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxttp\" (UniqueName: \"kubernetes.io/projected/ea33e917-a39b-4c83-b80a-9562ddbc2459-kube-api-access-hxttp\") pod \"auto-csr-approver-29550108-z4l8k\" (UID: \"ea33e917-a39b-4c83-b80a-9562ddbc2459\") " pod="openshift-infra/auto-csr-approver-29550108-z4l8k" Mar 08 21:48:00 crc kubenswrapper[4885]: I0308 21:48:00.486577 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550108-z4l8k" Mar 08 21:48:01 crc kubenswrapper[4885]: I0308 21:48:01.057488 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:48:01 crc kubenswrapper[4885]: I0308 21:48:01.059050 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550108-z4l8k"] Mar 08 21:48:01 crc kubenswrapper[4885]: I0308 21:48:01.240891 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550108-z4l8k" event={"ID":"ea33e917-a39b-4c83-b80a-9562ddbc2459","Type":"ContainerStarted","Data":"2c8d71d1878de46bf1f1c9fc6533a206e3060af4e112eb8cee4d9782d74090f1"} Mar 08 21:48:02 crc kubenswrapper[4885]: I0308 21:48:02.819172 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:48:02 crc kubenswrapper[4885]: I0308 21:48:02.819926 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:48:03 crc kubenswrapper[4885]: I0308 21:48:03.265434 4885 generic.go:334] "Generic (PLEG): container finished" podID="ea33e917-a39b-4c83-b80a-9562ddbc2459" containerID="d4e92452936a7719768a56376a13fed75b8884b1fc954bbcc0c2fcc22f2c6332" exitCode=0 Mar 08 21:48:03 crc kubenswrapper[4885]: I0308 21:48:03.265753 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550108-z4l8k" event={"ID":"ea33e917-a39b-4c83-b80a-9562ddbc2459","Type":"ContainerDied","Data":"d4e92452936a7719768a56376a13fed75b8884b1fc954bbcc0c2fcc22f2c6332"} Mar 08 21:48:04 crc kubenswrapper[4885]: I0308 21:48:04.700551 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550108-z4l8k" Mar 08 21:48:04 crc kubenswrapper[4885]: I0308 21:48:04.827435 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxttp\" (UniqueName: \"kubernetes.io/projected/ea33e917-a39b-4c83-b80a-9562ddbc2459-kube-api-access-hxttp\") pod \"ea33e917-a39b-4c83-b80a-9562ddbc2459\" (UID: \"ea33e917-a39b-4c83-b80a-9562ddbc2459\") " Mar 08 21:48:04 crc kubenswrapper[4885]: I0308 21:48:04.841378 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea33e917-a39b-4c83-b80a-9562ddbc2459-kube-api-access-hxttp" (OuterVolumeSpecName: "kube-api-access-hxttp") pod "ea33e917-a39b-4c83-b80a-9562ddbc2459" (UID: "ea33e917-a39b-4c83-b80a-9562ddbc2459"). InnerVolumeSpecName "kube-api-access-hxttp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:48:04 crc kubenswrapper[4885]: I0308 21:48:04.929927 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxttp\" (UniqueName: \"kubernetes.io/projected/ea33e917-a39b-4c83-b80a-9562ddbc2459-kube-api-access-hxttp\") on node \"crc\" DevicePath \"\"" Mar 08 21:48:05 crc kubenswrapper[4885]: I0308 21:48:05.297193 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550108-z4l8k" event={"ID":"ea33e917-a39b-4c83-b80a-9562ddbc2459","Type":"ContainerDied","Data":"2c8d71d1878de46bf1f1c9fc6533a206e3060af4e112eb8cee4d9782d74090f1"} Mar 08 21:48:05 crc kubenswrapper[4885]: I0308 21:48:05.297984 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c8d71d1878de46bf1f1c9fc6533a206e3060af4e112eb8cee4d9782d74090f1" Mar 08 21:48:05 crc kubenswrapper[4885]: I0308 21:48:05.297288 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550108-z4l8k" Mar 08 21:48:05 crc kubenswrapper[4885]: I0308 21:48:05.785699 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550102-v4g2f"] Mar 08 21:48:05 crc kubenswrapper[4885]: I0308 21:48:05.796370 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550102-v4g2f"] Mar 08 21:48:07 crc kubenswrapper[4885]: I0308 21:48:07.397798 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2" path="/var/lib/kubelet/pods/ccbc4f08-5d7d-459e-a254-abeb0ac2d5a2/volumes" Mar 08 21:48:13 crc kubenswrapper[4885]: I0308 21:48:13.546502 4885 scope.go:117] "RemoveContainer" containerID="353dc00bbe28654011d7f403f11d20bd383c705d36f6523ce418dd57c34d32f5" Mar 08 21:48:32 crc kubenswrapper[4885]: I0308 21:48:32.818145 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:48:32 crc kubenswrapper[4885]: I0308 21:48:32.818815 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:48:59 crc kubenswrapper[4885]: I0308 21:48:59.096090 4885 generic.go:334] "Generic (PLEG): container finished" podID="aecb4202-1208-4ba5-8515-2ecf99c8c7d1" containerID="fcb12117529fd7a6a205069e64cae7ffc810c030e4a281bfe35c2567f1161039" exitCode=0 Mar 08 21:48:59 crc kubenswrapper[4885]: I0308 21:48:59.096203 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" event={"ID":"aecb4202-1208-4ba5-8515-2ecf99c8c7d1","Type":"ContainerDied","Data":"fcb12117529fd7a6a205069e64cae7ffc810c030e4a281bfe35c2567f1161039"} Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.618797 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.743876 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8bgm\" (UniqueName: \"kubernetes.io/projected/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-kube-api-access-j8bgm\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.743995 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ssh-key-openstack-cell1\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744128 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-1\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744211 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-0\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744334 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-2\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744400 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-0\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744456 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ceph\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744495 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-1\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744552 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-0\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744661 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-inventory\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744710 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-1\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744744 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-combined-ca-bundle\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.744790 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-3\") pod \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\" (UID: \"aecb4202-1208-4ba5-8515-2ecf99c8c7d1\") " Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.749937 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-kube-api-access-j8bgm" (OuterVolumeSpecName: "kube-api-access-j8bgm") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "kube-api-access-j8bgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.750590 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ceph" (OuterVolumeSpecName: "ceph") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.763514 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.774556 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-inventory" (OuterVolumeSpecName: "inventory") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.777017 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "ssh-key-openstack-cell1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.785804 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.788327 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.800805 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.806313 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.811699 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.812147 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.814037 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.822636 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "aecb4202-1208-4ba5-8515-2ecf99c8c7d1" (UID: "aecb4202-1208-4ba5-8515-2ecf99c8c7d1"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.846938 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.846968 4885 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.846980 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.846989 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847000 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8bgm\" (UniqueName: \"kubernetes.io/projected/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-kube-api-access-j8bgm\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847007 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847015 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847024 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847032 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847040 4885 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847047 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847057 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:00 crc kubenswrapper[4885]: I0308 21:49:00.847066 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/aecb4202-1208-4ba5-8515-2ecf99c8c7d1-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.123520 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" event={"ID":"aecb4202-1208-4ba5-8515-2ecf99c8c7d1","Type":"ContainerDied","Data":"fff640cd9dc49ab765e2a27fe434f07099b8eab2a80637829232f43855d5b797"} Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.123576 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fff640cd9dc49ab765e2a27fe434f07099b8eab2a80637829232f43855d5b797" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.123851 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-thkw7" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.245724 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-9mchk"] Mar 08 21:49:01 crc kubenswrapper[4885]: E0308 21:49:01.246267 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea33e917-a39b-4c83-b80a-9562ddbc2459" containerName="oc" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.246291 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea33e917-a39b-4c83-b80a-9562ddbc2459" containerName="oc" Mar 08 21:49:01 crc kubenswrapper[4885]: E0308 21:49:01.246315 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aecb4202-1208-4ba5-8515-2ecf99c8c7d1" containerName="nova-cell1-openstack-openstack-cell1" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.246325 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aecb4202-1208-4ba5-8515-2ecf99c8c7d1" containerName="nova-cell1-openstack-openstack-cell1" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.246601 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aecb4202-1208-4ba5-8515-2ecf99c8c7d1" containerName="nova-cell1-openstack-openstack-cell1" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.246622 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea33e917-a39b-4c83-b80a-9562ddbc2459" containerName="oc" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.247602 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.250832 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.250855 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.251024 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.251207 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.251622 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.255198 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-9mchk"] Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.362736 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.362844 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.362902 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.362961 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.363317 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-774f5\" (UniqueName: \"kubernetes.io/projected/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-kube-api-access-774f5\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.363445 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.363706 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceph\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.363823 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-inventory\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.466512 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceph\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.466856 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-inventory\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.467169 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.467480 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.467713 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.467894 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.468157 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-774f5\" (UniqueName: \"kubernetes.io/projected/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-kube-api-access-774f5\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.468344 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.471418 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.471900 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.473854 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceph\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.474155 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ssh-key-openstack-cell1\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.475156 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.476082 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-telemetry-combined-ca-bundle\") pod 
\"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.476439 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-inventory\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.495659 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-774f5\" (UniqueName: \"kubernetes.io/projected/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-kube-api-access-774f5\") pod \"telemetry-openstack-openstack-cell1-9mchk\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:01 crc kubenswrapper[4885]: I0308 21:49:01.619410 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:49:02 crc kubenswrapper[4885]: I0308 21:49:02.181786 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-9mchk"] Mar 08 21:49:02 crc kubenswrapper[4885]: I0308 21:49:02.818752 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:49:02 crc kubenswrapper[4885]: I0308 21:49:02.819094 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:49:02 crc kubenswrapper[4885]: I0308 21:49:02.819141 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:49:02 crc kubenswrapper[4885]: I0308 21:49:02.819829 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:49:02 crc kubenswrapper[4885]: I0308 21:49:02.819886 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" gracePeriod=600 Mar 08 21:49:02 crc kubenswrapper[4885]: E0308 21:49:02.949388 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:49:03 crc kubenswrapper[4885]: I0308 21:49:03.146843 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" exitCode=0 Mar 08 21:49:03 crc kubenswrapper[4885]: I0308 21:49:03.146973 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732"} Mar 08 21:49:03 crc kubenswrapper[4885]: I0308 21:49:03.147050 4885 scope.go:117] "RemoveContainer" containerID="5e834ea73f0f8d2a810f3e37feb7181558037ef9392084f1f1fea4e210684a34" Mar 08 21:49:03 crc kubenswrapper[4885]: I0308 21:49:03.147866 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:49:03 crc kubenswrapper[4885]: E0308 21:49:03.148315 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:49:03 crc kubenswrapper[4885]: I0308 21:49:03.148753 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" event={"ID":"5583daa6-0c35-4fde-8580-2a4d7ccbfb17","Type":"ContainerStarted","Data":"3deaad07a3fde91f11eb384ddd753833e0df6e58738141160f380ba138e7d97a"} Mar 08 21:49:03 crc kubenswrapper[4885]: I0308 21:49:03.148800 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" event={"ID":"5583daa6-0c35-4fde-8580-2a4d7ccbfb17","Type":"ContainerStarted","Data":"7810ccd43709ca010d0df55ac127e2ace5d877bcf88c0cdb3c388a67357a2ff3"} Mar 08 21:49:03 crc kubenswrapper[4885]: I0308 21:49:03.200556 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" podStartSLOduration=1.683922531 podStartE2EDuration="2.200538341s" podCreationTimestamp="2026-03-08 21:49:01 +0000 UTC" firstStartedPulling="2026-03-08 21:49:02.191365342 +0000 UTC m=+8243.587419405" lastFinishedPulling="2026-03-08 21:49:02.707981182 +0000 UTC m=+8244.104035215" observedRunningTime="2026-03-08 21:49:03.190069092 +0000 UTC m=+8244.586123155" watchObservedRunningTime="2026-03-08 21:49:03.200538341 +0000 UTC m=+8244.596592364" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.681655 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-988dx"] Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.684200 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.712198 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-988dx"] Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.803794 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pslq\" (UniqueName: \"kubernetes.io/projected/943c3a62-1a72-444a-b860-733dfdac5b16-kube-api-access-6pslq\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.804035 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-catalog-content\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.804289 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-utilities\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.906308 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-utilities\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.906390 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pslq\" (UniqueName: \"kubernetes.io/projected/943c3a62-1a72-444a-b860-733dfdac5b16-kube-api-access-6pslq\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.906416 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-catalog-content\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.907120 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-utilities\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.907169 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-catalog-content\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:14 crc kubenswrapper[4885]: I0308 21:49:14.930582 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6pslq\" (UniqueName: \"kubernetes.io/projected/943c3a62-1a72-444a-b860-733dfdac5b16-kube-api-access-6pslq\") pod \"redhat-operators-988dx\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:15 crc kubenswrapper[4885]: I0308 21:49:15.006103 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:15 crc kubenswrapper[4885]: I0308 21:49:15.527717 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-988dx"] Mar 08 21:49:16 crc kubenswrapper[4885]: I0308 21:49:16.300379 4885 generic.go:334] "Generic (PLEG): container finished" podID="943c3a62-1a72-444a-b860-733dfdac5b16" containerID="28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3" exitCode=0 Mar 08 21:49:16 crc kubenswrapper[4885]: I0308 21:49:16.300492 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-988dx" event={"ID":"943c3a62-1a72-444a-b860-733dfdac5b16","Type":"ContainerDied","Data":"28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3"} Mar 08 21:49:16 crc kubenswrapper[4885]: I0308 21:49:16.301049 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-988dx" event={"ID":"943c3a62-1a72-444a-b860-733dfdac5b16","Type":"ContainerStarted","Data":"07317635e0f0429e716d66e0e8c7d0b42354a41fa7123aaa571931c9b0b000f0"} Mar 08 21:49:16 crc kubenswrapper[4885]: I0308 21:49:16.368045 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:49:16 crc kubenswrapper[4885]: E0308 21:49:16.368296 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:49:18 crc kubenswrapper[4885]: I0308 21:49:18.355602 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-988dx" event={"ID":"943c3a62-1a72-444a-b860-733dfdac5b16","Type":"ContainerStarted","Data":"d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c"} Mar 08 21:49:22 crc kubenswrapper[4885]: I0308 21:49:22.399593 4885 generic.go:334] "Generic (PLEG): container finished" podID="943c3a62-1a72-444a-b860-733dfdac5b16" containerID="d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c" exitCode=0 Mar 08 21:49:22 crc kubenswrapper[4885]: I0308 21:49:22.399675 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-988dx" event={"ID":"943c3a62-1a72-444a-b860-733dfdac5b16","Type":"ContainerDied","Data":"d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c"} Mar 08 21:49:23 crc kubenswrapper[4885]: I0308 21:49:23.413505 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-988dx" event={"ID":"943c3a62-1a72-444a-b860-733dfdac5b16","Type":"ContainerStarted","Data":"0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6"} Mar 08 21:49:23 crc kubenswrapper[4885]: I0308 21:49:23.446216 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-988dx" podStartSLOduration=2.8726832460000002 podStartE2EDuration="9.446194379s" podCreationTimestamp="2026-03-08 21:49:14 +0000 UTC" firstStartedPulling="2026-03-08 21:49:16.302899347 +0000 UTC m=+8257.698953370" lastFinishedPulling="2026-03-08 21:49:22.87641048 +0000 UTC m=+8264.272464503" observedRunningTime="2026-03-08 21:49:23.4342472 +0000 UTC m=+8264.830301243" watchObservedRunningTime="2026-03-08 21:49:23.446194379 +0000 UTC m=+8264.842248412" Mar 08 21:49:25 crc kubenswrapper[4885]: I0308 21:49:25.006307 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:25 crc kubenswrapper[4885]: I0308 21:49:25.006639 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:26 crc kubenswrapper[4885]: I0308 21:49:26.069829 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-988dx" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="registry-server" probeResult="failure" output=< Mar 08 21:49:26 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 21:49:26 crc kubenswrapper[4885]: > Mar 08 21:49:30 crc kubenswrapper[4885]: I0308 21:49:30.369385 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:49:30 crc kubenswrapper[4885]: E0308 21:49:30.370284 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:49:35 crc kubenswrapper[4885]: I0308 21:49:35.067677 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:35 crc kubenswrapper[4885]: I0308 21:49:35.136566 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:35 crc kubenswrapper[4885]: I0308 21:49:35.313464 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-988dx"] Mar 08 21:49:36 crc kubenswrapper[4885]: I0308 21:49:36.583324 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-988dx" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="registry-server" containerID="cri-o://0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6" gracePeriod=2 Mar 08 21:49:36 crc kubenswrapper[4885]: E0308 21:49:36.731344 4885 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod943c3a62_1a72_444a_b860_733dfdac5b16.slice/crio-conmon-0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6.scope\": RecentStats: unable to find data in memory cache]" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.209171 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.240212 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pslq\" (UniqueName: \"kubernetes.io/projected/943c3a62-1a72-444a-b860-733dfdac5b16-kube-api-access-6pslq\") pod \"943c3a62-1a72-444a-b860-733dfdac5b16\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.240444 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-utilities\") pod \"943c3a62-1a72-444a-b860-733dfdac5b16\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.240471 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-catalog-content\") pod \"943c3a62-1a72-444a-b860-733dfdac5b16\" (UID: \"943c3a62-1a72-444a-b860-733dfdac5b16\") " Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.241902 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-utilities" (OuterVolumeSpecName: "utilities") pod "943c3a62-1a72-444a-b860-733dfdac5b16" (UID: "943c3a62-1a72-444a-b860-733dfdac5b16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.251903 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943c3a62-1a72-444a-b860-733dfdac5b16-kube-api-access-6pslq" (OuterVolumeSpecName: "kube-api-access-6pslq") pod "943c3a62-1a72-444a-b860-733dfdac5b16" (UID: "943c3a62-1a72-444a-b860-733dfdac5b16"). InnerVolumeSpecName "kube-api-access-6pslq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.343463 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.343520 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pslq\" (UniqueName: \"kubernetes.io/projected/943c3a62-1a72-444a-b860-733dfdac5b16-kube-api-access-6pslq\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.400783 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "943c3a62-1a72-444a-b860-733dfdac5b16" (UID: "943c3a62-1a72-444a-b860-733dfdac5b16"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.446256 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/943c3a62-1a72-444a-b860-733dfdac5b16-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.598904 4885 generic.go:334] "Generic (PLEG): container finished" podID="943c3a62-1a72-444a-b860-733dfdac5b16" containerID="0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6" exitCode=0 Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.598991 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-988dx" event={"ID":"943c3a62-1a72-444a-b860-733dfdac5b16","Type":"ContainerDied","Data":"0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6"} Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.599061 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-988dx" event={"ID":"943c3a62-1a72-444a-b860-733dfdac5b16","Type":"ContainerDied","Data":"07317635e0f0429e716d66e0e8c7d0b42354a41fa7123aaa571931c9b0b000f0"} Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.599090 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-988dx" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.599096 4885 scope.go:117] "RemoveContainer" containerID="0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.658952 4885 scope.go:117] "RemoveContainer" containerID="d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.669147 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-988dx"] Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.690994 4885 scope.go:117] "RemoveContainer" containerID="28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.694073 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-988dx"] Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.769270 4885 scope.go:117] "RemoveContainer" containerID="0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6" Mar 08 21:49:37 crc kubenswrapper[4885]: E0308 21:49:37.769782 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6\": container with ID starting with 0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6 not found: ID does not exist" containerID="0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.769830 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6"} err="failed to get container status \"0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6\": rpc error: code = NotFound desc = could not find container \"0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6\": container with ID starting with 0f735bbc7a4567aabd90dda0ffe8fce9cf24a84cdad55c1bc9c4ccb808af03a6 not found: ID does not exist" Mar 08 21:49:37 crc 
kubenswrapper[4885]: I0308 21:49:37.769855 4885 scope.go:117] "RemoveContainer" containerID="d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c" Mar 08 21:49:37 crc kubenswrapper[4885]: E0308 21:49:37.770317 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c\": container with ID starting with d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c not found: ID does not exist" containerID="d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.770343 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c"} err="failed to get container status \"d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c\": rpc error: code = NotFound desc = could not find container \"d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c\": container with ID starting with d1922e98261ac40831848bc9df12c7b3d12dedc2c0330f0734576d928768858c not found: ID does not exist" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.770364 4885 scope.go:117] "RemoveContainer" containerID="28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3" Mar 08 21:49:37 crc kubenswrapper[4885]: E0308 21:49:37.771139 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3\": container with ID starting with 28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3 not found: ID does not exist" containerID="28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3" Mar 08 21:49:37 crc kubenswrapper[4885]: I0308 21:49:37.771164 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3"} err="failed to get container status \"28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3\": rpc error: code = NotFound desc = could not find container \"28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3\": container with ID starting with 28d54286bf583cbdfc33383ba6a72d4138f48db237ee0f00f2821c3cae4d96c3 not found: ID does not exist" Mar 08 21:49:39 crc kubenswrapper[4885]: I0308 21:49:39.386482 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" path="/var/lib/kubelet/pods/943c3a62-1a72-444a-b860-733dfdac5b16/volumes" Mar 08 21:49:44 crc kubenswrapper[4885]: I0308 21:49:44.368522 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:49:44 crc kubenswrapper[4885]: E0308 21:49:44.369312 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:49:55 crc kubenswrapper[4885]: I0308 21:49:55.368498 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" 
Mar 08 21:49:55 crc kubenswrapper[4885]: E0308 21:49:55.369582 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.150914 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550110-9ztjc"] Mar 08 21:50:00 crc kubenswrapper[4885]: E0308 21:50:00.151954 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="registry-server" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.151971 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="registry-server" Mar 08 21:50:00 crc kubenswrapper[4885]: E0308 21:50:00.151993 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="extract-content" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.152002 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="extract-content" Mar 08 21:50:00 crc kubenswrapper[4885]: E0308 21:50:00.152035 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="extract-utilities" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.152041 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="extract-utilities" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.152261 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="943c3a62-1a72-444a-b860-733dfdac5b16" containerName="registry-server" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.153169 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.156163 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.156184 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.156248 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.167749 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550110-9ztjc"] Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.247868 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzsnf\" (UniqueName: \"kubernetes.io/projected/baf0e32a-60d4-4a44-af91-6bbe65bc82c9-kube-api-access-bzsnf\") pod \"auto-csr-approver-29550110-9ztjc\" (UID: \"baf0e32a-60d4-4a44-af91-6bbe65bc82c9\") " pod="openshift-infra/auto-csr-approver-29550110-9ztjc" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.351508 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzsnf\" (UniqueName: \"kubernetes.io/projected/baf0e32a-60d4-4a44-af91-6bbe65bc82c9-kube-api-access-bzsnf\") pod \"auto-csr-approver-29550110-9ztjc\" (UID: \"baf0e32a-60d4-4a44-af91-6bbe65bc82c9\") " pod="openshift-infra/auto-csr-approver-29550110-9ztjc" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.373634 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzsnf\" (UniqueName: \"kubernetes.io/projected/baf0e32a-60d4-4a44-af91-6bbe65bc82c9-kube-api-access-bzsnf\") pod \"auto-csr-approver-29550110-9ztjc\" (UID: \"baf0e32a-60d4-4a44-af91-6bbe65bc82c9\") " pod="openshift-infra/auto-csr-approver-29550110-9ztjc" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.477499 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" Mar 08 21:50:00 crc kubenswrapper[4885]: I0308 21:50:00.998149 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550110-9ztjc"] Mar 08 21:50:01 crc kubenswrapper[4885]: I0308 21:50:01.881395 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" event={"ID":"baf0e32a-60d4-4a44-af91-6bbe65bc82c9","Type":"ContainerStarted","Data":"924b95bbb6317b6a65c4d458f1fea1b8e7f0a14e5ae7b3ca71a36da6871b9601"} Mar 08 21:50:02 crc kubenswrapper[4885]: I0308 21:50:02.892419 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" event={"ID":"baf0e32a-60d4-4a44-af91-6bbe65bc82c9","Type":"ContainerStarted","Data":"6672cbe9c59e44b713aad8bb2c9fe671a86cd6c40abe9e893295200a4e793993"} Mar 08 21:50:02 crc kubenswrapper[4885]: I0308 21:50:02.922077 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" podStartSLOduration=1.6907394519999999 podStartE2EDuration="2.922054456s" podCreationTimestamp="2026-03-08 21:50:00 +0000 UTC" firstStartedPulling="2026-03-08 21:50:01.002134534 +0000 UTC m=+8302.398188557" lastFinishedPulling="2026-03-08 21:50:02.233449528 +0000 UTC m=+8303.629503561" observedRunningTime="2026-03-08 21:50:02.909165392 +0000 UTC m=+8304.305219425" watchObservedRunningTime="2026-03-08 21:50:02.922054456 +0000 UTC m=+8304.318108489" Mar 08 21:50:03 crc kubenswrapper[4885]: I0308 21:50:03.906406 4885 generic.go:334] "Generic (PLEG): container finished" podID="baf0e32a-60d4-4a44-af91-6bbe65bc82c9" containerID="6672cbe9c59e44b713aad8bb2c9fe671a86cd6c40abe9e893295200a4e793993" exitCode=0 Mar 08 21:50:03 crc kubenswrapper[4885]: I0308 21:50:03.906553 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" event={"ID":"baf0e32a-60d4-4a44-af91-6bbe65bc82c9","Type":"ContainerDied","Data":"6672cbe9c59e44b713aad8bb2c9fe671a86cd6c40abe9e893295200a4e793993"} Mar 08 21:50:05 crc kubenswrapper[4885]: I0308 21:50:05.406783 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" Mar 08 21:50:05 crc kubenswrapper[4885]: I0308 21:50:05.510454 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzsnf\" (UniqueName: \"kubernetes.io/projected/baf0e32a-60d4-4a44-af91-6bbe65bc82c9-kube-api-access-bzsnf\") pod \"baf0e32a-60d4-4a44-af91-6bbe65bc82c9\" (UID: \"baf0e32a-60d4-4a44-af91-6bbe65bc82c9\") " Mar 08 21:50:05 crc kubenswrapper[4885]: I0308 21:50:05.530428 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baf0e32a-60d4-4a44-af91-6bbe65bc82c9-kube-api-access-bzsnf" (OuterVolumeSpecName: "kube-api-access-bzsnf") pod "baf0e32a-60d4-4a44-af91-6bbe65bc82c9" (UID: "baf0e32a-60d4-4a44-af91-6bbe65bc82c9"). InnerVolumeSpecName "kube-api-access-bzsnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:50:05 crc kubenswrapper[4885]: I0308 21:50:05.532164 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzsnf\" (UniqueName: \"kubernetes.io/projected/baf0e32a-60d4-4a44-af91-6bbe65bc82c9-kube-api-access-bzsnf\") on node \"crc\" DevicePath \"\"" Mar 08 21:50:05 crc kubenswrapper[4885]: I0308 21:50:05.936348 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" event={"ID":"baf0e32a-60d4-4a44-af91-6bbe65bc82c9","Type":"ContainerDied","Data":"924b95bbb6317b6a65c4d458f1fea1b8e7f0a14e5ae7b3ca71a36da6871b9601"} Mar 08 21:50:05 crc kubenswrapper[4885]: I0308 21:50:05.936723 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="924b95bbb6317b6a65c4d458f1fea1b8e7f0a14e5ae7b3ca71a36da6871b9601" Mar 08 21:50:05 crc kubenswrapper[4885]: I0308 21:50:05.936655 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550110-9ztjc" Mar 08 21:50:06 crc kubenswrapper[4885]: I0308 21:50:06.020162 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550104-7bjtf"] Mar 08 21:50:06 crc kubenswrapper[4885]: I0308 21:50:06.029087 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550104-7bjtf"] Mar 08 21:50:07 crc kubenswrapper[4885]: I0308 21:50:07.381030 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9887f7af-0796-4481-aa3d-5f4996f9ed47" path="/var/lib/kubelet/pods/9887f7af-0796-4481-aa3d-5f4996f9ed47/volumes" Mar 08 21:50:09 crc kubenswrapper[4885]: I0308 21:50:09.378283 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:50:09 crc kubenswrapper[4885]: E0308 21:50:09.378731 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:50:13 crc kubenswrapper[4885]: I0308 21:50:13.676865 4885 scope.go:117] "RemoveContainer" containerID="dc8414267440eda43954aa07f3b3a3139275d86a35c8b1f3e17198abfe4f8b5c" Mar 08 21:50:21 crc kubenswrapper[4885]: I0308 21:50:21.369173 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:50:21 crc kubenswrapper[4885]: E0308 21:50:21.369963 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:50:32 crc kubenswrapper[4885]: I0308 21:50:32.369223 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:50:32 crc kubenswrapper[4885]: E0308 21:50:32.370314 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:50:45 crc kubenswrapper[4885]: I0308 21:50:45.368423 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:50:45 crc kubenswrapper[4885]: E0308 21:50:45.369830 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:50:56 crc kubenswrapper[4885]: I0308 21:50:56.368775 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:50:56 crc kubenswrapper[4885]: E0308 21:50:56.370137 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:51:07 crc kubenswrapper[4885]: I0308 21:51:07.370148 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:51:07 crc kubenswrapper[4885]: E0308 21:51:07.371248 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:51:19 crc kubenswrapper[4885]: I0308 21:51:19.385453 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:51:19 crc kubenswrapper[4885]: E0308 21:51:19.386591 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:51:34 crc kubenswrapper[4885]: I0308 21:51:34.369191 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:51:34 crc kubenswrapper[4885]: E0308 21:51:34.370046 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:51:45 crc kubenswrapper[4885]: I0308 21:51:45.368556 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:51:45 crc kubenswrapper[4885]: E0308 21:51:45.369425 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:51:56 crc kubenswrapper[4885]: I0308 21:51:56.369143 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:51:56 crc kubenswrapper[4885]: E0308 21:51:56.370467 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.153087 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550112-xxhhl"] Mar 08 21:52:00 crc kubenswrapper[4885]: E0308 21:52:00.154357 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf0e32a-60d4-4a44-af91-6bbe65bc82c9" containerName="oc" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.154376 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf0e32a-60d4-4a44-af91-6bbe65bc82c9" containerName="oc" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.154667 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf0e32a-60d4-4a44-af91-6bbe65bc82c9" containerName="oc" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.155752 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.158098 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.158687 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.158753 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.168137 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550112-xxhhl"] Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.188102 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p54td\" (UniqueName: \"kubernetes.io/projected/2aae4f06-bf3b-4963-92b4-9dfc6bb69621-kube-api-access-p54td\") pod \"auto-csr-approver-29550112-xxhhl\" (UID: \"2aae4f06-bf3b-4963-92b4-9dfc6bb69621\") " pod="openshift-infra/auto-csr-approver-29550112-xxhhl" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.289727 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p54td\" (UniqueName: \"kubernetes.io/projected/2aae4f06-bf3b-4963-92b4-9dfc6bb69621-kube-api-access-p54td\") pod \"auto-csr-approver-29550112-xxhhl\" (UID: \"2aae4f06-bf3b-4963-92b4-9dfc6bb69621\") " pod="openshift-infra/auto-csr-approver-29550112-xxhhl" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.323153 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p54td\" (UniqueName: \"kubernetes.io/projected/2aae4f06-bf3b-4963-92b4-9dfc6bb69621-kube-api-access-p54td\") pod \"auto-csr-approver-29550112-xxhhl\" (UID: \"2aae4f06-bf3b-4963-92b4-9dfc6bb69621\") " pod="openshift-infra/auto-csr-approver-29550112-xxhhl" Mar 08 21:52:00 crc kubenswrapper[4885]: I0308 21:52:00.483578 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" Mar 08 21:52:01 crc kubenswrapper[4885]: I0308 21:52:01.051811 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550112-xxhhl"] Mar 08 21:52:01 crc kubenswrapper[4885]: I0308 21:52:01.442692 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" event={"ID":"2aae4f06-bf3b-4963-92b4-9dfc6bb69621","Type":"ContainerStarted","Data":"d7e2ea54daafbf6031ec8758df8ed83d5b47cb0c91fc4607864110f3b604ca20"} Mar 08 21:52:02 crc kubenswrapper[4885]: I0308 21:52:02.457461 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" event={"ID":"2aae4f06-bf3b-4963-92b4-9dfc6bb69621","Type":"ContainerStarted","Data":"8d2a85311da28c593e02f61d9e21770d4b4946e346cfff8ec77956eb44cbfcfa"} Mar 08 21:52:02 crc kubenswrapper[4885]: I0308 21:52:02.484279 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" podStartSLOduration=1.5832814960000001 podStartE2EDuration="2.48425303s" podCreationTimestamp="2026-03-08 21:52:00 +0000 UTC" firstStartedPulling="2026-03-08 21:52:01.054465173 +0000 UTC m=+8422.450519196" lastFinishedPulling="2026-03-08 21:52:01.955436707 +0000 UTC m=+8423.351490730" observedRunningTime="2026-03-08 21:52:02.479798541 +0000 UTC m=+8423.875852604" watchObservedRunningTime="2026-03-08 21:52:02.48425303 +0000 UTC m=+8423.880307083" Mar 08 21:52:03 crc kubenswrapper[4885]: I0308 21:52:03.480807 4885 generic.go:334] "Generic (PLEG): container finished" podID="2aae4f06-bf3b-4963-92b4-9dfc6bb69621" containerID="8d2a85311da28c593e02f61d9e21770d4b4946e346cfff8ec77956eb44cbfcfa" exitCode=0 Mar 08 21:52:03 crc kubenswrapper[4885]: I0308 21:52:03.480864 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" event={"ID":"2aae4f06-bf3b-4963-92b4-9dfc6bb69621","Type":"ContainerDied","Data":"8d2a85311da28c593e02f61d9e21770d4b4946e346cfff8ec77956eb44cbfcfa"} Mar 08 21:52:04 crc kubenswrapper[4885]: I0308 21:52:04.864646 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.006951 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p54td\" (UniqueName: \"kubernetes.io/projected/2aae4f06-bf3b-4963-92b4-9dfc6bb69621-kube-api-access-p54td\") pod \"2aae4f06-bf3b-4963-92b4-9dfc6bb69621\" (UID: \"2aae4f06-bf3b-4963-92b4-9dfc6bb69621\") " Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.015201 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aae4f06-bf3b-4963-92b4-9dfc6bb69621-kube-api-access-p54td" (OuterVolumeSpecName: "kube-api-access-p54td") pod "2aae4f06-bf3b-4963-92b4-9dfc6bb69621" (UID: "2aae4f06-bf3b-4963-92b4-9dfc6bb69621"). InnerVolumeSpecName "kube-api-access-p54td". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.111121 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p54td\" (UniqueName: \"kubernetes.io/projected/2aae4f06-bf3b-4963-92b4-9dfc6bb69621-kube-api-access-p54td\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.523705 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" event={"ID":"2aae4f06-bf3b-4963-92b4-9dfc6bb69621","Type":"ContainerDied","Data":"d7e2ea54daafbf6031ec8758df8ed83d5b47cb0c91fc4607864110f3b604ca20"} Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.523770 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7e2ea54daafbf6031ec8758df8ed83d5b47cb0c91fc4607864110f3b604ca20" Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.523848 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550112-xxhhl" Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.577946 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550106-t45l9"] Mar 08 21:52:05 crc kubenswrapper[4885]: I0308 21:52:05.590280 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550106-t45l9"] Mar 08 21:52:07 crc kubenswrapper[4885]: I0308 21:52:07.385568 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e0c5067-1184-4a75-a80b-35b1d03f2a47" path="/var/lib/kubelet/pods/7e0c5067-1184-4a75-a80b-35b1d03f2a47/volumes" Mar 08 21:52:10 crc kubenswrapper[4885]: I0308 21:52:10.370733 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:52:10 crc kubenswrapper[4885]: E0308 21:52:10.371867 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:52:13 crc kubenswrapper[4885]: I0308 21:52:13.816005 4885 scope.go:117] "RemoveContainer" containerID="ac6daccc4f232068706f6a472830b30704b48c1d2e179f38f98cf3dcf6cbd7b0" Mar 08 21:52:23 crc kubenswrapper[4885]: I0308 21:52:23.369102 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:52:23 crc kubenswrapper[4885]: E0308 21:52:23.370188 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:52:36 crc kubenswrapper[4885]: I0308 21:52:36.936796 4885 generic.go:334] "Generic (PLEG): container finished" podID="5583daa6-0c35-4fde-8580-2a4d7ccbfb17" containerID="3deaad07a3fde91f11eb384ddd753833e0df6e58738141160f380ba138e7d97a" exitCode=0 Mar 08 21:52:36 crc kubenswrapper[4885]: I0308 21:52:36.936858 4885 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" event={"ID":"5583daa6-0c35-4fde-8580-2a4d7ccbfb17","Type":"ContainerDied","Data":"3deaad07a3fde91f11eb384ddd753833e0df6e58738141160f380ba138e7d97a"} Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.371412 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:52:38 crc kubenswrapper[4885]: E0308 21:52:38.372210 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.562601 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.726895 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-0\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.727031 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-774f5\" (UniqueName: \"kubernetes.io/projected/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-kube-api-access-774f5\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.727075 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ssh-key-openstack-cell1\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.727128 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-1\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.727258 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-telemetry-combined-ca-bundle\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.727350 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceph\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.727459 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-2\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.727502 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-inventory\") pod \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\" (UID: \"5583daa6-0c35-4fde-8580-2a4d7ccbfb17\") " Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.733286 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: "5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.733684 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceph" (OuterVolumeSpecName: "ceph") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: "5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.734582 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-kube-api-access-774f5" (OuterVolumeSpecName: "kube-api-access-774f5") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: "5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "kube-api-access-774f5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.760282 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: "5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.776196 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: "5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.778994 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: "5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.787975 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: "5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.792850 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-inventory" (OuterVolumeSpecName: "inventory") pod "5583daa6-0c35-4fde-8580-2a4d7ccbfb17" (UID: "5583daa6-0c35-4fde-8580-2a4d7ccbfb17"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831597 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831654 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831681 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-774f5\" (UniqueName: \"kubernetes.io/projected/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-kube-api-access-774f5\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831703 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831722 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831740 4885 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831753 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.831764 4885 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/5583daa6-0c35-4fde-8580-2a4d7ccbfb17-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.966914 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" event={"ID":"5583daa6-0c35-4fde-8580-2a4d7ccbfb17","Type":"ContainerDied","Data":"7810ccd43709ca010d0df55ac127e2ace5d877bcf88c0cdb3c388a67357a2ff3"} Mar 08 21:52:38 crc 
kubenswrapper[4885]: I0308 21:52:38.967002 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7810ccd43709ca010d0df55ac127e2ace5d877bcf88c0cdb3c388a67357a2ff3" Mar 08 21:52:38 crc kubenswrapper[4885]: I0308 21:52:38.967185 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-9mchk" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.089801 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-f6sk8"] Mar 08 21:52:39 crc kubenswrapper[4885]: E0308 21:52:39.090319 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aae4f06-bf3b-4963-92b4-9dfc6bb69621" containerName="oc" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.090337 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aae4f06-bf3b-4963-92b4-9dfc6bb69621" containerName="oc" Mar 08 21:52:39 crc kubenswrapper[4885]: E0308 21:52:39.090375 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5583daa6-0c35-4fde-8580-2a4d7ccbfb17" containerName="telemetry-openstack-openstack-cell1" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.090382 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5583daa6-0c35-4fde-8580-2a4d7ccbfb17" containerName="telemetry-openstack-openstack-cell1" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.090586 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5583daa6-0c35-4fde-8580-2a4d7ccbfb17" containerName="telemetry-openstack-openstack-cell1" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.090605 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aae4f06-bf3b-4963-92b4-9dfc6bb69621" containerName="oc" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.091423 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.094370 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.094527 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.094673 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.094893 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.095037 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.105766 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-f6sk8"] Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.241063 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.241136 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.241187 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.241215 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxhz9\" (UniqueName: \"kubernetes.io/projected/13f318e2-a78d-497f-bfbc-4c60d9156220-kube-api-access-vxhz9\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.242259 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.242402 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.344419 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.344495 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxhz9\" (UniqueName: \"kubernetes.io/projected/13f318e2-a78d-497f-bfbc-4c60d9156220-kube-api-access-vxhz9\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.344679 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.344722 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.344839 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.344943 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.349646 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.350415 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ssh-key-openstack-cell1\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.351729 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.354091 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.355320 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.377029 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxhz9\" (UniqueName: \"kubernetes.io/projected/13f318e2-a78d-497f-bfbc-4c60d9156220-kube-api-access-vxhz9\") pod \"neutron-sriov-openstack-openstack-cell1-f6sk8\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:39 crc kubenswrapper[4885]: I0308 21:52:39.417039 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:52:40 crc kubenswrapper[4885]: I0308 21:52:40.058503 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-f6sk8"] Mar 08 21:52:40 crc kubenswrapper[4885]: I0308 21:52:40.995658 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" event={"ID":"13f318e2-a78d-497f-bfbc-4c60d9156220","Type":"ContainerStarted","Data":"cd2b9c2fdbb97360195fc12d8da62d789c72f771d227eb3a5cefc9016f5c2a0d"} Mar 08 21:52:40 crc kubenswrapper[4885]: I0308 21:52:40.996093 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" event={"ID":"13f318e2-a78d-497f-bfbc-4c60d9156220","Type":"ContainerStarted","Data":"e50b0327dd2efd6f8e0ff0bc77acc1b1d4df2a4d1247cc52554a573044973234"} Mar 08 21:52:41 crc kubenswrapper[4885]: I0308 21:52:41.022271 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" podStartSLOduration=1.519376574 podStartE2EDuration="2.022241054s" podCreationTimestamp="2026-03-08 21:52:39 +0000 UTC" firstStartedPulling="2026-03-08 21:52:40.064588737 +0000 UTC m=+8461.460642760" lastFinishedPulling="2026-03-08 21:52:40.567453177 +0000 UTC m=+8461.963507240" observedRunningTime="2026-03-08 21:52:41.015510094 +0000 UTC m=+8462.411564157" watchObservedRunningTime="2026-03-08 21:52:41.022241054 +0000 UTC m=+8462.418295117" Mar 08 21:52:51 crc kubenswrapper[4885]: I0308 21:52:51.368666 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:52:51 crc kubenswrapper[4885]: E0308 21:52:51.369673 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:53:02 crc kubenswrapper[4885]: I0308 21:53:02.368982 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:53:02 crc kubenswrapper[4885]: E0308 21:53:02.370187 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:53:13 crc kubenswrapper[4885]: I0308 21:53:13.368630 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:53:13 crc kubenswrapper[4885]: E0308 21:53:13.369518 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.113226 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p28nt"] Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.115843 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.156505 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p28nt"] Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.171536 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-catalog-content\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.171629 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-utilities\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.171730 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mdql\" (UniqueName: \"kubernetes.io/projected/9c91d099-582a-47b1-b6b3-a403a4cdd428-kube-api-access-5mdql\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.273559 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mdql\" (UniqueName: \"kubernetes.io/projected/9c91d099-582a-47b1-b6b3-a403a4cdd428-kube-api-access-5mdql\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.273656 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-catalog-content\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.273741 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-utilities\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.274109 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-catalog-content\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.274194 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-utilities\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.297865 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mdql\" (UniqueName: \"kubernetes.io/projected/9c91d099-582a-47b1-b6b3-a403a4cdd428-kube-api-access-5mdql\") pod \"redhat-marketplace-p28nt\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.436374 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:22 crc kubenswrapper[4885]: I0308 21:53:22.936482 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p28nt"] Mar 08 21:53:23 crc kubenswrapper[4885]: I0308 21:53:23.523067 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28nt" event={"ID":"9c91d099-582a-47b1-b6b3-a403a4cdd428","Type":"ContainerStarted","Data":"630ba5940a762a4ce58dfa8803f182a3722e17630d480c5a02c18c12176e926c"} Mar 08 21:53:24 crc kubenswrapper[4885]: I0308 21:53:24.535452 4885 generic.go:334] "Generic (PLEG): container finished" podID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerID="31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9" exitCode=0 Mar 08 21:53:24 crc kubenswrapper[4885]: I0308 21:53:24.535533 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28nt" event={"ID":"9c91d099-582a-47b1-b6b3-a403a4cdd428","Type":"ContainerDied","Data":"31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9"} Mar 08 21:53:24 crc kubenswrapper[4885]: I0308 21:53:24.538716 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:53:26 crc kubenswrapper[4885]: I0308 21:53:26.569195 4885 generic.go:334] "Generic (PLEG): container finished" podID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerID="153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5" exitCode=0 Mar 08 21:53:26 crc kubenswrapper[4885]: I0308 21:53:26.569273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28nt" event={"ID":"9c91d099-582a-47b1-b6b3-a403a4cdd428","Type":"ContainerDied","Data":"153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5"} Mar 08 21:53:27 crc kubenswrapper[4885]: I0308 21:53:27.582043 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28nt" event={"ID":"9c91d099-582a-47b1-b6b3-a403a4cdd428","Type":"ContainerStarted","Data":"914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71"} Mar 08 21:53:27 crc kubenswrapper[4885]: I0308 21:53:27.622687 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p28nt" podStartSLOduration=3.075768656 podStartE2EDuration="5.622666776s" podCreationTimestamp="2026-03-08 21:53:22 +0000 UTC" firstStartedPulling="2026-03-08 21:53:24.538394474 +0000 UTC m=+8505.934448507" lastFinishedPulling="2026-03-08 21:53:27.085292604 +0000 UTC m=+8508.481346627" observedRunningTime="2026-03-08 21:53:27.603527484 +0000 UTC m=+8508.999581527" watchObservedRunningTime="2026-03-08 21:53:27.622666776 
+0000 UTC m=+8509.018720809" Mar 08 21:53:28 crc kubenswrapper[4885]: I0308 21:53:28.369704 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:53:28 crc kubenswrapper[4885]: E0308 21:53:28.370322 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:53:32 crc kubenswrapper[4885]: I0308 21:53:32.437006 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:32 crc kubenswrapper[4885]: I0308 21:53:32.437359 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:32 crc kubenswrapper[4885]: I0308 21:53:32.503478 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:32 crc kubenswrapper[4885]: I0308 21:53:32.704105 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:32 crc kubenswrapper[4885]: I0308 21:53:32.763328 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p28nt"] Mar 08 21:53:34 crc kubenswrapper[4885]: I0308 21:53:34.665428 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p28nt" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="registry-server" containerID="cri-o://914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71" gracePeriod=2 Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.289348 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.396591 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mdql\" (UniqueName: \"kubernetes.io/projected/9c91d099-582a-47b1-b6b3-a403a4cdd428-kube-api-access-5mdql\") pod \"9c91d099-582a-47b1-b6b3-a403a4cdd428\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.396786 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-utilities\") pod \"9c91d099-582a-47b1-b6b3-a403a4cdd428\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.397036 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-catalog-content\") pod \"9c91d099-582a-47b1-b6b3-a403a4cdd428\" (UID: \"9c91d099-582a-47b1-b6b3-a403a4cdd428\") " Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.398027 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-utilities" (OuterVolumeSpecName: "utilities") pod "9c91d099-582a-47b1-b6b3-a403a4cdd428" (UID: "9c91d099-582a-47b1-b6b3-a403a4cdd428"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.398412 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.410813 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c91d099-582a-47b1-b6b3-a403a4cdd428-kube-api-access-5mdql" (OuterVolumeSpecName: "kube-api-access-5mdql") pod "9c91d099-582a-47b1-b6b3-a403a4cdd428" (UID: "9c91d099-582a-47b1-b6b3-a403a4cdd428"). InnerVolumeSpecName "kube-api-access-5mdql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.447225 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9c91d099-582a-47b1-b6b3-a403a4cdd428" (UID: "9c91d099-582a-47b1-b6b3-a403a4cdd428"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.500467 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mdql\" (UniqueName: \"kubernetes.io/projected/9c91d099-582a-47b1-b6b3-a403a4cdd428-kube-api-access-5mdql\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.500500 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9c91d099-582a-47b1-b6b3-a403a4cdd428-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.677852 4885 generic.go:334] "Generic (PLEG): container finished" podID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerID="914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71" exitCode=0 Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.677940 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28nt" event={"ID":"9c91d099-582a-47b1-b6b3-a403a4cdd428","Type":"ContainerDied","Data":"914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71"} Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.678009 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p28nt" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.678045 4885 scope.go:117] "RemoveContainer" containerID="914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.678025 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p28nt" event={"ID":"9c91d099-582a-47b1-b6b3-a403a4cdd428","Type":"ContainerDied","Data":"630ba5940a762a4ce58dfa8803f182a3722e17630d480c5a02c18c12176e926c"} Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.701208 4885 scope.go:117] "RemoveContainer" containerID="153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.724003 4885 scope.go:117] "RemoveContainer" containerID="31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.781419 4885 scope.go:117] "RemoveContainer" containerID="914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71" Mar 08 21:53:35 crc kubenswrapper[4885]: E0308 21:53:35.782247 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71\": container with ID starting with 914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71 not found: ID does not exist" containerID="914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.782425 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71"} err="failed to get container status \"914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71\": rpc error: code = NotFound desc = could not find container \"914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71\": container with ID starting with 914f62ddf6583312e2ca686063145ce7b51315056afb1b8f5aa5f85b62c63c71 not found: ID does not exist" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.782596 4885 
scope.go:117] "RemoveContainer" containerID="153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5" Mar 08 21:53:35 crc kubenswrapper[4885]: E0308 21:53:35.783199 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5\": container with ID starting with 153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5 not found: ID does not exist" containerID="153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.783396 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5"} err="failed to get container status \"153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5\": rpc error: code = NotFound desc = could not find container \"153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5\": container with ID starting with 153b94257e4069c612529583fadf807a6adeb19cd7d8e840a842906c4f6c25a5 not found: ID does not exist" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.783540 4885 scope.go:117] "RemoveContainer" containerID="31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9" Mar 08 21:53:35 crc kubenswrapper[4885]: E0308 21:53:35.787076 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9\": container with ID starting with 31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9 not found: ID does not exist" containerID="31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.787127 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9"} err="failed to get container status \"31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9\": rpc error: code = NotFound desc = could not find container \"31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9\": container with ID starting with 31b4bd9914da1f6dcb705c9866760e7ceedff03c6302309174ec0b5fc6ca7be9 not found: ID does not exist" Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.788098 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p28nt"] Mar 08 21:53:35 crc kubenswrapper[4885]: I0308 21:53:35.799624 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p28nt"] Mar 08 21:53:37 crc kubenswrapper[4885]: I0308 21:53:37.402516 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" path="/var/lib/kubelet/pods/9c91d099-582a-47b1-b6b3-a403a4cdd428/volumes" Mar 08 21:53:39 crc kubenswrapper[4885]: I0308 21:53:39.374761 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:53:39 crc kubenswrapper[4885]: E0308 21:53:39.375228 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:53:46 crc kubenswrapper[4885]: I0308 21:53:46.826007 4885 generic.go:334] "Generic (PLEG): container finished" podID="13f318e2-a78d-497f-bfbc-4c60d9156220" containerID="cd2b9c2fdbb97360195fc12d8da62d789c72f771d227eb3a5cefc9016f5c2a0d" exitCode=0 Mar 08 21:53:46 crc kubenswrapper[4885]: I0308 21:53:46.826087 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" event={"ID":"13f318e2-a78d-497f-bfbc-4c60d9156220","Type":"ContainerDied","Data":"cd2b9c2fdbb97360195fc12d8da62d789c72f771d227eb3a5cefc9016f5c2a0d"} Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.457814 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.567176 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-combined-ca-bundle\") pod \"13f318e2-a78d-497f-bfbc-4c60d9156220\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.567285 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ssh-key-openstack-cell1\") pod \"13f318e2-a78d-497f-bfbc-4c60d9156220\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.567372 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-agent-neutron-config-0\") pod \"13f318e2-a78d-497f-bfbc-4c60d9156220\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.567409 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ceph\") pod \"13f318e2-a78d-497f-bfbc-4c60d9156220\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.567450 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxhz9\" (UniqueName: \"kubernetes.io/projected/13f318e2-a78d-497f-bfbc-4c60d9156220-kube-api-access-vxhz9\") pod \"13f318e2-a78d-497f-bfbc-4c60d9156220\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.567980 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-inventory\") pod \"13f318e2-a78d-497f-bfbc-4c60d9156220\" (UID: \"13f318e2-a78d-497f-bfbc-4c60d9156220\") " Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.575185 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ceph" (OuterVolumeSpecName: "ceph") pod "13f318e2-a78d-497f-bfbc-4c60d9156220" (UID: "13f318e2-a78d-497f-bfbc-4c60d9156220"). 
InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.575361 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13f318e2-a78d-497f-bfbc-4c60d9156220-kube-api-access-vxhz9" (OuterVolumeSpecName: "kube-api-access-vxhz9") pod "13f318e2-a78d-497f-bfbc-4c60d9156220" (UID: "13f318e2-a78d-497f-bfbc-4c60d9156220"). InnerVolumeSpecName "kube-api-access-vxhz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.579180 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "13f318e2-a78d-497f-bfbc-4c60d9156220" (UID: "13f318e2-a78d-497f-bfbc-4c60d9156220"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.619228 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "13f318e2-a78d-497f-bfbc-4c60d9156220" (UID: "13f318e2-a78d-497f-bfbc-4c60d9156220"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.619599 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "13f318e2-a78d-497f-bfbc-4c60d9156220" (UID: "13f318e2-a78d-497f-bfbc-4c60d9156220"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.622411 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-inventory" (OuterVolumeSpecName: "inventory") pod "13f318e2-a78d-497f-bfbc-4c60d9156220" (UID: "13f318e2-a78d-497f-bfbc-4c60d9156220"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.670944 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.671108 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.671197 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.671271 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxhz9\" (UniqueName: \"kubernetes.io/projected/13f318e2-a78d-497f-bfbc-4c60d9156220-kube-api-access-vxhz9\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.671348 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.671420 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13f318e2-a78d-497f-bfbc-4c60d9156220-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.856729 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" event={"ID":"13f318e2-a78d-497f-bfbc-4c60d9156220","Type":"ContainerDied","Data":"e50b0327dd2efd6f8e0ff0bc77acc1b1d4df2a4d1247cc52554a573044973234"} Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.856774 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e50b0327dd2efd6f8e0ff0bc77acc1b1d4df2a4d1247cc52554a573044973234" Mar 08 21:53:48 crc kubenswrapper[4885]: I0308 21:53:48.857333 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-f6sk8" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.022065 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx"] Mar 08 21:53:49 crc kubenswrapper[4885]: E0308 21:53:49.022748 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="extract-content" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.022768 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="extract-content" Mar 08 21:53:49 crc kubenswrapper[4885]: E0308 21:53:49.022783 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="registry-server" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.022792 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="registry-server" Mar 08 21:53:49 crc kubenswrapper[4885]: E0308 21:53:49.022841 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13f318e2-a78d-497f-bfbc-4c60d9156220" containerName="neutron-sriov-openstack-openstack-cell1" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.022851 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="13f318e2-a78d-497f-bfbc-4c60d9156220" containerName="neutron-sriov-openstack-openstack-cell1" Mar 08 21:53:49 crc kubenswrapper[4885]: E0308 21:53:49.022866 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="extract-utilities" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.022874 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="extract-utilities" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.023158 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="13f318e2-a78d-497f-bfbc-4c60d9156220" containerName="neutron-sriov-openstack-openstack-cell1" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.023182 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c91d099-582a-47b1-b6b3-a403a4cdd428" containerName="registry-server" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.024003 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.028639 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.029269 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.029734 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.030693 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.032386 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.060854 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx"] Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.080762 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.080831 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.080931 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7h4f\" (UniqueName: \"kubernetes.io/projected/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-kube-api-access-s7h4f\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.081219 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.081277 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.081316 4885 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.183551 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7h4f\" (UniqueName: \"kubernetes.io/projected/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-kube-api-access-s7h4f\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.183677 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.183701 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.183730 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.183774 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.183811 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.188243 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.188523 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: 
\"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ssh-key-openstack-cell1\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.189214 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.192351 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.192987 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.203122 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7h4f\" (UniqueName: \"kubernetes.io/projected/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-kube-api-access-s7h4f\") pod \"neutron-dhcp-openstack-openstack-cell1-qt4fx\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:49 crc kubenswrapper[4885]: I0308 21:53:49.380406 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:53:50 crc kubenswrapper[4885]: I0308 21:53:50.079235 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx"] Mar 08 21:53:50 crc kubenswrapper[4885]: I0308 21:53:50.885058 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" event={"ID":"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9","Type":"ContainerStarted","Data":"d0bc6f58814aadfc21bb336d61fa5e9392cab32e2c3eaf672c0503eb5cce34c5"} Mar 08 21:53:51 crc kubenswrapper[4885]: I0308 21:53:51.899514 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" event={"ID":"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9","Type":"ContainerStarted","Data":"3181598c10cc01d3050bcca680be2ade2157049e157e5e864164210f5ef985ee"} Mar 08 21:53:51 crc kubenswrapper[4885]: I0308 21:53:51.929224 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" podStartSLOduration=2.127837422 podStartE2EDuration="2.929196238s" podCreationTimestamp="2026-03-08 21:53:49 +0000 UTC" firstStartedPulling="2026-03-08 21:53:50.0752158 +0000 UTC m=+8531.471269823" lastFinishedPulling="2026-03-08 21:53:50.876574606 +0000 UTC m=+8532.272628639" observedRunningTime="2026-03-08 21:53:51.921685067 +0000 UTC m=+8533.317739110" watchObservedRunningTime="2026-03-08 21:53:51.929196238 +0000 UTC m=+8533.325250281" Mar 08 21:53:52 crc kubenswrapper[4885]: I0308 21:53:52.369000 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:53:52 crc kubenswrapper[4885]: E0308 21:53:52.369782 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.156250 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550114-tw65c"] Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.159444 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550114-tw65c" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.162674 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.162844 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.163174 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.172584 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550114-tw65c"] Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.270918 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q94gg\" (UniqueName: \"kubernetes.io/projected/28077c49-e447-4b53-ab0a-078b678e322e-kube-api-access-q94gg\") pod \"auto-csr-approver-29550114-tw65c\" (UID: \"28077c49-e447-4b53-ab0a-078b678e322e\") " pod="openshift-infra/auto-csr-approver-29550114-tw65c" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.373135 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q94gg\" (UniqueName: \"kubernetes.io/projected/28077c49-e447-4b53-ab0a-078b678e322e-kube-api-access-q94gg\") pod \"auto-csr-approver-29550114-tw65c\" (UID: \"28077c49-e447-4b53-ab0a-078b678e322e\") " pod="openshift-infra/auto-csr-approver-29550114-tw65c" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.395074 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q94gg\" (UniqueName: \"kubernetes.io/projected/28077c49-e447-4b53-ab0a-078b678e322e-kube-api-access-q94gg\") pod \"auto-csr-approver-29550114-tw65c\" (UID: \"28077c49-e447-4b53-ab0a-078b678e322e\") " pod="openshift-infra/auto-csr-approver-29550114-tw65c" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.483726 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550114-tw65c" Mar 08 21:54:00 crc kubenswrapper[4885]: I0308 21:54:00.996939 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550114-tw65c"] Mar 08 21:54:01 crc kubenswrapper[4885]: I0308 21:54:01.045580 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550114-tw65c" event={"ID":"28077c49-e447-4b53-ab0a-078b678e322e","Type":"ContainerStarted","Data":"306c2e12b1fb52993479212fb66225bdc6e39d380b75530d0618372e90263d61"} Mar 08 21:54:03 crc kubenswrapper[4885]: I0308 21:54:03.069537 4885 generic.go:334] "Generic (PLEG): container finished" podID="28077c49-e447-4b53-ab0a-078b678e322e" containerID="3ee4d3c132930646f693aee747f5e8b449d0c9e50fb9f8986810b596ef2d993d" exitCode=0 Mar 08 21:54:03 crc kubenswrapper[4885]: I0308 21:54:03.070005 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550114-tw65c" event={"ID":"28077c49-e447-4b53-ab0a-078b678e322e","Type":"ContainerDied","Data":"3ee4d3c132930646f693aee747f5e8b449d0c9e50fb9f8986810b596ef2d993d"} Mar 08 21:54:04 crc kubenswrapper[4885]: I0308 21:54:04.515007 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550114-tw65c" Mar 08 21:54:04 crc kubenswrapper[4885]: I0308 21:54:04.591757 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q94gg\" (UniqueName: \"kubernetes.io/projected/28077c49-e447-4b53-ab0a-078b678e322e-kube-api-access-q94gg\") pod \"28077c49-e447-4b53-ab0a-078b678e322e\" (UID: \"28077c49-e447-4b53-ab0a-078b678e322e\") " Mar 08 21:54:04 crc kubenswrapper[4885]: I0308 21:54:04.599810 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28077c49-e447-4b53-ab0a-078b678e322e-kube-api-access-q94gg" (OuterVolumeSpecName: "kube-api-access-q94gg") pod "28077c49-e447-4b53-ab0a-078b678e322e" (UID: "28077c49-e447-4b53-ab0a-078b678e322e"). InnerVolumeSpecName "kube-api-access-q94gg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:54:04 crc kubenswrapper[4885]: I0308 21:54:04.694637 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q94gg\" (UniqueName: \"kubernetes.io/projected/28077c49-e447-4b53-ab0a-078b678e322e-kube-api-access-q94gg\") on node \"crc\" DevicePath \"\"" Mar 08 21:54:05 crc kubenswrapper[4885]: I0308 21:54:05.091531 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550114-tw65c" event={"ID":"28077c49-e447-4b53-ab0a-078b678e322e","Type":"ContainerDied","Data":"306c2e12b1fb52993479212fb66225bdc6e39d380b75530d0618372e90263d61"} Mar 08 21:54:05 crc kubenswrapper[4885]: I0308 21:54:05.091574 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="306c2e12b1fb52993479212fb66225bdc6e39d380b75530d0618372e90263d61" Mar 08 21:54:05 crc kubenswrapper[4885]: I0308 21:54:05.091634 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550114-tw65c" Mar 08 21:54:05 crc kubenswrapper[4885]: I0308 21:54:05.623331 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550108-z4l8k"] Mar 08 21:54:05 crc kubenswrapper[4885]: I0308 21:54:05.635505 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550108-z4l8k"] Mar 08 21:54:07 crc kubenswrapper[4885]: I0308 21:54:07.371291 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 21:54:07 crc kubenswrapper[4885]: I0308 21:54:07.395081 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea33e917-a39b-4c83-b80a-9562ddbc2459" path="/var/lib/kubelet/pods/ea33e917-a39b-4c83-b80a-9562ddbc2459/volumes" Mar 08 21:54:08 crc kubenswrapper[4885]: I0308 21:54:08.128845 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"22f99b9a0135e9d4a897236bc6328dc3a4eed4231aa01d92b86545edeafa5622"} Mar 08 21:54:13 crc kubenswrapper[4885]: I0308 21:54:13.955528 4885 scope.go:117] "RemoveContainer" containerID="d4e92452936a7719768a56376a13fed75b8884b1fc954bbcc0c2fcc22f2c6332" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.105852 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7wdkz"] Mar 08 21:54:43 crc kubenswrapper[4885]: E0308 21:54:43.106854 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28077c49-e447-4b53-ab0a-078b678e322e" containerName="oc" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.106868 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="28077c49-e447-4b53-ab0a-078b678e322e" containerName="oc" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.107167 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="28077c49-e447-4b53-ab0a-078b678e322e" containerName="oc" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.109095 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.122640 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7wdkz"] Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.173992 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fllwn\" (UniqueName: \"kubernetes.io/projected/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-kube-api-access-fllwn\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.174063 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-utilities\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.174110 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-catalog-content\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.276486 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fllwn\" (UniqueName: \"kubernetes.io/projected/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-kube-api-access-fllwn\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.276557 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-utilities\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.276596 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-catalog-content\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.277113 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-utilities\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.277369 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-catalog-content\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.304128 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fllwn\" (UniqueName: \"kubernetes.io/projected/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-kube-api-access-fllwn\") pod \"certified-operators-7wdkz\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:43 crc kubenswrapper[4885]: I0308 21:54:43.442162 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:44 crc kubenswrapper[4885]: I0308 21:54:44.006335 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7wdkz"] Mar 08 21:54:44 crc kubenswrapper[4885]: I0308 21:54:44.631378 4885 generic.go:334] "Generic (PLEG): container finished" podID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerID="9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494" exitCode=0 Mar 08 21:54:44 crc kubenswrapper[4885]: I0308 21:54:44.631486 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wdkz" event={"ID":"bc4b25e1-fca9-4293-bc47-4a56fa5be25e","Type":"ContainerDied","Data":"9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494"} Mar 08 21:54:44 crc kubenswrapper[4885]: I0308 21:54:44.631720 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wdkz" event={"ID":"bc4b25e1-fca9-4293-bc47-4a56fa5be25e","Type":"ContainerStarted","Data":"cb90bf0b0e93febc0744f518d6e1727e51ee94fb0c9756ce8dc19a061955a61c"} Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.514759 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tvp9l"] Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.519854 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.533887 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tvp9l"] Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.534744 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n2vc\" (UniqueName: \"kubernetes.io/projected/4eb826cc-db9d-4f89-9736-2365672249ac-kube-api-access-5n2vc\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.534851 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-catalog-content\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.535059 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-utilities\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.637312 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n2vc\" (UniqueName: \"kubernetes.io/projected/4eb826cc-db9d-4f89-9736-2365672249ac-kube-api-access-5n2vc\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.637375 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-catalog-content\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.637452 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-utilities\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.638023 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-catalog-content\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.638163 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-utilities\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.646627 4885 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wdkz" event={"ID":"bc4b25e1-fca9-4293-bc47-4a56fa5be25e","Type":"ContainerStarted","Data":"06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a"} Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.663039 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n2vc\" (UniqueName: \"kubernetes.io/projected/4eb826cc-db9d-4f89-9736-2365672249ac-kube-api-access-5n2vc\") pod \"community-operators-tvp9l\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:45 crc kubenswrapper[4885]: I0308 21:54:45.850442 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:46 crc kubenswrapper[4885]: I0308 21:54:46.473737 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tvp9l"] Mar 08 21:54:46 crc kubenswrapper[4885]: I0308 21:54:46.662108 4885 generic.go:334] "Generic (PLEG): container finished" podID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerID="06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a" exitCode=0 Mar 08 21:54:46 crc kubenswrapper[4885]: I0308 21:54:46.662184 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wdkz" event={"ID":"bc4b25e1-fca9-4293-bc47-4a56fa5be25e","Type":"ContainerDied","Data":"06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a"} Mar 08 21:54:46 crc kubenswrapper[4885]: I0308 21:54:46.664587 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvp9l" event={"ID":"4eb826cc-db9d-4f89-9736-2365672249ac","Type":"ContainerStarted","Data":"4e025ae4b6cfd5a9631429dbc5ad8db5f4f6ec52af704b61dc5685dd6b7fdcf7"} Mar 08 21:54:47 crc kubenswrapper[4885]: I0308 21:54:47.683324 4885 generic.go:334] "Generic (PLEG): container finished" podID="4eb826cc-db9d-4f89-9736-2365672249ac" containerID="ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9" exitCode=0 Mar 08 21:54:47 crc kubenswrapper[4885]: I0308 21:54:47.685215 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvp9l" event={"ID":"4eb826cc-db9d-4f89-9736-2365672249ac","Type":"ContainerDied","Data":"ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9"} Mar 08 21:54:47 crc kubenswrapper[4885]: I0308 21:54:47.702289 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wdkz" event={"ID":"bc4b25e1-fca9-4293-bc47-4a56fa5be25e","Type":"ContainerStarted","Data":"2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4"} Mar 08 21:54:47 crc kubenswrapper[4885]: I0308 21:54:47.730813 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7wdkz" podStartSLOduration=2.285000941 podStartE2EDuration="4.730794784s" podCreationTimestamp="2026-03-08 21:54:43 +0000 UTC" firstStartedPulling="2026-03-08 21:54:44.634241944 +0000 UTC m=+8586.030296007" lastFinishedPulling="2026-03-08 21:54:47.080035837 +0000 UTC m=+8588.476089850" observedRunningTime="2026-03-08 21:54:47.730449855 +0000 UTC m=+8589.126503868" watchObservedRunningTime="2026-03-08 21:54:47.730794784 +0000 UTC m=+8589.126848807" Mar 08 21:54:49 crc kubenswrapper[4885]: I0308 21:54:49.724772 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-tvp9l" event={"ID":"4eb826cc-db9d-4f89-9736-2365672249ac","Type":"ContainerStarted","Data":"10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018"} Mar 08 21:54:51 crc kubenswrapper[4885]: I0308 21:54:51.752331 4885 generic.go:334] "Generic (PLEG): container finished" podID="4eb826cc-db9d-4f89-9736-2365672249ac" containerID="10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018" exitCode=0 Mar 08 21:54:51 crc kubenswrapper[4885]: I0308 21:54:51.752429 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvp9l" event={"ID":"4eb826cc-db9d-4f89-9736-2365672249ac","Type":"ContainerDied","Data":"10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018"} Mar 08 21:54:52 crc kubenswrapper[4885]: I0308 21:54:52.773945 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvp9l" event={"ID":"4eb826cc-db9d-4f89-9736-2365672249ac","Type":"ContainerStarted","Data":"26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca"} Mar 08 21:54:52 crc kubenswrapper[4885]: I0308 21:54:52.824697 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tvp9l" podStartSLOduration=3.363874249 podStartE2EDuration="7.824670267s" podCreationTimestamp="2026-03-08 21:54:45 +0000 UTC" firstStartedPulling="2026-03-08 21:54:47.68754081 +0000 UTC m=+8589.083594863" lastFinishedPulling="2026-03-08 21:54:52.148336828 +0000 UTC m=+8593.544390881" observedRunningTime="2026-03-08 21:54:52.811686691 +0000 UTC m=+8594.207740744" watchObservedRunningTime="2026-03-08 21:54:52.824670267 +0000 UTC m=+8594.220724330" Mar 08 21:54:53 crc kubenswrapper[4885]: I0308 21:54:53.443372 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:53 crc kubenswrapper[4885]: I0308 21:54:53.443635 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:53 crc kubenswrapper[4885]: I0308 21:54:53.508530 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:53 crc kubenswrapper[4885]: I0308 21:54:53.856213 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:55 crc kubenswrapper[4885]: I0308 21:54:55.502361 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7wdkz"] Mar 08 21:54:55 crc kubenswrapper[4885]: I0308 21:54:55.810017 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7wdkz" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="registry-server" containerID="cri-o://2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4" gracePeriod=2 Mar 08 21:54:55 crc kubenswrapper[4885]: I0308 21:54:55.850801 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:55 crc kubenswrapper[4885]: I0308 21:54:55.850855 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:55 crc kubenswrapper[4885]: I0308 21:54:55.915074 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.412895 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.500869 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-catalog-content\") pod \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.501101 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-utilities\") pod \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.501126 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fllwn\" (UniqueName: \"kubernetes.io/projected/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-kube-api-access-fllwn\") pod \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\" (UID: \"bc4b25e1-fca9-4293-bc47-4a56fa5be25e\") " Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.501774 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-utilities" (OuterVolumeSpecName: "utilities") pod "bc4b25e1-fca9-4293-bc47-4a56fa5be25e" (UID: "bc4b25e1-fca9-4293-bc47-4a56fa5be25e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.507134 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-kube-api-access-fllwn" (OuterVolumeSpecName: "kube-api-access-fllwn") pod "bc4b25e1-fca9-4293-bc47-4a56fa5be25e" (UID: "bc4b25e1-fca9-4293-bc47-4a56fa5be25e"). InnerVolumeSpecName "kube-api-access-fllwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.578426 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc4b25e1-fca9-4293-bc47-4a56fa5be25e" (UID: "bc4b25e1-fca9-4293-bc47-4a56fa5be25e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.603764 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.603806 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.603821 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fllwn\" (UniqueName: \"kubernetes.io/projected/bc4b25e1-fca9-4293-bc47-4a56fa5be25e-kube-api-access-fllwn\") on node \"crc\" DevicePath \"\"" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.820850 4885 generic.go:334] "Generic (PLEG): container finished" podID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerID="2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4" exitCode=0 Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.822584 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7wdkz" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.823001 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wdkz" event={"ID":"bc4b25e1-fca9-4293-bc47-4a56fa5be25e","Type":"ContainerDied","Data":"2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4"} Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.823035 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7wdkz" event={"ID":"bc4b25e1-fca9-4293-bc47-4a56fa5be25e","Type":"ContainerDied","Data":"cb90bf0b0e93febc0744f518d6e1727e51ee94fb0c9756ce8dc19a061955a61c"} Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.823163 4885 scope.go:117] "RemoveContainer" containerID="2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.844193 4885 scope.go:117] "RemoveContainer" containerID="06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.865328 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7wdkz"] Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.875197 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7wdkz"] Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.891950 4885 scope.go:117] "RemoveContainer" containerID="9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.937709 4885 scope.go:117] "RemoveContainer" containerID="2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4" Mar 08 21:54:56 crc kubenswrapper[4885]: E0308 21:54:56.938266 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4\": container with ID starting with 2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4 not found: ID does not exist" containerID="2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.938375 
4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4"} err="failed to get container status \"2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4\": rpc error: code = NotFound desc = could not find container \"2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4\": container with ID starting with 2457850b5dde3bce1912d3f354002e872cf7437620fb5de94e231e6cc94772f4 not found: ID does not exist" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.938450 4885 scope.go:117] "RemoveContainer" containerID="06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a" Mar 08 21:54:56 crc kubenswrapper[4885]: E0308 21:54:56.938833 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a\": container with ID starting with 06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a not found: ID does not exist" containerID="06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.938988 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a"} err="failed to get container status \"06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a\": rpc error: code = NotFound desc = could not find container \"06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a\": container with ID starting with 06685dfe9d1999e7b7c8e6e39e8c071215f7130efc5c68ed9518aada440ec13a not found: ID does not exist" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.939083 4885 scope.go:117] "RemoveContainer" containerID="9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494" Mar 08 21:54:56 crc kubenswrapper[4885]: E0308 21:54:56.939656 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494\": container with ID starting with 9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494 not found: ID does not exist" containerID="9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494" Mar 08 21:54:56 crc kubenswrapper[4885]: I0308 21:54:56.939773 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494"} err="failed to get container status \"9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494\": rpc error: code = NotFound desc = could not find container \"9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494\": container with ID starting with 9e4eee21d218c5d346925b402e9719d3d3bb05e3b4dbda4c3057ca3f96575494 not found: ID does not exist" Mar 08 21:54:57 crc kubenswrapper[4885]: I0308 21:54:57.390617 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" path="/var/lib/kubelet/pods/bc4b25e1-fca9-4293-bc47-4a56fa5be25e/volumes" Mar 08 21:55:05 crc kubenswrapper[4885]: I0308 21:55:05.928578 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:55:06 crc kubenswrapper[4885]: I0308 21:55:06.032493 4885 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/community-operators-tvp9l"] Mar 08 21:55:06 crc kubenswrapper[4885]: I0308 21:55:06.963895 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tvp9l" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="registry-server" containerID="cri-o://26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca" gracePeriod=2 Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.616939 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.684279 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n2vc\" (UniqueName: \"kubernetes.io/projected/4eb826cc-db9d-4f89-9736-2365672249ac-kube-api-access-5n2vc\") pod \"4eb826cc-db9d-4f89-9736-2365672249ac\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.684601 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-catalog-content\") pod \"4eb826cc-db9d-4f89-9736-2365672249ac\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.684749 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-utilities\") pod \"4eb826cc-db9d-4f89-9736-2365672249ac\" (UID: \"4eb826cc-db9d-4f89-9736-2365672249ac\") " Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.685736 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-utilities" (OuterVolumeSpecName: "utilities") pod "4eb826cc-db9d-4f89-9736-2365672249ac" (UID: "4eb826cc-db9d-4f89-9736-2365672249ac"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.690329 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eb826cc-db9d-4f89-9736-2365672249ac-kube-api-access-5n2vc" (OuterVolumeSpecName: "kube-api-access-5n2vc") pod "4eb826cc-db9d-4f89-9736-2365672249ac" (UID: "4eb826cc-db9d-4f89-9736-2365672249ac"). InnerVolumeSpecName "kube-api-access-5n2vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.739052 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4eb826cc-db9d-4f89-9736-2365672249ac" (UID: "4eb826cc-db9d-4f89-9736-2365672249ac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.788330 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n2vc\" (UniqueName: \"kubernetes.io/projected/4eb826cc-db9d-4f89-9736-2365672249ac-kube-api-access-5n2vc\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.788391 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:07 crc kubenswrapper[4885]: I0308 21:55:07.788411 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eb826cc-db9d-4f89-9736-2365672249ac-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:07.997602 4885 generic.go:334] "Generic (PLEG): container finished" podID="4eb826cc-db9d-4f89-9736-2365672249ac" containerID="26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca" exitCode=0 Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:07.998198 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tvp9l" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:07.998218 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvp9l" event={"ID":"4eb826cc-db9d-4f89-9736-2365672249ac","Type":"ContainerDied","Data":"26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca"} Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.005425 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tvp9l" event={"ID":"4eb826cc-db9d-4f89-9736-2365672249ac","Type":"ContainerDied","Data":"4e025ae4b6cfd5a9631429dbc5ad8db5f4f6ec52af704b61dc5685dd6b7fdcf7"} Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.005490 4885 scope.go:117] "RemoveContainer" containerID="26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.059330 4885 scope.go:117] "RemoveContainer" containerID="10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.063320 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tvp9l"] Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.075209 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tvp9l"] Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.099525 4885 scope.go:117] "RemoveContainer" containerID="ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.140345 4885 scope.go:117] "RemoveContainer" containerID="26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca" Mar 08 21:55:08 crc kubenswrapper[4885]: E0308 21:55:08.140713 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca\": container with ID starting with 26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca not found: ID does not exist" containerID="26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.140755 
4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca"} err="failed to get container status \"26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca\": rpc error: code = NotFound desc = could not find container \"26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca\": container with ID starting with 26d940f27e26ac83fcb8b35a1df6cc1b6b504685e870bcd4d50e749ecc7842ca not found: ID does not exist" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.140807 4885 scope.go:117] "RemoveContainer" containerID="10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018" Mar 08 21:55:08 crc kubenswrapper[4885]: E0308 21:55:08.141057 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018\": container with ID starting with 10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018 not found: ID does not exist" containerID="10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.141099 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018"} err="failed to get container status \"10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018\": rpc error: code = NotFound desc = could not find container \"10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018\": container with ID starting with 10dfabb2b392b2515e15d0baa962e39dccaf131936bc8e54a001e45723c69018 not found: ID does not exist" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.141118 4885 scope.go:117] "RemoveContainer" containerID="ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9" Mar 08 21:55:08 crc kubenswrapper[4885]: E0308 21:55:08.141356 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9\": container with ID starting with ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9 not found: ID does not exist" containerID="ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9" Mar 08 21:55:08 crc kubenswrapper[4885]: I0308 21:55:08.141397 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9"} err="failed to get container status \"ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9\": rpc error: code = NotFound desc = could not find container \"ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9\": container with ID starting with ac42d77912841375162c1367e78f2583c82041163db56e639169d9eccd2f95f9 not found: ID does not exist" Mar 08 21:55:09 crc kubenswrapper[4885]: I0308 21:55:09.387698 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" path="/var/lib/kubelet/pods/4eb826cc-db9d-4f89-9736-2365672249ac/volumes" Mar 08 21:55:12 crc kubenswrapper[4885]: I0308 21:55:12.055691 4885 generic.go:334] "Generic (PLEG): container finished" podID="b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" containerID="3181598c10cc01d3050bcca680be2ade2157049e157e5e864164210f5ef985ee" exitCode=0 Mar 08 21:55:12 crc kubenswrapper[4885]: 
I0308 21:55:12.055810 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" event={"ID":"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9","Type":"ContainerDied","Data":"3181598c10cc01d3050bcca680be2ade2157049e157e5e864164210f5ef985ee"} Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.586640 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.636061 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7h4f\" (UniqueName: \"kubernetes.io/projected/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-kube-api-access-s7h4f\") pod \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.636391 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-inventory\") pod \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.636513 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ceph\") pod \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.636550 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-combined-ca-bundle\") pod \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.636599 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ssh-key-openstack-cell1\") pod \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.636683 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-agent-neutron-config-0\") pod \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\" (UID: \"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9\") " Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.643048 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ceph" (OuterVolumeSpecName: "ceph") pod "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" (UID: "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.646243 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" (UID: "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.661154 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-kube-api-access-s7h4f" (OuterVolumeSpecName: "kube-api-access-s7h4f") pod "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" (UID: "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9"). InnerVolumeSpecName "kube-api-access-s7h4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.666117 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-inventory" (OuterVolumeSpecName: "inventory") pod "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" (UID: "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.675242 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" (UID: "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.683746 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" (UID: "b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.740423 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7h4f\" (UniqueName: \"kubernetes.io/projected/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-kube-api-access-s7h4f\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.740454 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.740465 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.740475 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.740485 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:13 crc kubenswrapper[4885]: I0308 21:55:13.740494 4885 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:14 crc kubenswrapper[4885]: I0308 21:55:14.081998 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" event={"ID":"b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9","Type":"ContainerDied","Data":"d0bc6f58814aadfc21bb336d61fa5e9392cab32e2c3eaf672c0503eb5cce34c5"} Mar 08 21:55:14 crc kubenswrapper[4885]: I0308 21:55:14.082073 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-qt4fx" Mar 08 21:55:14 crc kubenswrapper[4885]: I0308 21:55:14.082097 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0bc6f58814aadfc21bb336d61fa5e9392cab32e2c3eaf672c0503eb5cce34c5" Mar 08 21:55:36 crc kubenswrapper[4885]: I0308 21:55:36.575061 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:55:36 crc kubenswrapper[4885]: I0308 21:55:36.575913 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="b042d37f-f908-40b8-88be-21798a9428f6" containerName="nova-cell0-conductor-conductor" containerID="cri-o://3eb13c978d994d153edf32828a7f7fe52fe2b797781fd84bb013541ebf4584bf" gracePeriod=30 Mar 08 21:55:36 crc kubenswrapper[4885]: I0308 21:55:36.615593 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:55:36 crc kubenswrapper[4885]: I0308 21:55:36.616284 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="95d9d37c-0204-47e9-956d-d93f2dd1e94d" containerName="nova-cell1-conductor-conductor" containerID="cri-o://a2acd6bf101df1fb3fec18f77f85983c836074e8e9c6420f407dfa0fe15f85c1" gracePeriod=30 Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.243740 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.243994 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e5c41752-6a6f-4bbf-882f-a1e873cd225f" containerName="nova-scheduler-scheduler" containerID="cri-o://949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" gracePeriod=30 Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.331613 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.332093 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-log" containerID="cri-o://2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea" gracePeriod=30 Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.332137 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-api" containerID="cri-o://cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be" gracePeriod=30 Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.348198 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.348763 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-log" containerID="cri-o://37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1" gracePeriod=30 Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.349218 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-metadata" containerID="cri-o://88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc" 
gracePeriod=30 Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.413289 4885 generic.go:334] "Generic (PLEG): container finished" podID="95d9d37c-0204-47e9-956d-d93f2dd1e94d" containerID="a2acd6bf101df1fb3fec18f77f85983c836074e8e9c6420f407dfa0fe15f85c1" exitCode=0 Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.413336 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"95d9d37c-0204-47e9-956d-d93f2dd1e94d","Type":"ContainerDied","Data":"a2acd6bf101df1fb3fec18f77f85983c836074e8e9c6420f407dfa0fe15f85c1"} Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.813141 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.910891 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vfz6\" (UniqueName: \"kubernetes.io/projected/95d9d37c-0204-47e9-956d-d93f2dd1e94d-kube-api-access-5vfz6\") pod \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.911035 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-combined-ca-bundle\") pod \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.911117 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-config-data\") pod \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\" (UID: \"95d9d37c-0204-47e9-956d-d93f2dd1e94d\") " Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.924703 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d9d37c-0204-47e9-956d-d93f2dd1e94d-kube-api-access-5vfz6" (OuterVolumeSpecName: "kube-api-access-5vfz6") pod "95d9d37c-0204-47e9-956d-d93f2dd1e94d" (UID: "95d9d37c-0204-47e9-956d-d93f2dd1e94d"). InnerVolumeSpecName "kube-api-access-5vfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.939079 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-config-data" (OuterVolumeSpecName: "config-data") pod "95d9d37c-0204-47e9-956d-d93f2dd1e94d" (UID: "95d9d37c-0204-47e9-956d-d93f2dd1e94d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:37 crc kubenswrapper[4885]: I0308 21:55:37.945023 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95d9d37c-0204-47e9-956d-d93f2dd1e94d" (UID: "95d9d37c-0204-47e9-956d-d93f2dd1e94d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.014336 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.014369 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95d9d37c-0204-47e9-956d-d93f2dd1e94d-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.014378 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vfz6\" (UniqueName: \"kubernetes.io/projected/95d9d37c-0204-47e9-956d-d93f2dd1e94d-kube-api-access-5vfz6\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.424285 4885 generic.go:334] "Generic (PLEG): container finished" podID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerID="2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea" exitCode=143 Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.424366 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f008edbb-a92d-45b1-ab9d-a56978d20e75","Type":"ContainerDied","Data":"2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea"} Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.426967 4885 generic.go:334] "Generic (PLEG): container finished" podID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerID="37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1" exitCode=143 Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.427032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a","Type":"ContainerDied","Data":"37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1"} Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.428262 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"95d9d37c-0204-47e9-956d-d93f2dd1e94d","Type":"ContainerDied","Data":"4989a2ecee24254fe4aca7e76f1e2e40b733be1e70cbeec7d60131da54ab2fd4"} Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.428291 4885 scope.go:117] "RemoveContainer" containerID="a2acd6bf101df1fb3fec18f77f85983c836074e8e9c6420f407dfa0fe15f85c1" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.428326 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.475547 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.491739 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503253 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503763 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="extract-content" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503781 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="extract-content" Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503804 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95d9d37c-0204-47e9-956d-d93f2dd1e94d" containerName="nova-cell1-conductor-conductor" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503814 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d9d37c-0204-47e9-956d-d93f2dd1e94d" containerName="nova-cell1-conductor-conductor" Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503826 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="registry-server" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503831 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="registry-server" Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503843 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="registry-server" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503849 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="registry-server" Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503862 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="extract-content" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503868 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="extract-content" Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503885 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503891 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503902 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="extract-utilities" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503908 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="extract-utilities" Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.503945 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="extract-utilities" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.503953 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="extract-utilities" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.504193 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eb826cc-db9d-4f89-9736-2365672249ac" containerName="registry-server" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.504215 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9" containerName="neutron-dhcp-openstack-openstack-cell1" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.504271 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc4b25e1-fca9-4293-bc47-4a56fa5be25e" containerName="registry-server" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.504285 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="95d9d37c-0204-47e9-956d-d93f2dd1e94d" containerName="nova-cell1-conductor-conductor" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.505116 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.507429 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.514433 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.629283 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9w8v\" (UniqueName: \"kubernetes.io/projected/390628a6-50b8-491e-bc5d-80a524b67be6-kube-api-access-m9w8v\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.629522 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390628a6-50b8-491e-bc5d-80a524b67be6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.629606 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/390628a6-50b8-491e-bc5d-80a524b67be6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.732021 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390628a6-50b8-491e-bc5d-80a524b67be6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.732113 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/390628a6-50b8-491e-bc5d-80a524b67be6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " 
pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.732183 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9w8v\" (UniqueName: \"kubernetes.io/projected/390628a6-50b8-491e-bc5d-80a524b67be6-kube-api-access-m9w8v\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.736740 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/390628a6-50b8-491e-bc5d-80a524b67be6-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.737444 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390628a6-50b8-491e-bc5d-80a524b67be6-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.754651 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9w8v\" (UniqueName: \"kubernetes.io/projected/390628a6-50b8-491e-bc5d-80a524b67be6-kube-api-access-m9w8v\") pod \"nova-cell1-conductor-0\" (UID: \"390628a6-50b8-491e-bc5d-80a524b67be6\") " pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.766745 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.768745 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.769999 4885 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 08 21:55:38 crc kubenswrapper[4885]: E0308 21:55:38.770045 4885 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e5c41752-6a6f-4bbf-882f-a1e873cd225f" containerName="nova-scheduler-scheduler" Mar 08 21:55:38 crc kubenswrapper[4885]: I0308 21:55:38.825405 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.298130 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.392639 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95d9d37c-0204-47e9-956d-d93f2dd1e94d" path="/var/lib/kubelet/pods/95d9d37c-0204-47e9-956d-d93f2dd1e94d/volumes" Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.446044 4885 generic.go:334] "Generic (PLEG): container finished" podID="b042d37f-f908-40b8-88be-21798a9428f6" containerID="3eb13c978d994d153edf32828a7f7fe52fe2b797781fd84bb013541ebf4584bf" exitCode=0 Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.446081 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b042d37f-f908-40b8-88be-21798a9428f6","Type":"ContainerDied","Data":"3eb13c978d994d153edf32828a7f7fe52fe2b797781fd84bb013541ebf4584bf"} Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.447309 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"390628a6-50b8-491e-bc5d-80a524b67be6","Type":"ContainerStarted","Data":"baff6c3712c2b2039b847d6c62c829fe033c897beac1cf5de551094705fb1cad"} Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.723318 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.854744 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-combined-ca-bundle\") pod \"b042d37f-f908-40b8-88be-21798a9428f6\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.855580 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv9z6\" (UniqueName: \"kubernetes.io/projected/b042d37f-f908-40b8-88be-21798a9428f6-kube-api-access-tv9z6\") pod \"b042d37f-f908-40b8-88be-21798a9428f6\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.855696 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-config-data\") pod \"b042d37f-f908-40b8-88be-21798a9428f6\" (UID: \"b042d37f-f908-40b8-88be-21798a9428f6\") " Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.861189 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b042d37f-f908-40b8-88be-21798a9428f6-kube-api-access-tv9z6" (OuterVolumeSpecName: "kube-api-access-tv9z6") pod "b042d37f-f908-40b8-88be-21798a9428f6" (UID: "b042d37f-f908-40b8-88be-21798a9428f6"). InnerVolumeSpecName "kube-api-access-tv9z6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.887889 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-config-data" (OuterVolumeSpecName: "config-data") pod "b042d37f-f908-40b8-88be-21798a9428f6" (UID: "b042d37f-f908-40b8-88be-21798a9428f6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.908668 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b042d37f-f908-40b8-88be-21798a9428f6" (UID: "b042d37f-f908-40b8-88be-21798a9428f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.959537 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.959595 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv9z6\" (UniqueName: \"kubernetes.io/projected/b042d37f-f908-40b8-88be-21798a9428f6-kube-api-access-tv9z6\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:39 crc kubenswrapper[4885]: I0308 21:55:39.959617 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b042d37f-f908-40b8-88be-21798a9428f6-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.464121 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"390628a6-50b8-491e-bc5d-80a524b67be6","Type":"ContainerStarted","Data":"5a60c52e16059b05f1034589a8a5bab7af13e01f6b8ac1e2d16ee1100d38fd62"} Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.465820 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.479425 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b042d37f-f908-40b8-88be-21798a9428f6","Type":"ContainerDied","Data":"80d76197d607834e3f09a7896979af3b1a5308464484d654754ae837dd80a0ab"} Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.479478 4885 scope.go:117] "RemoveContainer" containerID="3eb13c978d994d153edf32828a7f7fe52fe2b797781fd84bb013541ebf4584bf" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.479572 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.497271 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.497247931 podStartE2EDuration="2.497247931s" podCreationTimestamp="2026-03-08 21:55:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:55:40.484747207 +0000 UTC m=+8641.880801240" watchObservedRunningTime="2026-03-08 21:55:40.497247931 +0000 UTC m=+8641.893301944" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.501918 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.127:8775/\": read tcp 10.217.0.2:49212->10.217.1.127:8775: read: connection reset by peer" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.502179 4885 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.127:8775/\": read tcp 10.217.0.2:49202->10.217.1.127:8775: read: connection reset by peer" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.540540 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.572001 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.579978 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:55:40 crc kubenswrapper[4885]: E0308 21:55:40.580484 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b042d37f-f908-40b8-88be-21798a9428f6" containerName="nova-cell0-conductor-conductor" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.580520 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="b042d37f-f908-40b8-88be-21798a9428f6" containerName="nova-cell0-conductor-conductor" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.580743 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="b042d37f-f908-40b8-88be-21798a9428f6" containerName="nova-cell0-conductor-conductor" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.581489 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.586507 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.590880 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.696637 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3534e95e-b33c-4294-98d0-f758ea92cf72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.696672 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cghxl\" (UniqueName: \"kubernetes.io/projected/3534e95e-b33c-4294-98d0-f758ea92cf72-kube-api-access-cghxl\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.698082 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3534e95e-b33c-4294-98d0-f758ea92cf72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.805319 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3534e95e-b33c-4294-98d0-f758ea92cf72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.805723 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cghxl\" (UniqueName: \"kubernetes.io/projected/3534e95e-b33c-4294-98d0-f758ea92cf72-kube-api-access-cghxl\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.806615 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3534e95e-b33c-4294-98d0-f758ea92cf72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.811536 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3534e95e-b33c-4294-98d0-f758ea92cf72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.840529 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3534e95e-b33c-4294-98d0-f758ea92cf72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.846579 4885 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cghxl\" (UniqueName: \"kubernetes.io/projected/3534e95e-b33c-4294-98d0-f758ea92cf72-kube-api-access-cghxl\") pod \"nova-cell0-conductor-0\" (UID: \"3534e95e-b33c-4294-98d0-f758ea92cf72\") " pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:40 crc kubenswrapper[4885]: I0308 21:55:40.946161 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.009242 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-logs\") pod \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.009708 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-config-data\") pod \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.009732 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56dvv\" (UniqueName: \"kubernetes.io/projected/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-kube-api-access-56dvv\") pod \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.010301 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-combined-ca-bundle\") pod \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\" (UID: \"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a\") " Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.010238 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-logs" (OuterVolumeSpecName: "logs") pod "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" (UID: "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.011072 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.013695 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-kube-api-access-56dvv" (OuterVolumeSpecName: "kube-api-access-56dvv") pod "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" (UID: "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a"). InnerVolumeSpecName "kube-api-access-56dvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.018875 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.048600 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-config-data" (OuterVolumeSpecName: "config-data") pod "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" (UID: "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.051993 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.053864 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" (UID: "5e9cda09-9781-4b87-bec3-9f23ea3f0e3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.112508 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-combined-ca-bundle\") pod \"f008edbb-a92d-45b1-ab9d-a56978d20e75\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.112749 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttvw5\" (UniqueName: \"kubernetes.io/projected/f008edbb-a92d-45b1-ab9d-a56978d20e75-kube-api-access-ttvw5\") pod \"f008edbb-a92d-45b1-ab9d-a56978d20e75\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.112812 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-config-data\") pod \"f008edbb-a92d-45b1-ab9d-a56978d20e75\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.112836 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f008edbb-a92d-45b1-ab9d-a56978d20e75-logs\") pod \"f008edbb-a92d-45b1-ab9d-a56978d20e75\" (UID: \"f008edbb-a92d-45b1-ab9d-a56978d20e75\") " Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.113308 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.113321 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56dvv\" (UniqueName: \"kubernetes.io/projected/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-kube-api-access-56dvv\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.113331 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.114094 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f008edbb-a92d-45b1-ab9d-a56978d20e75-logs" (OuterVolumeSpecName: "logs") pod "f008edbb-a92d-45b1-ab9d-a56978d20e75" (UID: "f008edbb-a92d-45b1-ab9d-a56978d20e75"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.123212 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f008edbb-a92d-45b1-ab9d-a56978d20e75-kube-api-access-ttvw5" (OuterVolumeSpecName: "kube-api-access-ttvw5") pod "f008edbb-a92d-45b1-ab9d-a56978d20e75" (UID: "f008edbb-a92d-45b1-ab9d-a56978d20e75"). InnerVolumeSpecName "kube-api-access-ttvw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.150034 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f008edbb-a92d-45b1-ab9d-a56978d20e75" (UID: "f008edbb-a92d-45b1-ab9d-a56978d20e75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.153868 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-config-data" (OuterVolumeSpecName: "config-data") pod "f008edbb-a92d-45b1-ab9d-a56978d20e75" (UID: "f008edbb-a92d-45b1-ab9d-a56978d20e75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.214256 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.214293 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttvw5\" (UniqueName: \"kubernetes.io/projected/f008edbb-a92d-45b1-ab9d-a56978d20e75-kube-api-access-ttvw5\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.214308 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f008edbb-a92d-45b1-ab9d-a56978d20e75-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.214319 4885 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f008edbb-a92d-45b1-ab9d-a56978d20e75-logs\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.382207 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b042d37f-f908-40b8-88be-21798a9428f6" path="/var/lib/kubelet/pods/b042d37f-f908-40b8-88be-21798a9428f6/volumes" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.493932 4885 generic.go:334] "Generic (PLEG): container finished" podID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerID="cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be" exitCode=0 Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.493991 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.494001 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f008edbb-a92d-45b1-ab9d-a56978d20e75","Type":"ContainerDied","Data":"cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be"} Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.494330 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f008edbb-a92d-45b1-ab9d-a56978d20e75","Type":"ContainerDied","Data":"e11dd41c5f061aeb15ad1b8571823004739345378aecc51a3efc7e9384ef84f7"} Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.494352 4885 scope.go:117] "RemoveContainer" containerID="cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.497360 4885 generic.go:334] "Generic (PLEG): container finished" podID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerID="88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc" exitCode=0 Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.497425 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.497440 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a","Type":"ContainerDied","Data":"88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc"} Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.497467 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e9cda09-9781-4b87-bec3-9f23ea3f0e3a","Type":"ContainerDied","Data":"d6b2799aab2f7b019fc1cf13b4754da3de421e26bd6b42741c010f54fac4b62f"} Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.534482 4885 scope.go:117] "RemoveContainer" containerID="2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.543977 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.561417 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.569159 4885 scope.go:117] "RemoveContainer" containerID="cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be" Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.570124 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be\": container with ID starting with cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be not found: ID does not exist" containerID="cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.570164 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be"} err="failed to get container status \"cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be\": rpc error: code = NotFound desc = could not find container \"cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be\": container with ID starting with cc76e0107502d309a767ee2a7a7d7c1b0c37deacc5a07881f3e56dc2932ca3be not found: ID does not 
exist" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.570188 4885 scope.go:117] "RemoveContainer" containerID="2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.571592 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.572093 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-metadata" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572105 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-metadata" Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.572130 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-api" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572137 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-api" Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.572148 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-log" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572157 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-log" Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.572183 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-log" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572189 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-log" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572383 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-log" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572404 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" containerName="nova-api-api" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572418 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-metadata" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.572433 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" containerName="nova-metadata-log" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.573569 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.574588 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea\": container with ID starting with 2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea not found: ID does not exist" containerID="2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.574619 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea"} err="failed to get container status \"2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea\": rpc error: code = NotFound desc = could not find container \"2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea\": container with ID starting with 2a4b16eb208d642f752b328c964ff08e00fee737f5e56886655a47fd68f68aea not found: ID does not exist" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.574636 4885 scope.go:117] "RemoveContainer" containerID="88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.575554 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.581492 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.592595 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.604173 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.614979 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.616980 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.618735 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.622895 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623025 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws766\" (UniqueName: \"kubernetes.io/projected/afd37ef2-90bf-4ea4-86a1-2113a005824e-kube-api-access-ws766\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623129 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2d37f17-3e80-43b1-b6e3-df2316900973-logs\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623187 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bssn4\" (UniqueName: \"kubernetes.io/projected/a2d37f17-3e80-43b1-b6e3-df2316900973-kube-api-access-bssn4\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623210 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d37f17-3e80-43b1-b6e3-df2316900973-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623269 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afd37ef2-90bf-4ea4-86a1-2113a005824e-config-data\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623307 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd37ef2-90bf-4ea4-86a1-2113a005824e-logs\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623359 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d37f17-3e80-43b1-b6e3-df2316900973-config-data\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.623431 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd37ef2-90bf-4ea4-86a1-2113a005824e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.631388 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:55:41 crc 
kubenswrapper[4885]: I0308 21:55:41.726020 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2d37f17-3e80-43b1-b6e3-df2316900973-logs\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.726087 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bssn4\" (UniqueName: \"kubernetes.io/projected/a2d37f17-3e80-43b1-b6e3-df2316900973-kube-api-access-bssn4\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.726109 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d37f17-3e80-43b1-b6e3-df2316900973-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.726134 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afd37ef2-90bf-4ea4-86a1-2113a005824e-config-data\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.726187 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd37ef2-90bf-4ea4-86a1-2113a005824e-logs\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.726215 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d37f17-3e80-43b1-b6e3-df2316900973-config-data\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.726267 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd37ef2-90bf-4ea4-86a1-2113a005824e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.726305 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws766\" (UniqueName: \"kubernetes.io/projected/afd37ef2-90bf-4ea4-86a1-2113a005824e-kube-api-access-ws766\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.727080 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd37ef2-90bf-4ea4-86a1-2113a005824e-logs\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.727417 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2d37f17-3e80-43b1-b6e3-df2316900973-logs\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.733604 4885 
scope.go:117] "RemoveContainer" containerID="37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.733854 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d37f17-3e80-43b1-b6e3-df2316900973-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.734114 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd37ef2-90bf-4ea4-86a1-2113a005824e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.734113 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d37f17-3e80-43b1-b6e3-df2316900973-config-data\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.743422 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bssn4\" (UniqueName: \"kubernetes.io/projected/a2d37f17-3e80-43b1-b6e3-df2316900973-kube-api-access-bssn4\") pod \"nova-metadata-0\" (UID: \"a2d37f17-3e80-43b1-b6e3-df2316900973\") " pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.744632 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afd37ef2-90bf-4ea4-86a1-2113a005824e-config-data\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.745088 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws766\" (UniqueName: \"kubernetes.io/projected/afd37ef2-90bf-4ea4-86a1-2113a005824e-kube-api-access-ws766\") pod \"nova-api-0\" (UID: \"afd37ef2-90bf-4ea4-86a1-2113a005824e\") " pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.835734 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.846231 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx"] Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.847503 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.850711 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.850843 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.850989 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-qjpt8" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.851132 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.855483 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.855646 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.857640 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.864751 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.881454 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx"] Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.886846 4885 scope.go:117] "RemoveContainer" containerID="88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc" Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.887461 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc\": container with ID starting with 88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc not found: ID does not exist" containerID="88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.887505 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc"} err="failed to get container status \"88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc\": rpc error: code = NotFound desc = could not find container \"88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc\": container with ID starting with 88fdab87e6c5844a24e3c50e89da3caea5a04f4145ff8ffe7c545c3e0bfddcfc not found: ID does not exist" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.887550 4885 scope.go:117] "RemoveContainer" containerID="37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1" Mar 08 21:55:41 crc kubenswrapper[4885]: E0308 21:55:41.887875 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1\": container with ID starting with 37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1 not found: ID does not exist" 
containerID="37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.887934 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1"} err="failed to get container status \"37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1\": rpc error: code = NotFound desc = could not find container \"37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1\": container with ID starting with 37e8552a32f587324b8e01404af67b4a19aa44a325dff3c372ac1a5a2c9747e1 not found: ID does not exist" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.928654 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929271 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929322 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929361 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929392 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929429 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: 
\"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929451 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929476 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929494 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwdl2\" (UniqueName: \"kubernetes.io/projected/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-kube-api-access-gwdl2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929545 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929591 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929630 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:41 crc kubenswrapper[4885]: I0308 21:55:41.929646 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036534 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036563 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036586 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036609 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036625 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwdl2\" (UniqueName: \"kubernetes.io/projected/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-kube-api-access-gwdl2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036664 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036702 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" 
(UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036740 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036757 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036808 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036829 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.036854 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.043977 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.045305 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: 
\"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.046095 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.047881 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-3\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.048307 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.049327 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ssh-key-openstack-cell1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.049356 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.051316 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.052734 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.053532 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.053954 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.062324 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.062959 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwdl2\" (UniqueName: \"kubernetes.io/projected/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-kube-api-access-gwdl2\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.186531 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.414146 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 08 21:55:42 crc kubenswrapper[4885]: W0308 21:55:42.460216 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2d37f17_3e80_43b1_b6e3_df2316900973.slice/crio-231bc1c918ac3a985eec5e5f0f1d33a6f576b1cd107749c248652be3416e9578 WatchSource:0}: Error finding container 231bc1c918ac3a985eec5e5f0f1d33a6f576b1cd107749c248652be3416e9578: Status 404 returned error can't find the container with id 231bc1c918ac3a985eec5e5f0f1d33a6f576b1cd107749c248652be3416e9578 Mar 08 21:55:42 crc kubenswrapper[4885]: W0308 21:55:42.510073 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafd37ef2_90bf_4ea4_86a1_2113a005824e.slice/crio-f7b47ae3e145a420d94469e0367830089cbe27ad3ac94bbc5ef7ada8aa2f509e WatchSource:0}: Error finding container f7b47ae3e145a420d94469e0367830089cbe27ad3ac94bbc5ef7ada8aa2f509e: Status 404 returned error can't find the container with id f7b47ae3e145a420d94469e0367830089cbe27ad3ac94bbc5ef7ada8aa2f509e Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.515942 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.525358 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a2d37f17-3e80-43b1-b6e3-df2316900973","Type":"ContainerStarted","Data":"231bc1c918ac3a985eec5e5f0f1d33a6f576b1cd107749c248652be3416e9578"} Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.528117 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3534e95e-b33c-4294-98d0-f758ea92cf72","Type":"ContainerStarted","Data":"b6747750f2088baadaef68544fc734cc86f717816cc17ca0b34ff3b054312391"} Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.528147 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3534e95e-b33c-4294-98d0-f758ea92cf72","Type":"ContainerStarted","Data":"76a16339acb7b6242d62b0f55ebec5fa9bfb3d918fc8b3c0d59cea3377bb772b"} Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.529203 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.547520 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.5473522539999998 podStartE2EDuration="2.547352254s" podCreationTimestamp="2026-03-08 21:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:55:42.547027415 +0000 UTC m=+8643.943081438" watchObservedRunningTime="2026-03-08 21:55:42.547352254 +0000 UTC m=+8643.943406277" Mar 08 21:55:42 crc kubenswrapper[4885]: I0308 21:55:42.780727 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx"] Mar 08 21:55:42 crc kubenswrapper[4885]: W0308 21:55:42.793767 4885 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ac2d268_855a_485e_a96f_87b5cc0e4f6e.slice/crio-bc1f3a241d4531503378c6a0aa816b34e9f9b27055e28fae7a7344fd598a60e2 WatchSource:0}: Error finding container bc1f3a241d4531503378c6a0aa816b34e9f9b27055e28fae7a7344fd598a60e2: Status 404 returned error can't find the container with id bc1f3a241d4531503378c6a0aa816b34e9f9b27055e28fae7a7344fd598a60e2 Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.121174 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.164184 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-combined-ca-bundle\") pod \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.164349 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz7p5\" (UniqueName: \"kubernetes.io/projected/e5c41752-6a6f-4bbf-882f-a1e873cd225f-kube-api-access-lz7p5\") pod \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.164461 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-config-data\") pod \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\" (UID: \"e5c41752-6a6f-4bbf-882f-a1e873cd225f\") " Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.169975 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5c41752-6a6f-4bbf-882f-a1e873cd225f-kube-api-access-lz7p5" (OuterVolumeSpecName: "kube-api-access-lz7p5") pod "e5c41752-6a6f-4bbf-882f-a1e873cd225f" (UID: "e5c41752-6a6f-4bbf-882f-a1e873cd225f"). InnerVolumeSpecName "kube-api-access-lz7p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.191965 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5c41752-6a6f-4bbf-882f-a1e873cd225f" (UID: "e5c41752-6a6f-4bbf-882f-a1e873cd225f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.202215 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-config-data" (OuterVolumeSpecName: "config-data") pod "e5c41752-6a6f-4bbf-882f-a1e873cd225f" (UID: "e5c41752-6a6f-4bbf-882f-a1e873cd225f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.267818 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.267858 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz7p5\" (UniqueName: \"kubernetes.io/projected/e5c41752-6a6f-4bbf-882f-a1e873cd225f-kube-api-access-lz7p5\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.267875 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5c41752-6a6f-4bbf-882f-a1e873cd225f-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.379434 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9cda09-9781-4b87-bec3-9f23ea3f0e3a" path="/var/lib/kubelet/pods/5e9cda09-9781-4b87-bec3-9f23ea3f0e3a/volumes" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.381042 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f008edbb-a92d-45b1-ab9d-a56978d20e75" path="/var/lib/kubelet/pods/f008edbb-a92d-45b1-ab9d-a56978d20e75/volumes" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.564474 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" event={"ID":"0ac2d268-855a-485e-a96f-87b5cc0e4f6e","Type":"ContainerStarted","Data":"bc1f3a241d4531503378c6a0aa816b34e9f9b27055e28fae7a7344fd598a60e2"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.565916 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"afd37ef2-90bf-4ea4-86a1-2113a005824e","Type":"ContainerStarted","Data":"c0965d3d5ce970351c46ebe5f000b7b526f00ca05bc8cac2ff9b123f0e265806"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.565979 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"afd37ef2-90bf-4ea4-86a1-2113a005824e","Type":"ContainerStarted","Data":"455a26ce7ce5a1165b370ace53bfe8b11ccdb91628aafbb1bfecdcbd431e7048"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.565988 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"afd37ef2-90bf-4ea4-86a1-2113a005824e","Type":"ContainerStarted","Data":"f7b47ae3e145a420d94469e0367830089cbe27ad3ac94bbc5ef7ada8aa2f509e"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.574137 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a2d37f17-3e80-43b1-b6e3-df2316900973","Type":"ContainerStarted","Data":"f986557e81454bf1db78eec12e02883dfa242d341cd8fe143e711a41264a5899"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.574369 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a2d37f17-3e80-43b1-b6e3-df2316900973","Type":"ContainerStarted","Data":"9c553ecbf8b599e7d8354dccb7e5f894debb74419e8225b98b8d5cf3ffbbb67b"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.580131 4885 generic.go:334] "Generic (PLEG): container finished" podID="e5c41752-6a6f-4bbf-882f-a1e873cd225f" containerID="949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" exitCode=0 Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.580219 4885 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e5c41752-6a6f-4bbf-882f-a1e873cd225f","Type":"ContainerDied","Data":"949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.580295 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e5c41752-6a6f-4bbf-882f-a1e873cd225f","Type":"ContainerDied","Data":"ef5f617fb77787529ec827122c4f886ba5c0cc15c252f2142be0c62ea54a65b1"} Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.580334 4885 scope.go:117] "RemoveContainer" containerID="949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.580413 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.605143 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.605124633 podStartE2EDuration="2.605124633s" podCreationTimestamp="2026-03-08 21:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:55:43.588684024 +0000 UTC m=+8644.984738047" watchObservedRunningTime="2026-03-08 21:55:43.605124633 +0000 UTC m=+8645.001178656" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.618548 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.618523031 podStartE2EDuration="2.618523031s" podCreationTimestamp="2026-03-08 21:55:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:55:43.618187061 +0000 UTC m=+8645.014241104" watchObservedRunningTime="2026-03-08 21:55:43.618523031 +0000 UTC m=+8645.014577054" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.620396 4885 scope.go:117] "RemoveContainer" containerID="949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" Mar 08 21:55:43 crc kubenswrapper[4885]: E0308 21:55:43.624283 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403\": container with ID starting with 949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403 not found: ID does not exist" containerID="949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.624348 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403"} err="failed to get container status \"949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403\": rpc error: code = NotFound desc = could not find container \"949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403\": container with ID starting with 949928d31b84acaff4cbb2d4fa8ed9b0ea1f944351534a1710b16cd1e34ad403 not found: ID does not exist" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.653672 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.663968 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:55:43 crc 
kubenswrapper[4885]: I0308 21:55:43.678302 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:55:43 crc kubenswrapper[4885]: E0308 21:55:43.678992 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c41752-6a6f-4bbf-882f-a1e873cd225f" containerName="nova-scheduler-scheduler" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.679062 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c41752-6a6f-4bbf-882f-a1e873cd225f" containerName="nova-scheduler-scheduler" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.679315 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c41752-6a6f-4bbf-882f-a1e873cd225f" containerName="nova-scheduler-scheduler" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.680074 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.683391 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.689902 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.774910 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.774988 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtrrt\" (UniqueName: \"kubernetes.io/projected/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-kube-api-access-dtrrt\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.775152 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-config-data\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.876997 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-config-data\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.877125 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.877162 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtrrt\" (UniqueName: \"kubernetes.io/projected/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-kube-api-access-dtrrt\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc 
kubenswrapper[4885]: I0308 21:55:43.882789 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.887514 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-config-data\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:43 crc kubenswrapper[4885]: I0308 21:55:43.902762 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtrrt\" (UniqueName: \"kubernetes.io/projected/199a5a0a-f05c-4e06-9bee-2a5d0303f3a0-kube-api-access-dtrrt\") pod \"nova-scheduler-0\" (UID: \"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0\") " pod="openstack/nova-scheduler-0" Mar 08 21:55:44 crc kubenswrapper[4885]: I0308 21:55:44.005342 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 08 21:55:44 crc kubenswrapper[4885]: I0308 21:55:44.505870 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 08 21:55:44 crc kubenswrapper[4885]: I0308 21:55:44.606821 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" event={"ID":"0ac2d268-855a-485e-a96f-87b5cc0e4f6e","Type":"ContainerStarted","Data":"5bd421a1b9eb744ee9eff61f628d2ad2fc68006fdecefae53f924670048d44f4"} Mar 08 21:55:44 crc kubenswrapper[4885]: I0308 21:55:44.608677 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0","Type":"ContainerStarted","Data":"332381c985ef489b5b2b2c43b5d02ad256fdc8ee96097b248e95d8cf8d80eac4"} Mar 08 21:55:44 crc kubenswrapper[4885]: I0308 21:55:44.642108 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" podStartSLOduration=3.174973741 podStartE2EDuration="3.642087457s" podCreationTimestamp="2026-03-08 21:55:41 +0000 UTC" firstStartedPulling="2026-03-08 21:55:42.802642977 +0000 UTC m=+8644.198697000" lastFinishedPulling="2026-03-08 21:55:43.269756693 +0000 UTC m=+8644.665810716" observedRunningTime="2026-03-08 21:55:44.629420989 +0000 UTC m=+8646.025475032" watchObservedRunningTime="2026-03-08 21:55:44.642087457 +0000 UTC m=+8646.038141480" Mar 08 21:55:45 crc kubenswrapper[4885]: I0308 21:55:45.387957 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5c41752-6a6f-4bbf-882f-a1e873cd225f" path="/var/lib/kubelet/pods/e5c41752-6a6f-4bbf-882f-a1e873cd225f/volumes" Mar 08 21:55:45 crc kubenswrapper[4885]: I0308 21:55:45.643570 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"199a5a0a-f05c-4e06-9bee-2a5d0303f3a0","Type":"ContainerStarted","Data":"546e0708e2f027193a01bfb4b1f0bf3f5cfa6cb7968bc0ce004add563a9451aa"} Mar 08 21:55:45 crc kubenswrapper[4885]: I0308 21:55:45.674248 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.674231253 podStartE2EDuration="2.674231253s" podCreationTimestamp="2026-03-08 21:55:43 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 21:55:45.663841825 +0000 UTC m=+8647.059895848" watchObservedRunningTime="2026-03-08 21:55:45.674231253 +0000 UTC m=+8647.070285276" Mar 08 21:55:46 crc kubenswrapper[4885]: I0308 21:55:46.109890 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 08 21:55:46 crc kubenswrapper[4885]: I0308 21:55:46.865531 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:55:46 crc kubenswrapper[4885]: I0308 21:55:46.865650 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 08 21:55:48 crc kubenswrapper[4885]: I0308 21:55:48.883012 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 08 21:55:49 crc kubenswrapper[4885]: I0308 21:55:49.005902 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 08 21:55:51 crc kubenswrapper[4885]: I0308 21:55:51.837593 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 21:55:51 crc kubenswrapper[4885]: I0308 21:55:51.837985 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 08 21:55:51 crc kubenswrapper[4885]: I0308 21:55:51.866009 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:55:51 crc kubenswrapper[4885]: I0308 21:55:51.866139 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 08 21:55:53 crc kubenswrapper[4885]: I0308 21:55:53.031164 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a2d37f17-3e80-43b1-b6e3-df2316900973" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.0.45:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:55:53 crc kubenswrapper[4885]: I0308 21:55:53.031532 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a2d37f17-3e80-43b1-b6e3-df2316900973" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.0.45:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:55:53 crc kubenswrapper[4885]: I0308 21:55:53.031489 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="afd37ef2-90bf-4ea4-86a1-2113a005824e" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.42:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:55:53 crc kubenswrapper[4885]: I0308 21:55:53.031896 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="afd37ef2-90bf-4ea4-86a1-2113a005824e" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.42:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 08 21:55:54 crc kubenswrapper[4885]: I0308 21:55:54.005529 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 08 21:55:54 crc kubenswrapper[4885]: I0308 21:55:54.053679 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-scheduler-0" Mar 08 21:55:54 crc kubenswrapper[4885]: I0308 21:55:54.839087 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.158801 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550116-p5zpr"] Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.161584 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550116-p5zpr" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.164968 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.165083 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.166735 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.180175 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550116-p5zpr"] Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.266323 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99zwj\" (UniqueName: \"kubernetes.io/projected/480855ed-5f7f-4fb4-99dc-ced66ce15999-kube-api-access-99zwj\") pod \"auto-csr-approver-29550116-p5zpr\" (UID: \"480855ed-5f7f-4fb4-99dc-ced66ce15999\") " pod="openshift-infra/auto-csr-approver-29550116-p5zpr" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.369242 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99zwj\" (UniqueName: \"kubernetes.io/projected/480855ed-5f7f-4fb4-99dc-ced66ce15999-kube-api-access-99zwj\") pod \"auto-csr-approver-29550116-p5zpr\" (UID: \"480855ed-5f7f-4fb4-99dc-ced66ce15999\") " pod="openshift-infra/auto-csr-approver-29550116-p5zpr" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.400138 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99zwj\" (UniqueName: \"kubernetes.io/projected/480855ed-5f7f-4fb4-99dc-ced66ce15999-kube-api-access-99zwj\") pod \"auto-csr-approver-29550116-p5zpr\" (UID: \"480855ed-5f7f-4fb4-99dc-ced66ce15999\") " pod="openshift-infra/auto-csr-approver-29550116-p5zpr" Mar 08 21:56:00 crc kubenswrapper[4885]: I0308 21:56:00.489717 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550116-p5zpr" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.078529 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550116-p5zpr"] Mar 08 21:56:01 crc kubenswrapper[4885]: W0308 21:56:01.078636 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod480855ed_5f7f_4fb4_99dc_ced66ce15999.slice/crio-676c3250378287b0099bd3737e48f719c1468510542ce234a8fb9d75ec65a1c6 WatchSource:0}: Error finding container 676c3250378287b0099bd3737e48f719c1468510542ce234a8fb9d75ec65a1c6: Status 404 returned error can't find the container with id 676c3250378287b0099bd3737e48f719c1468510542ce234a8fb9d75ec65a1c6 Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.846487 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.847829 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.853841 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.858789 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.869849 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.876423 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.885180 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.916047 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550116-p5zpr" event={"ID":"480855ed-5f7f-4fb4-99dc-ced66ce15999","Type":"ContainerStarted","Data":"676c3250378287b0099bd3737e48f719c1468510542ce234a8fb9d75ec65a1c6"} Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.916370 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 08 21:56:01 crc kubenswrapper[4885]: I0308 21:56:01.926640 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 08 21:56:02 crc kubenswrapper[4885]: I0308 21:56:02.930671 4885 generic.go:334] "Generic (PLEG): container finished" podID="480855ed-5f7f-4fb4-99dc-ced66ce15999" containerID="6341a20e412f738fa67c5354da928321dc5ae4ed993b46a3d4ae33371480585f" exitCode=0 Mar 08 21:56:02 crc kubenswrapper[4885]: I0308 21:56:02.930760 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550116-p5zpr" event={"ID":"480855ed-5f7f-4fb4-99dc-ced66ce15999","Type":"ContainerDied","Data":"6341a20e412f738fa67c5354da928321dc5ae4ed993b46a3d4ae33371480585f"} Mar 08 21:56:02 crc kubenswrapper[4885]: I0308 21:56:02.933460 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 08 21:56:04 crc kubenswrapper[4885]: I0308 21:56:04.445947 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550116-p5zpr" Mar 08 21:56:04 crc kubenswrapper[4885]: I0308 21:56:04.572899 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99zwj\" (UniqueName: \"kubernetes.io/projected/480855ed-5f7f-4fb4-99dc-ced66ce15999-kube-api-access-99zwj\") pod \"480855ed-5f7f-4fb4-99dc-ced66ce15999\" (UID: \"480855ed-5f7f-4fb4-99dc-ced66ce15999\") " Mar 08 21:56:04 crc kubenswrapper[4885]: I0308 21:56:04.579405 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/480855ed-5f7f-4fb4-99dc-ced66ce15999-kube-api-access-99zwj" (OuterVolumeSpecName: "kube-api-access-99zwj") pod "480855ed-5f7f-4fb4-99dc-ced66ce15999" (UID: "480855ed-5f7f-4fb4-99dc-ced66ce15999"). InnerVolumeSpecName "kube-api-access-99zwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:56:04 crc kubenswrapper[4885]: I0308 21:56:04.675298 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99zwj\" (UniqueName: \"kubernetes.io/projected/480855ed-5f7f-4fb4-99dc-ced66ce15999-kube-api-access-99zwj\") on node \"crc\" DevicePath \"\"" Mar 08 21:56:04 crc kubenswrapper[4885]: I0308 21:56:04.963151 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550116-p5zpr" Mar 08 21:56:04 crc kubenswrapper[4885]: I0308 21:56:04.963153 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550116-p5zpr" event={"ID":"480855ed-5f7f-4fb4-99dc-ced66ce15999","Type":"ContainerDied","Data":"676c3250378287b0099bd3737e48f719c1468510542ce234a8fb9d75ec65a1c6"} Mar 08 21:56:04 crc kubenswrapper[4885]: I0308 21:56:04.963619 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="676c3250378287b0099bd3737e48f719c1468510542ce234a8fb9d75ec65a1c6" Mar 08 21:56:05 crc kubenswrapper[4885]: I0308 21:56:05.532258 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550110-9ztjc"] Mar 08 21:56:05 crc kubenswrapper[4885]: I0308 21:56:05.543879 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550110-9ztjc"] Mar 08 21:56:07 crc kubenswrapper[4885]: I0308 21:56:07.387854 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baf0e32a-60d4-4a44-af91-6bbe65bc82c9" path="/var/lib/kubelet/pods/baf0e32a-60d4-4a44-af91-6bbe65bc82c9/volumes" Mar 08 21:56:14 crc kubenswrapper[4885]: I0308 21:56:14.174197 4885 scope.go:117] "RemoveContainer" containerID="6672cbe9c59e44b713aad8bb2c9fe671a86cd6c40abe9e893295200a4e793993" Mar 08 21:56:32 crc kubenswrapper[4885]: I0308 21:56:32.818674 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:56:32 crc kubenswrapper[4885]: I0308 21:56:32.819205 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:57:02 crc kubenswrapper[4885]: I0308 21:57:02.818392 4885 patch_prober.go:28] 
interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:57:02 crc kubenswrapper[4885]: I0308 21:57:02.819145 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:57:32 crc kubenswrapper[4885]: I0308 21:57:32.818951 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 21:57:32 crc kubenswrapper[4885]: I0308 21:57:32.819637 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 21:57:32 crc kubenswrapper[4885]: I0308 21:57:32.819704 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 21:57:32 crc kubenswrapper[4885]: I0308 21:57:32.820944 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22f99b9a0135e9d4a897236bc6328dc3a4eed4231aa01d92b86545edeafa5622"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 21:57:32 crc kubenswrapper[4885]: I0308 21:57:32.821020 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://22f99b9a0135e9d4a897236bc6328dc3a4eed4231aa01d92b86545edeafa5622" gracePeriod=600 Mar 08 21:57:33 crc kubenswrapper[4885]: I0308 21:57:33.555984 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="22f99b9a0135e9d4a897236bc6328dc3a4eed4231aa01d92b86545edeafa5622" exitCode=0 Mar 08 21:57:33 crc kubenswrapper[4885]: I0308 21:57:33.556522 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"22f99b9a0135e9d4a897236bc6328dc3a4eed4231aa01d92b86545edeafa5622"} Mar 08 21:57:33 crc kubenswrapper[4885]: I0308 21:57:33.556549 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae"} Mar 08 21:57:33 crc kubenswrapper[4885]: I0308 21:57:33.556582 4885 scope.go:117] "RemoveContainer" containerID="e992c0c93b7debec7839bad0636bcba1c677b8f626afaf7cf926ec0c7aabd732" Mar 08 
21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.148589 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550118-28jxb"] Mar 08 21:58:00 crc kubenswrapper[4885]: E0308 21:58:00.149562 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480855ed-5f7f-4fb4-99dc-ced66ce15999" containerName="oc" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.149574 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="480855ed-5f7f-4fb4-99dc-ced66ce15999" containerName="oc" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.149817 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="480855ed-5f7f-4fb4-99dc-ced66ce15999" containerName="oc" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.150525 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550118-28jxb" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.156398 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.157030 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.159299 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.164431 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550118-28jxb"] Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.272330 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7twms\" (UniqueName: \"kubernetes.io/projected/0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc-kube-api-access-7twms\") pod \"auto-csr-approver-29550118-28jxb\" (UID: \"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc\") " pod="openshift-infra/auto-csr-approver-29550118-28jxb" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.374050 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7twms\" (UniqueName: \"kubernetes.io/projected/0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc-kube-api-access-7twms\") pod \"auto-csr-approver-29550118-28jxb\" (UID: \"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc\") " pod="openshift-infra/auto-csr-approver-29550118-28jxb" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.392807 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7twms\" (UniqueName: \"kubernetes.io/projected/0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc-kube-api-access-7twms\") pod \"auto-csr-approver-29550118-28jxb\" (UID: \"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc\") " pod="openshift-infra/auto-csr-approver-29550118-28jxb" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.489625 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550118-28jxb" Mar 08 21:58:00 crc kubenswrapper[4885]: I0308 21:58:00.972870 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550118-28jxb"] Mar 08 21:58:01 crc kubenswrapper[4885]: I0308 21:58:01.156762 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550118-28jxb" event={"ID":"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc","Type":"ContainerStarted","Data":"e1bd39f573a525120c9814b24982db4cc3a012a96b2c42182904524b87a90731"} Mar 08 21:58:02 crc kubenswrapper[4885]: I0308 21:58:02.165876 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550118-28jxb" event={"ID":"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc","Type":"ContainerStarted","Data":"0d8b1adfd4f970a1d96597c5128035560d0e6296ee9788584fde3df1fcb99135"} Mar 08 21:58:02 crc kubenswrapper[4885]: I0308 21:58:02.187599 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550118-28jxb" podStartSLOduration=1.359045845 podStartE2EDuration="2.187581157s" podCreationTimestamp="2026-03-08 21:58:00 +0000 UTC" firstStartedPulling="2026-03-08 21:58:00.975839039 +0000 UTC m=+8782.371893062" lastFinishedPulling="2026-03-08 21:58:01.804374341 +0000 UTC m=+8783.200428374" observedRunningTime="2026-03-08 21:58:02.178658779 +0000 UTC m=+8783.574712802" watchObservedRunningTime="2026-03-08 21:58:02.187581157 +0000 UTC m=+8783.583635180" Mar 08 21:58:03 crc kubenswrapper[4885]: I0308 21:58:03.179210 4885 generic.go:334] "Generic (PLEG): container finished" podID="0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc" containerID="0d8b1adfd4f970a1d96597c5128035560d0e6296ee9788584fde3df1fcb99135" exitCode=0 Mar 08 21:58:03 crc kubenswrapper[4885]: I0308 21:58:03.179302 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550118-28jxb" event={"ID":"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc","Type":"ContainerDied","Data":"0d8b1adfd4f970a1d96597c5128035560d0e6296ee9788584fde3df1fcb99135"} Mar 08 21:58:04 crc kubenswrapper[4885]: I0308 21:58:04.609550 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550118-28jxb" Mar 08 21:58:04 crc kubenswrapper[4885]: I0308 21:58:04.793150 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7twms\" (UniqueName: \"kubernetes.io/projected/0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc-kube-api-access-7twms\") pod \"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc\" (UID: \"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc\") " Mar 08 21:58:04 crc kubenswrapper[4885]: I0308 21:58:04.800903 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc-kube-api-access-7twms" (OuterVolumeSpecName: "kube-api-access-7twms") pod "0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc" (UID: "0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc"). InnerVolumeSpecName "kube-api-access-7twms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:58:04 crc kubenswrapper[4885]: I0308 21:58:04.897375 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7twms\" (UniqueName: \"kubernetes.io/projected/0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc-kube-api-access-7twms\") on node \"crc\" DevicePath \"\"" Mar 08 21:58:05 crc kubenswrapper[4885]: I0308 21:58:05.225196 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550118-28jxb" event={"ID":"0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc","Type":"ContainerDied","Data":"e1bd39f573a525120c9814b24982db4cc3a012a96b2c42182904524b87a90731"} Mar 08 21:58:05 crc kubenswrapper[4885]: I0308 21:58:05.225255 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1bd39f573a525120c9814b24982db4cc3a012a96b2c42182904524b87a90731" Mar 08 21:58:05 crc kubenswrapper[4885]: I0308 21:58:05.225353 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550118-28jxb" Mar 08 21:58:05 crc kubenswrapper[4885]: I0308 21:58:05.266712 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550112-xxhhl"] Mar 08 21:58:05 crc kubenswrapper[4885]: I0308 21:58:05.280355 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550112-xxhhl"] Mar 08 21:58:05 crc kubenswrapper[4885]: I0308 21:58:05.387990 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aae4f06-bf3b-4963-92b4-9dfc6bb69621" path="/var/lib/kubelet/pods/2aae4f06-bf3b-4963-92b4-9dfc6bb69621/volumes" Mar 08 21:58:14 crc kubenswrapper[4885]: I0308 21:58:14.427682 4885 scope.go:117] "RemoveContainer" containerID="8d2a85311da28c593e02f61d9e21770d4b4946e346cfff8ec77956eb44cbfcfa" Mar 08 21:59:33 crc kubenswrapper[4885]: I0308 21:59:33.685213 4885 generic.go:334] "Generic (PLEG): container finished" podID="0ac2d268-855a-485e-a96f-87b5cc0e4f6e" containerID="5bd421a1b9eb744ee9eff61f628d2ad2fc68006fdecefae53f924670048d44f4" exitCode=0 Mar 08 21:59:33 crc kubenswrapper[4885]: I0308 21:59:33.685344 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" event={"ID":"0ac2d268-855a-485e-a96f-87b5cc0e4f6e","Type":"ContainerDied","Data":"5bd421a1b9eb744ee9eff61f628d2ad2fc68006fdecefae53f924670048d44f4"} Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.180917 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.354189 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-2\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.354808 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-0\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.354889 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-combined-ca-bundle\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.354949 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ceph\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.354968 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-inventory\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355010 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-0\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355037 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ssh-key-openstack-cell1\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355080 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-0\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355123 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwdl2\" (UniqueName: \"kubernetes.io/projected/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-kube-api-access-gwdl2\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355165 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-3\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355219 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-1\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355240 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-1\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.355274 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-1\") pod \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\" (UID: \"0ac2d268-855a-485e-a96f-87b5cc0e4f6e\") " Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.359771 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-kube-api-access-gwdl2" (OuterVolumeSpecName: "kube-api-access-gwdl2") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "kube-api-access-gwdl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.360041 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ceph" (OuterVolumeSpecName: "ceph") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.374431 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.392022 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.393161 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). 
InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.398024 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-inventory" (OuterVolumeSpecName: "inventory") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.405333 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.406062 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.412160 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-cells-global-config-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.418096 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.424725 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.431551 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.431908 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "0ac2d268-855a-485e-a96f-87b5cc0e4f6e" (UID: "0ac2d268-855a-485e-a96f-87b5cc0e4f6e"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458117 4885 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458174 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458195 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458214 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458233 4885 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458251 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458269 4885 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ceph\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458289 4885 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-inventory\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458324 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458342 4885 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458362 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: 
\"kubernetes.io/configmap/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458384 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwdl2\" (UniqueName: \"kubernetes.io/projected/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-kube-api-access-gwdl2\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.458403 4885 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/0ac2d268-855a-485e-a96f-87b5cc0e4f6e-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.712297 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" event={"ID":"0ac2d268-855a-485e-a96f-87b5cc0e4f6e","Type":"ContainerDied","Data":"bc1f3a241d4531503378c6a0aa816b34e9f9b27055e28fae7a7344fd598a60e2"} Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.712348 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx" Mar 08 21:59:35 crc kubenswrapper[4885]: I0308 21:59:35.712351 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc1f3a241d4531503378c6a0aa816b34e9f9b27055e28fae7a7344fd598a60e2" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.677423 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lzz29"] Mar 08 21:59:51 crc kubenswrapper[4885]: E0308 21:59:51.678800 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc" containerName="oc" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.678823 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc" containerName="oc" Mar 08 21:59:51 crc kubenswrapper[4885]: E0308 21:59:51.678877 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ac2d268-855a-485e-a96f-87b5cc0e4f6e" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.678892 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ac2d268-855a-485e-a96f-87b5cc0e4f6e" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.679353 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ac2d268-855a-485e-a96f-87b5cc0e4f6e" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.679420 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc" containerName="oc" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.683339 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.697418 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lzz29"] Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.741596 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-catalog-content\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.741661 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq8gv\" (UniqueName: \"kubernetes.io/projected/54512cac-df49-4ce5-aeea-b3f205d06de8-kube-api-access-zq8gv\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.741780 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-utilities\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.843691 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-catalog-content\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.843747 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq8gv\" (UniqueName: \"kubernetes.io/projected/54512cac-df49-4ce5-aeea-b3f205d06de8-kube-api-access-zq8gv\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.843868 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-utilities\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.844610 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-catalog-content\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.844618 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-utilities\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:51 crc kubenswrapper[4885]: I0308 21:59:51.864000 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zq8gv\" (UniqueName: \"kubernetes.io/projected/54512cac-df49-4ce5-aeea-b3f205d06de8-kube-api-access-zq8gv\") pod \"redhat-operators-lzz29\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:52 crc kubenswrapper[4885]: I0308 21:59:52.004296 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 21:59:52 crc kubenswrapper[4885]: I0308 21:59:52.498696 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lzz29"] Mar 08 21:59:52 crc kubenswrapper[4885]: I0308 21:59:52.924296 4885 generic.go:334] "Generic (PLEG): container finished" podID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerID="c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64" exitCode=0 Mar 08 21:59:52 crc kubenswrapper[4885]: I0308 21:59:52.924688 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzz29" event={"ID":"54512cac-df49-4ce5-aeea-b3f205d06de8","Type":"ContainerDied","Data":"c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64"} Mar 08 21:59:52 crc kubenswrapper[4885]: I0308 21:59:52.924717 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzz29" event={"ID":"54512cac-df49-4ce5-aeea-b3f205d06de8","Type":"ContainerStarted","Data":"99a2eed0975662340cabb29646db1c9fc839d1e4761a0358fe8a7e592a0111d3"} Mar 08 21:59:52 crc kubenswrapper[4885]: I0308 21:59:52.926749 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 21:59:53 crc kubenswrapper[4885]: I0308 21:59:53.939245 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzz29" event={"ID":"54512cac-df49-4ce5-aeea-b3f205d06de8","Type":"ContainerStarted","Data":"42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907"} Mar 08 21:59:56 crc kubenswrapper[4885]: I0308 21:59:56.976231 4885 generic.go:334] "Generic (PLEG): container finished" podID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerID="42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907" exitCode=0 Mar 08 21:59:56 crc kubenswrapper[4885]: I0308 21:59:56.976289 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzz29" event={"ID":"54512cac-df49-4ce5-aeea-b3f205d06de8","Type":"ContainerDied","Data":"42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907"} Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.008785 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzz29" event={"ID":"54512cac-df49-4ce5-aeea-b3f205d06de8","Type":"ContainerStarted","Data":"1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f"} Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.038795 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lzz29" podStartSLOduration=2.836654887 podStartE2EDuration="9.038777536s" podCreationTimestamp="2026-03-08 21:59:51 +0000 UTC" firstStartedPulling="2026-03-08 21:59:52.926565049 +0000 UTC m=+8894.322619062" lastFinishedPulling="2026-03-08 21:59:59.128687688 +0000 UTC m=+8900.524741711" observedRunningTime="2026-03-08 22:00:00.031211815 +0000 UTC m=+8901.427265838" watchObservedRunningTime="2026-03-08 22:00:00.038777536 +0000 UTC m=+8901.434831559" Mar 08 22:00:00 crc 
kubenswrapper[4885]: I0308 22:00:00.147448 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550120-gtf7r"] Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.149318 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550120-gtf7r" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.151249 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.151971 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.152155 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.157064 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4"] Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.158953 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.160161 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.160301 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.166180 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550120-gtf7r"] Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.175829 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4"] Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.235473 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06d5ced5-8334-4732-bc13-a8fbf2e27acf-config-volume\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.235533 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvbf5\" (UniqueName: \"kubernetes.io/projected/df80a103-8bbc-4c66-8995-05152b8b9b66-kube-api-access-rvbf5\") pod \"auto-csr-approver-29550120-gtf7r\" (UID: \"df80a103-8bbc-4c66-8995-05152b8b9b66\") " pod="openshift-infra/auto-csr-approver-29550120-gtf7r" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.235940 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06d5ced5-8334-4732-bc13-a8fbf2e27acf-secret-volume\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.236113 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r6df\" 
(UniqueName: \"kubernetes.io/projected/06d5ced5-8334-4732-bc13-a8fbf2e27acf-kube-api-access-8r6df\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.337663 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06d5ced5-8334-4732-bc13-a8fbf2e27acf-secret-volume\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.337743 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r6df\" (UniqueName: \"kubernetes.io/projected/06d5ced5-8334-4732-bc13-a8fbf2e27acf-kube-api-access-8r6df\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.337815 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06d5ced5-8334-4732-bc13-a8fbf2e27acf-config-volume\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.337833 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvbf5\" (UniqueName: \"kubernetes.io/projected/df80a103-8bbc-4c66-8995-05152b8b9b66-kube-api-access-rvbf5\") pod \"auto-csr-approver-29550120-gtf7r\" (UID: \"df80a103-8bbc-4c66-8995-05152b8b9b66\") " pod="openshift-infra/auto-csr-approver-29550120-gtf7r" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.339214 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06d5ced5-8334-4732-bc13-a8fbf2e27acf-config-volume\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.348217 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06d5ced5-8334-4732-bc13-a8fbf2e27acf-secret-volume\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.353887 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvbf5\" (UniqueName: \"kubernetes.io/projected/df80a103-8bbc-4c66-8995-05152b8b9b66-kube-api-access-rvbf5\") pod \"auto-csr-approver-29550120-gtf7r\" (UID: \"df80a103-8bbc-4c66-8995-05152b8b9b66\") " pod="openshift-infra/auto-csr-approver-29550120-gtf7r" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.358121 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r6df\" (UniqueName: \"kubernetes.io/projected/06d5ced5-8334-4732-bc13-a8fbf2e27acf-kube-api-access-8r6df\") pod \"collect-profiles-29550120-6trf4\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.472623 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550120-gtf7r" Mar 08 22:00:00 crc kubenswrapper[4885]: I0308 22:00:00.487059 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:00 crc kubenswrapper[4885]: W0308 22:00:00.990897 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf80a103_8bbc_4c66_8995_05152b8b9b66.slice/crio-e35ccf84af7c2c0e5d7f67c69db2c97e02265a533321330b7e17f1e79ade4639 WatchSource:0}: Error finding container e35ccf84af7c2c0e5d7f67c69db2c97e02265a533321330b7e17f1e79ade4639: Status 404 returned error can't find the container with id e35ccf84af7c2c0e5d7f67c69db2c97e02265a533321330b7e17f1e79ade4639 Mar 08 22:00:01 crc kubenswrapper[4885]: I0308 22:00:01.003380 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550120-gtf7r"] Mar 08 22:00:01 crc kubenswrapper[4885]: I0308 22:00:01.028114 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550120-gtf7r" event={"ID":"df80a103-8bbc-4c66-8995-05152b8b9b66","Type":"ContainerStarted","Data":"e35ccf84af7c2c0e5d7f67c69db2c97e02265a533321330b7e17f1e79ade4639"} Mar 08 22:00:01 crc kubenswrapper[4885]: I0308 22:00:01.079093 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4"] Mar 08 22:00:01 crc kubenswrapper[4885]: W0308 22:00:01.088469 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06d5ced5_8334_4732_bc13_a8fbf2e27acf.slice/crio-6159f686b0904ec30ae033bc7e402087f865ebba690482b1337ceaa95c0c1f2a WatchSource:0}: Error finding container 6159f686b0904ec30ae033bc7e402087f865ebba690482b1337ceaa95c0c1f2a: Status 404 returned error can't find the container with id 6159f686b0904ec30ae033bc7e402087f865ebba690482b1337ceaa95c0c1f2a Mar 08 22:00:02 crc kubenswrapper[4885]: I0308 22:00:02.004950 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 22:00:02 crc kubenswrapper[4885]: I0308 22:00:02.005199 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 22:00:02 crc kubenswrapper[4885]: I0308 22:00:02.040777 4885 generic.go:334] "Generic (PLEG): container finished" podID="06d5ced5-8334-4732-bc13-a8fbf2e27acf" containerID="8582316db402e0c93eabdefa470a1f0e233e7e0f8d71ac1dcb4dcae392602245" exitCode=0 Mar 08 22:00:02 crc kubenswrapper[4885]: I0308 22:00:02.040820 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" event={"ID":"06d5ced5-8334-4732-bc13-a8fbf2e27acf","Type":"ContainerDied","Data":"8582316db402e0c93eabdefa470a1f0e233e7e0f8d71ac1dcb4dcae392602245"} Mar 08 22:00:02 crc kubenswrapper[4885]: I0308 22:00:02.040846 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" 
event={"ID":"06d5ced5-8334-4732-bc13-a8fbf2e27acf","Type":"ContainerStarted","Data":"6159f686b0904ec30ae033bc7e402087f865ebba690482b1337ceaa95c0c1f2a"} Mar 08 22:00:02 crc kubenswrapper[4885]: I0308 22:00:02.818114 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:00:02 crc kubenswrapper[4885]: I0308 22:00:02.818489 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.063598 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lzz29" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="registry-server" probeResult="failure" output=< Mar 08 22:00:03 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 22:00:03 crc kubenswrapper[4885]: > Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.545341 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.633284 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06d5ced5-8334-4732-bc13-a8fbf2e27acf-config-volume\") pod \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.633441 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06d5ced5-8334-4732-bc13-a8fbf2e27acf-secret-volume\") pod \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.633510 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r6df\" (UniqueName: \"kubernetes.io/projected/06d5ced5-8334-4732-bc13-a8fbf2e27acf-kube-api-access-8r6df\") pod \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\" (UID: \"06d5ced5-8334-4732-bc13-a8fbf2e27acf\") " Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.634096 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06d5ced5-8334-4732-bc13-a8fbf2e27acf-config-volume" (OuterVolumeSpecName: "config-volume") pod "06d5ced5-8334-4732-bc13-a8fbf2e27acf" (UID: "06d5ced5-8334-4732-bc13-a8fbf2e27acf"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.635352 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06d5ced5-8334-4732-bc13-a8fbf2e27acf-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.640119 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06d5ced5-8334-4732-bc13-a8fbf2e27acf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06d5ced5-8334-4732-bc13-a8fbf2e27acf" (UID: "06d5ced5-8334-4732-bc13-a8fbf2e27acf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.641416 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06d5ced5-8334-4732-bc13-a8fbf2e27acf-kube-api-access-8r6df" (OuterVolumeSpecName: "kube-api-access-8r6df") pod "06d5ced5-8334-4732-bc13-a8fbf2e27acf" (UID: "06d5ced5-8334-4732-bc13-a8fbf2e27acf"). InnerVolumeSpecName "kube-api-access-8r6df". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.737063 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06d5ced5-8334-4732-bc13-a8fbf2e27acf-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 22:00:03 crc kubenswrapper[4885]: I0308 22:00:03.737098 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r6df\" (UniqueName: \"kubernetes.io/projected/06d5ced5-8334-4732-bc13-a8fbf2e27acf-kube-api-access-8r6df\") on node \"crc\" DevicePath \"\"" Mar 08 22:00:04 crc kubenswrapper[4885]: I0308 22:00:04.085654 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" event={"ID":"06d5ced5-8334-4732-bc13-a8fbf2e27acf","Type":"ContainerDied","Data":"6159f686b0904ec30ae033bc7e402087f865ebba690482b1337ceaa95c0c1f2a"} Mar 08 22:00:04 crc kubenswrapper[4885]: I0308 22:00:04.085906 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550120-6trf4" Mar 08 22:00:04 crc kubenswrapper[4885]: I0308 22:00:04.085940 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6159f686b0904ec30ae033bc7e402087f865ebba690482b1337ceaa95c0c1f2a" Mar 08 22:00:04 crc kubenswrapper[4885]: I0308 22:00:04.642710 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb"] Mar 08 22:00:04 crc kubenswrapper[4885]: I0308 22:00:04.653036 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550075-dnttb"] Mar 08 22:00:05 crc kubenswrapper[4885]: I0308 22:00:05.379592 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7dec2e5-804e-4bc5-99cc-370c31d352e0" path="/var/lib/kubelet/pods/e7dec2e5-804e-4bc5-99cc-370c31d352e0/volumes" Mar 08 22:00:07 crc kubenswrapper[4885]: I0308 22:00:07.119998 4885 generic.go:334] "Generic (PLEG): container finished" podID="df80a103-8bbc-4c66-8995-05152b8b9b66" containerID="c7a9de3d006a67b60be16dde31bd3a619a9734cffde62a18b4a5fd2544360347" exitCode=0 Mar 08 22:00:07 crc kubenswrapper[4885]: I0308 22:00:07.120058 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550120-gtf7r" event={"ID":"df80a103-8bbc-4c66-8995-05152b8b9b66","Type":"ContainerDied","Data":"c7a9de3d006a67b60be16dde31bd3a619a9734cffde62a18b4a5fd2544360347"} Mar 08 22:00:08 crc kubenswrapper[4885]: I0308 22:00:08.531203 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550120-gtf7r" Mar 08 22:00:08 crc kubenswrapper[4885]: I0308 22:00:08.658778 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvbf5\" (UniqueName: \"kubernetes.io/projected/df80a103-8bbc-4c66-8995-05152b8b9b66-kube-api-access-rvbf5\") pod \"df80a103-8bbc-4c66-8995-05152b8b9b66\" (UID: \"df80a103-8bbc-4c66-8995-05152b8b9b66\") " Mar 08 22:00:08 crc kubenswrapper[4885]: I0308 22:00:08.664811 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df80a103-8bbc-4c66-8995-05152b8b9b66-kube-api-access-rvbf5" (OuterVolumeSpecName: "kube-api-access-rvbf5") pod "df80a103-8bbc-4c66-8995-05152b8b9b66" (UID: "df80a103-8bbc-4c66-8995-05152b8b9b66"). InnerVolumeSpecName "kube-api-access-rvbf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:00:08 crc kubenswrapper[4885]: I0308 22:00:08.761491 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvbf5\" (UniqueName: \"kubernetes.io/projected/df80a103-8bbc-4c66-8995-05152b8b9b66-kube-api-access-rvbf5\") on node \"crc\" DevicePath \"\"" Mar 08 22:00:09 crc kubenswrapper[4885]: I0308 22:00:09.142904 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550120-gtf7r" event={"ID":"df80a103-8bbc-4c66-8995-05152b8b9b66","Type":"ContainerDied","Data":"e35ccf84af7c2c0e5d7f67c69db2c97e02265a533321330b7e17f1e79ade4639"} Mar 08 22:00:09 crc kubenswrapper[4885]: I0308 22:00:09.143401 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e35ccf84af7c2c0e5d7f67c69db2c97e02265a533321330b7e17f1e79ade4639" Mar 08 22:00:09 crc kubenswrapper[4885]: I0308 22:00:09.143031 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550120-gtf7r" Mar 08 22:00:09 crc kubenswrapper[4885]: I0308 22:00:09.604059 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550114-tw65c"] Mar 08 22:00:09 crc kubenswrapper[4885]: I0308 22:00:09.613692 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550114-tw65c"] Mar 08 22:00:11 crc kubenswrapper[4885]: I0308 22:00:11.385448 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28077c49-e447-4b53-ab0a-078b678e322e" path="/var/lib/kubelet/pods/28077c49-e447-4b53-ab0a-078b678e322e/volumes" Mar 08 22:00:12 crc kubenswrapper[4885]: I0308 22:00:12.081614 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 22:00:12 crc kubenswrapper[4885]: I0308 22:00:12.150388 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 22:00:12 crc kubenswrapper[4885]: I0308 22:00:12.337503 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lzz29"] Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.186200 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lzz29" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="registry-server" containerID="cri-o://1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f" gracePeriod=2 Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.709651 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.803944 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-utilities\") pod \"54512cac-df49-4ce5-aeea-b3f205d06de8\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.804012 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-catalog-content\") pod \"54512cac-df49-4ce5-aeea-b3f205d06de8\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.804137 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq8gv\" (UniqueName: \"kubernetes.io/projected/54512cac-df49-4ce5-aeea-b3f205d06de8-kube-api-access-zq8gv\") pod \"54512cac-df49-4ce5-aeea-b3f205d06de8\" (UID: \"54512cac-df49-4ce5-aeea-b3f205d06de8\") " Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.830529 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-utilities" (OuterVolumeSpecName: "utilities") pod "54512cac-df49-4ce5-aeea-b3f205d06de8" (UID: "54512cac-df49-4ce5-aeea-b3f205d06de8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.840171 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54512cac-df49-4ce5-aeea-b3f205d06de8-kube-api-access-zq8gv" (OuterVolumeSpecName: "kube-api-access-zq8gv") pod "54512cac-df49-4ce5-aeea-b3f205d06de8" (UID: "54512cac-df49-4ce5-aeea-b3f205d06de8"). InnerVolumeSpecName "kube-api-access-zq8gv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.906768 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.906814 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq8gv\" (UniqueName: \"kubernetes.io/projected/54512cac-df49-4ce5-aeea-b3f205d06de8-kube-api-access-zq8gv\") on node \"crc\" DevicePath \"\"" Mar 08 22:00:13 crc kubenswrapper[4885]: I0308 22:00:13.962521 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54512cac-df49-4ce5-aeea-b3f205d06de8" (UID: "54512cac-df49-4ce5-aeea-b3f205d06de8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.008707 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54512cac-df49-4ce5-aeea-b3f205d06de8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.199583 4885 generic.go:334] "Generic (PLEG): container finished" podID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerID="1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f" exitCode=0 Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.199649 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzz29" event={"ID":"54512cac-df49-4ce5-aeea-b3f205d06de8","Type":"ContainerDied","Data":"1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f"} Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.199676 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lzz29" Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.200060 4885 scope.go:117] "RemoveContainer" containerID="1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f" Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.200038 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzz29" event={"ID":"54512cac-df49-4ce5-aeea-b3f205d06de8","Type":"ContainerDied","Data":"99a2eed0975662340cabb29646db1c9fc839d1e4761a0358fe8a7e592a0111d3"} Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.233883 4885 scope.go:117] "RemoveContainer" containerID="42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907" Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.267713 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lzz29"] Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.290986 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lzz29"] Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.306139 4885 scope.go:117] "RemoveContainer" containerID="c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64" Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.332904 4885 scope.go:117] "RemoveContainer" containerID="1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f" Mar 08 22:00:14 crc kubenswrapper[4885]: E0308 22:00:14.333688 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f\": container with ID starting with 1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f not found: ID does not exist" containerID="1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f" Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.333736 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f"} err="failed to get container status \"1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f\": rpc error: code = NotFound desc = could not find container \"1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f\": container with ID starting with 1512ae7ac88cfb06e5349e37d3befa23ef3e2b2d8ecb0d033b81e86b84ae8d3f not found: ID does not exist" Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.333769 4885 scope.go:117] "RemoveContainer" containerID="42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907" Mar 08 22:00:14 crc kubenswrapper[4885]: E0308 22:00:14.334176 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907\": container with ID starting with 42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907 not found: ID does not exist" containerID="42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907" Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.334236 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907"} err="failed to get container status \"42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907\": rpc error: code = NotFound desc = could not find container 
\"42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907\": container with ID starting with 42c4cdc6b2c740fc17f150f0a918c442e3d3432f4dbece9fd827c26d59773907 not found: ID does not exist" Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.334274 4885 scope.go:117] "RemoveContainer" containerID="c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64" Mar 08 22:00:14 crc kubenswrapper[4885]: E0308 22:00:14.334837 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64\": container with ID starting with c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64 not found: ID does not exist" containerID="c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64" Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.334873 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64"} err="failed to get container status \"c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64\": rpc error: code = NotFound desc = could not find container \"c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64\": container with ID starting with c4d0f7af75505ed945c865bf1fcde00a5673dfbb12f66eca4b842c226ed1da64 not found: ID does not exist" Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.548563 4885 scope.go:117] "RemoveContainer" containerID="b737af02da1f4abbd613f83608c2ed474264bbee77babc5263321db11c1a06ed" Mar 08 22:00:14 crc kubenswrapper[4885]: I0308 22:00:14.610689 4885 scope.go:117] "RemoveContainer" containerID="3ee4d3c132930646f693aee747f5e8b449d0c9e50fb9f8986810b596ef2d993d" Mar 08 22:00:15 crc kubenswrapper[4885]: I0308 22:00:15.389663 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" path="/var/lib/kubelet/pods/54512cac-df49-4ce5-aeea-b3f205d06de8/volumes" Mar 08 22:00:32 crc kubenswrapper[4885]: I0308 22:00:32.818439 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:00:32 crc kubenswrapper[4885]: I0308 22:00:32.819141 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.158036 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29550121-m29ck"] Mar 08 22:01:00 crc kubenswrapper[4885]: E0308 22:01:00.159149 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="extract-content" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159163 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="extract-content" Mar 08 22:01:00 crc kubenswrapper[4885]: E0308 22:01:00.159180 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06d5ced5-8334-4732-bc13-a8fbf2e27acf" containerName="collect-profiles" 
Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159188 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="06d5ced5-8334-4732-bc13-a8fbf2e27acf" containerName="collect-profiles" Mar 08 22:01:00 crc kubenswrapper[4885]: E0308 22:01:00.159203 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="extract-utilities" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159210 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="extract-utilities" Mar 08 22:01:00 crc kubenswrapper[4885]: E0308 22:01:00.159220 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="registry-server" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159226 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="registry-server" Mar 08 22:01:00 crc kubenswrapper[4885]: E0308 22:01:00.159256 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df80a103-8bbc-4c66-8995-05152b8b9b66" containerName="oc" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159262 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="df80a103-8bbc-4c66-8995-05152b8b9b66" containerName="oc" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159443 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="06d5ced5-8334-4732-bc13-a8fbf2e27acf" containerName="collect-profiles" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159454 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="54512cac-df49-4ce5-aeea-b3f205d06de8" containerName="registry-server" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.159461 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="df80a103-8bbc-4c66-8995-05152b8b9b66" containerName="oc" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.160204 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.176755 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29550121-m29ck"] Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.358270 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-combined-ca-bundle\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.358816 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-fernet-keys\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.358912 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvrfw\" (UniqueName: \"kubernetes.io/projected/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-kube-api-access-vvrfw\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.359034 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-config-data\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.461410 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-combined-ca-bundle\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.461597 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-fernet-keys\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.461656 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvrfw\" (UniqueName: \"kubernetes.io/projected/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-kube-api-access-vvrfw\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.461733 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-config-data\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.475985 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-fernet-keys\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.476020 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-config-data\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.480792 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-combined-ca-bundle\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.499339 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvrfw\" (UniqueName: \"kubernetes.io/projected/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-kube-api-access-vvrfw\") pod \"keystone-cron-29550121-m29ck\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.515826 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:00 crc kubenswrapper[4885]: I0308 22:01:00.833579 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29550121-m29ck"] Mar 08 22:01:01 crc kubenswrapper[4885]: I0308 22:01:01.802400 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550121-m29ck" event={"ID":"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd","Type":"ContainerStarted","Data":"5045186789d21eeaf5b33dd4d5307b17ab20d55f22a9c0e57932c33279bc15fd"} Mar 08 22:01:01 crc kubenswrapper[4885]: I0308 22:01:01.802749 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550121-m29ck" event={"ID":"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd","Type":"ContainerStarted","Data":"0476e0ea6b5ca5d0bbfbb66107bcbce455f731f5f92931010490582f516c9879"} Mar 08 22:01:01 crc kubenswrapper[4885]: I0308 22:01:01.836320 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29550121-m29ck" podStartSLOduration=1.8362786789999999 podStartE2EDuration="1.836278679s" podCreationTimestamp="2026-03-08 22:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 22:01:01.824777191 +0000 UTC m=+8963.220831284" watchObservedRunningTime="2026-03-08 22:01:01.836278679 +0000 UTC m=+8963.232332742" Mar 08 22:01:02 crc kubenswrapper[4885]: I0308 22:01:02.817876 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:01:02 crc kubenswrapper[4885]: I0308 22:01:02.818129 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:01:02 crc kubenswrapper[4885]: I0308 22:01:02.818163 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 22:01:02 crc kubenswrapper[4885]: I0308 22:01:02.818593 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 22:01:02 crc kubenswrapper[4885]: I0308 22:01:02.818633 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" gracePeriod=600 Mar 08 22:01:02 crc kubenswrapper[4885]: E0308 22:01:02.956004 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:01:03 crc kubenswrapper[4885]: I0308 22:01:03.823540 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" exitCode=0 Mar 08 22:01:03 crc kubenswrapper[4885]: I0308 22:01:03.823683 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae"} Mar 08 22:01:03 crc kubenswrapper[4885]: I0308 22:01:03.823876 4885 scope.go:117] "RemoveContainer" containerID="22f99b9a0135e9d4a897236bc6328dc3a4eed4231aa01d92b86545edeafa5622" Mar 08 22:01:03 crc kubenswrapper[4885]: I0308 22:01:03.824617 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:01:03 crc kubenswrapper[4885]: E0308 22:01:03.825165 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:01:04 crc kubenswrapper[4885]: E0308 22:01:04.033171 4885 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.80:41494->38.102.83.80:33667: write tcp 38.102.83.80:41494->38.102.83.80:33667: write: broken pipe Mar 08 22:01:04 crc kubenswrapper[4885]: I0308 22:01:04.835039 4885 generic.go:334] "Generic (PLEG): container finished" podID="8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" 
containerID="5045186789d21eeaf5b33dd4d5307b17ab20d55f22a9c0e57932c33279bc15fd" exitCode=0 Mar 08 22:01:04 crc kubenswrapper[4885]: I0308 22:01:04.835072 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550121-m29ck" event={"ID":"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd","Type":"ContainerDied","Data":"5045186789d21eeaf5b33dd4d5307b17ab20d55f22a9c0e57932c33279bc15fd"} Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.335544 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.396650 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvrfw\" (UniqueName: \"kubernetes.io/projected/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-kube-api-access-vvrfw\") pod \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.396980 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-fernet-keys\") pod \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.397054 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-config-data\") pod \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.397084 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-combined-ca-bundle\") pod \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\" (UID: \"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd\") " Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.404146 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-kube-api-access-vvrfw" (OuterVolumeSpecName: "kube-api-access-vvrfw") pod "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" (UID: "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd"). InnerVolumeSpecName "kube-api-access-vvrfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.405380 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" (UID: "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.435836 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" (UID: "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.467268 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-config-data" (OuterVolumeSpecName: "config-data") pod "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" (UID: "8bd4a79d-8d75-4e13-8eee-cc51925ca7fd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.499600 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvrfw\" (UniqueName: \"kubernetes.io/projected/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-kube-api-access-vvrfw\") on node \"crc\" DevicePath \"\"" Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.499632 4885 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.499643 4885 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-config-data\") on node \"crc\" DevicePath \"\"" Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.499650 4885 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8bd4a79d-8d75-4e13-8eee-cc51925ca7fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.863653 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29550121-m29ck" event={"ID":"8bd4a79d-8d75-4e13-8eee-cc51925ca7fd","Type":"ContainerDied","Data":"0476e0ea6b5ca5d0bbfbb66107bcbce455f731f5f92931010490582f516c9879"} Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.863705 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0476e0ea6b5ca5d0bbfbb66107bcbce455f731f5f92931010490582f516c9879" Mar 08 22:01:06 crc kubenswrapper[4885]: I0308 22:01:06.863779 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29550121-m29ck" Mar 08 22:01:18 crc kubenswrapper[4885]: I0308 22:01:18.368697 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:01:18 crc kubenswrapper[4885]: E0308 22:01:18.369797 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:01:29 crc kubenswrapper[4885]: I0308 22:01:29.382262 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:01:29 crc kubenswrapper[4885]: E0308 22:01:29.383178 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:01:44 crc kubenswrapper[4885]: I0308 22:01:44.392524 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:01:44 crc kubenswrapper[4885]: E0308 22:01:44.393400 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:01:58 crc kubenswrapper[4885]: I0308 22:01:58.369536 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:01:58 crc kubenswrapper[4885]: E0308 22:01:58.370377 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.153732 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550122-qd8zm"] Mar 08 22:02:00 crc kubenswrapper[4885]: E0308 22:02:00.154522 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" containerName="keystone-cron" Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.154535 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" containerName="keystone-cron" Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.154759 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bd4a79d-8d75-4e13-8eee-cc51925ca7fd" containerName="keystone-cron" Mar 08 22:02:00 crc 
kubenswrapper[4885]: I0308 22:02:00.155551 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550122-qd8zm" Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.158684 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.160216 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.160904 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.166640 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550122-qd8zm"] Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.306227 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2zzn\" (UniqueName: \"kubernetes.io/projected/9fd3f7a8-4264-431e-b87b-9a60f7133767-kube-api-access-t2zzn\") pod \"auto-csr-approver-29550122-qd8zm\" (UID: \"9fd3f7a8-4264-431e-b87b-9a60f7133767\") " pod="openshift-infra/auto-csr-approver-29550122-qd8zm" Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.409856 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2zzn\" (UniqueName: \"kubernetes.io/projected/9fd3f7a8-4264-431e-b87b-9a60f7133767-kube-api-access-t2zzn\") pod \"auto-csr-approver-29550122-qd8zm\" (UID: \"9fd3f7a8-4264-431e-b87b-9a60f7133767\") " pod="openshift-infra/auto-csr-approver-29550122-qd8zm" Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.446895 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2zzn\" (UniqueName: \"kubernetes.io/projected/9fd3f7a8-4264-431e-b87b-9a60f7133767-kube-api-access-t2zzn\") pod \"auto-csr-approver-29550122-qd8zm\" (UID: \"9fd3f7a8-4264-431e-b87b-9a60f7133767\") " pod="openshift-infra/auto-csr-approver-29550122-qd8zm" Mar 08 22:02:00 crc kubenswrapper[4885]: I0308 22:02:00.508282 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550122-qd8zm" Mar 08 22:02:01 crc kubenswrapper[4885]: W0308 22:02:01.015615 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fd3f7a8_4264_431e_b87b_9a60f7133767.slice/crio-f6dc2a2ff0e0e4eec035305e98ad89d453ded40b2ea3878de7b66f383e34b8c3 WatchSource:0}: Error finding container f6dc2a2ff0e0e4eec035305e98ad89d453ded40b2ea3878de7b66f383e34b8c3: Status 404 returned error can't find the container with id f6dc2a2ff0e0e4eec035305e98ad89d453ded40b2ea3878de7b66f383e34b8c3 Mar 08 22:02:01 crc kubenswrapper[4885]: I0308 22:02:01.019258 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550122-qd8zm"] Mar 08 22:02:01 crc kubenswrapper[4885]: I0308 22:02:01.663032 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550122-qd8zm" event={"ID":"9fd3f7a8-4264-431e-b87b-9a60f7133767","Type":"ContainerStarted","Data":"f6dc2a2ff0e0e4eec035305e98ad89d453ded40b2ea3878de7b66f383e34b8c3"} Mar 08 22:02:02 crc kubenswrapper[4885]: I0308 22:02:02.683031 4885 generic.go:334] "Generic (PLEG): container finished" podID="9fd3f7a8-4264-431e-b87b-9a60f7133767" containerID="585a20322b6f004269b069c692b948e45ca9aa16a182a4437e57d97dc9bea430" exitCode=0 Mar 08 22:02:02 crc kubenswrapper[4885]: I0308 22:02:02.683108 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550122-qd8zm" event={"ID":"9fd3f7a8-4264-431e-b87b-9a60f7133767","Type":"ContainerDied","Data":"585a20322b6f004269b069c692b948e45ca9aa16a182a4437e57d97dc9bea430"} Mar 08 22:02:04 crc kubenswrapper[4885]: I0308 22:02:04.154857 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550122-qd8zm" Mar 08 22:02:04 crc kubenswrapper[4885]: I0308 22:02:04.300871 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2zzn\" (UniqueName: \"kubernetes.io/projected/9fd3f7a8-4264-431e-b87b-9a60f7133767-kube-api-access-t2zzn\") pod \"9fd3f7a8-4264-431e-b87b-9a60f7133767\" (UID: \"9fd3f7a8-4264-431e-b87b-9a60f7133767\") " Mar 08 22:02:04 crc kubenswrapper[4885]: I0308 22:02:04.309162 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fd3f7a8-4264-431e-b87b-9a60f7133767-kube-api-access-t2zzn" (OuterVolumeSpecName: "kube-api-access-t2zzn") pod "9fd3f7a8-4264-431e-b87b-9a60f7133767" (UID: "9fd3f7a8-4264-431e-b87b-9a60f7133767"). InnerVolumeSpecName "kube-api-access-t2zzn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:02:04 crc kubenswrapper[4885]: I0308 22:02:04.403704 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2zzn\" (UniqueName: \"kubernetes.io/projected/9fd3f7a8-4264-431e-b87b-9a60f7133767-kube-api-access-t2zzn\") on node \"crc\" DevicePath \"\"" Mar 08 22:02:04 crc kubenswrapper[4885]: I0308 22:02:04.707374 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550122-qd8zm" event={"ID":"9fd3f7a8-4264-431e-b87b-9a60f7133767","Type":"ContainerDied","Data":"f6dc2a2ff0e0e4eec035305e98ad89d453ded40b2ea3878de7b66f383e34b8c3"} Mar 08 22:02:04 crc kubenswrapper[4885]: I0308 22:02:04.707427 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6dc2a2ff0e0e4eec035305e98ad89d453ded40b2ea3878de7b66f383e34b8c3" Mar 08 22:02:04 crc kubenswrapper[4885]: I0308 22:02:04.707485 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550122-qd8zm" Mar 08 22:02:05 crc kubenswrapper[4885]: I0308 22:02:05.297747 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550116-p5zpr"] Mar 08 22:02:05 crc kubenswrapper[4885]: I0308 22:02:05.305893 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550116-p5zpr"] Mar 08 22:02:05 crc kubenswrapper[4885]: I0308 22:02:05.389091 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="480855ed-5f7f-4fb4-99dc-ced66ce15999" path="/var/lib/kubelet/pods/480855ed-5f7f-4fb4-99dc-ced66ce15999/volumes" Mar 08 22:02:13 crc kubenswrapper[4885]: I0308 22:02:13.373557 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:02:13 crc kubenswrapper[4885]: E0308 22:02:13.374451 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:02:14 crc kubenswrapper[4885]: I0308 22:02:14.761537 4885 scope.go:117] "RemoveContainer" containerID="6341a20e412f738fa67c5354da928321dc5ae4ed993b46a3d4ae33371480585f" Mar 08 22:02:15 crc kubenswrapper[4885]: E0308 22:02:15.059579 4885 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.80:58736->38.102.83.80:33667: write tcp 38.102.83.80:58736->38.102.83.80:33667: write: broken pipe Mar 08 22:02:24 crc kubenswrapper[4885]: I0308 22:02:24.340309 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Mar 08 22:02:24 crc kubenswrapper[4885]: I0308 22:02:24.341456 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" podUID="1a10ccbd-e30c-478f-84a4-c869a8cd0924" containerName="adoption" containerID="cri-o://ece949f19629c28be550e718314e58a7cadf1e2c5464306896ed9854368d9a69" gracePeriod=30 Mar 08 22:02:27 crc kubenswrapper[4885]: I0308 22:02:27.370801 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:02:27 crc kubenswrapper[4885]: E0308 22:02:27.372390 4885 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:02:40 crc kubenswrapper[4885]: I0308 22:02:40.369404 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:02:40 crc kubenswrapper[4885]: E0308 22:02:40.370391 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:02:52 crc kubenswrapper[4885]: I0308 22:02:52.369165 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:02:52 crc kubenswrapper[4885]: E0308 22:02:52.370251 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:02:54 crc kubenswrapper[4885]: I0308 22:02:54.430866 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"1a10ccbd-e30c-478f-84a4-c869a8cd0924","Type":"ContainerDied","Data":"ece949f19629c28be550e718314e58a7cadf1e2c5464306896ed9854368d9a69"} Mar 08 22:02:54 crc kubenswrapper[4885]: I0308 22:02:54.430732 4885 generic.go:334] "Generic (PLEG): container finished" podID="1a10ccbd-e30c-478f-84a4-c869a8cd0924" containerID="ece949f19629c28be550e718314e58a7cadf1e2c5464306896ed9854368d9a69" exitCode=137 Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.007413 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.069357 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scrbk\" (UniqueName: \"kubernetes.io/projected/1a10ccbd-e30c-478f-84a4-c869a8cd0924-kube-api-access-scrbk\") pod \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.089741 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a10ccbd-e30c-478f-84a4-c869a8cd0924-kube-api-access-scrbk" (OuterVolumeSpecName: "kube-api-access-scrbk") pod "1a10ccbd-e30c-478f-84a4-c869a8cd0924" (UID: "1a10ccbd-e30c-478f-84a4-c869a8cd0924"). InnerVolumeSpecName "kube-api-access-scrbk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.171652 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\") pod \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\" (UID: \"1a10ccbd-e30c-478f-84a4-c869a8cd0924\") " Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.172650 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scrbk\" (UniqueName: \"kubernetes.io/projected/1a10ccbd-e30c-478f-84a4-c869a8cd0924-kube-api-access-scrbk\") on node \"crc\" DevicePath \"\"" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.195250 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019" (OuterVolumeSpecName: "mariadb-data") pod "1a10ccbd-e30c-478f-84a4-c869a8cd0924" (UID: "1a10ccbd-e30c-478f-84a4-c869a8cd0924"). InnerVolumeSpecName "pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.275468 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\") on node \"crc\" " Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.305996 4885 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.306165 4885 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019") on node "crc" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.378274 4885 reconciler_common.go:293] "Volume detached for volume \"pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d66ed415-bfd4-4b9a-a14d-dcfe425bf019\") on node \"crc\" DevicePath \"\"" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.444269 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"1a10ccbd-e30c-478f-84a4-c869a8cd0924","Type":"ContainerDied","Data":"035005254ed1ada4ad851bbbcd646012f92c499c388cf96db3c6e6fbc8bfb688"} Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.444320 4885 scope.go:117] "RemoveContainer" containerID="ece949f19629c28be550e718314e58a7cadf1e2c5464306896ed9854368d9a69" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.444388 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.470971 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Mar 08 22:02:55 crc kubenswrapper[4885]: I0308 22:02:55.483693 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Mar 08 22:02:56 crc kubenswrapper[4885]: I0308 22:02:56.178253 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Mar 08 22:02:56 crc kubenswrapper[4885]: I0308 22:02:56.178516 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="a086771f-d0fc-4265-b8ba-a414a7f6c7d0" containerName="adoption" containerID="cri-o://fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19" gracePeriod=30 Mar 08 22:02:57 crc kubenswrapper[4885]: I0308 22:02:57.385021 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a10ccbd-e30c-478f-84a4-c869a8cd0924" path="/var/lib/kubelet/pods/1a10ccbd-e30c-478f-84a4-c869a8cd0924/volumes" Mar 08 22:03:06 crc kubenswrapper[4885]: I0308 22:03:06.368825 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:03:06 crc kubenswrapper[4885]: E0308 22:03:06.369642 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:03:20 crc kubenswrapper[4885]: I0308 22:03:20.368700 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:03:20 crc kubenswrapper[4885]: E0308 22:03:20.369705 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.736201 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.840030 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\") pod \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.840124 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-ovn-data-cert\") pod \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.840145 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmb5r\" (UniqueName: \"kubernetes.io/projected/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-kube-api-access-fmb5r\") pod \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\" (UID: \"a086771f-d0fc-4265-b8ba-a414a7f6c7d0\") " Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.845433 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-kube-api-access-fmb5r" (OuterVolumeSpecName: "kube-api-access-fmb5r") pod "a086771f-d0fc-4265-b8ba-a414a7f6c7d0" (UID: "a086771f-d0fc-4265-b8ba-a414a7f6c7d0"). InnerVolumeSpecName "kube-api-access-fmb5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.847047 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "a086771f-d0fc-4265-b8ba-a414a7f6c7d0" (UID: "a086771f-d0fc-4265-b8ba-a414a7f6c7d0"). InnerVolumeSpecName "ovn-data-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.856452 4885 generic.go:334] "Generic (PLEG): container finished" podID="a086771f-d0fc-4265-b8ba-a414a7f6c7d0" containerID="fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19" exitCode=137 Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.856504 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a086771f-d0fc-4265-b8ba-a414a7f6c7d0","Type":"ContainerDied","Data":"fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19"} Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.856533 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"a086771f-d0fc-4265-b8ba-a414a7f6c7d0","Type":"ContainerDied","Data":"2b2a6f955da79537fe6939ae44e2e8e65e67c9ab78da8f292a75babe5150e678"} Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.856553 4885 scope.go:117] "RemoveContainer" containerID="fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.856559 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.864007 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f" (OuterVolumeSpecName: "ovn-data") pod "a086771f-d0fc-4265-b8ba-a414a7f6c7d0" (UID: "a086771f-d0fc-4265-b8ba-a414a7f6c7d0"). InnerVolumeSpecName "pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.942880 4885 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\") on node \"crc\" " Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.942912 4885 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.942940 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmb5r\" (UniqueName: \"kubernetes.io/projected/a086771f-d0fc-4265-b8ba-a414a7f6c7d0-kube-api-access-fmb5r\") on node \"crc\" DevicePath \"\"" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.964212 4885 scope.go:117] "RemoveContainer" containerID="fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19" Mar 08 22:03:26 crc kubenswrapper[4885]: E0308 22:03:26.964521 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19\": container with ID starting with fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19 not found: ID does not exist" containerID="fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.964558 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19"} err="failed to get container status \"fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19\": rpc error: code = NotFound desc = could not find container \"fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19\": container with ID starting with fadf6c7b9fc2b504fa1457f69fc40e6610b3cb2c4cdd26149fcfd874698cab19 not found: ID does not exist" Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.984284 4885 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 08 22:03:26 crc kubenswrapper[4885]: I0308 22:03:26.984504 4885 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f") on node "crc" Mar 08 22:03:27 crc kubenswrapper[4885]: I0308 22:03:27.044519 4885 reconciler_common.go:293] "Volume detached for volume \"pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31acecd1-e16d-4517-aa04-ae2c57b0518f\") on node \"crc\" DevicePath \"\"" Mar 08 22:03:27 crc kubenswrapper[4885]: I0308 22:03:27.199652 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Mar 08 22:03:27 crc kubenswrapper[4885]: I0308 22:03:27.214082 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Mar 08 22:03:27 crc kubenswrapper[4885]: I0308 22:03:27.411227 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a086771f-d0fc-4265-b8ba-a414a7f6c7d0" path="/var/lib/kubelet/pods/a086771f-d0fc-4265-b8ba-a414a7f6c7d0/volumes" Mar 08 22:03:35 crc kubenswrapper[4885]: I0308 22:03:35.369111 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:03:35 crc kubenswrapper[4885]: E0308 22:03:35.370384 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.746427 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-66f54"] Mar 08 22:03:43 crc kubenswrapper[4885]: E0308 22:03:43.747758 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a10ccbd-e30c-478f-84a4-c869a8cd0924" containerName="adoption" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.747776 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a10ccbd-e30c-478f-84a4-c869a8cd0924" containerName="adoption" Mar 08 22:03:43 crc kubenswrapper[4885]: E0308 22:03:43.747795 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a086771f-d0fc-4265-b8ba-a414a7f6c7d0" containerName="adoption" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.747805 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="a086771f-d0fc-4265-b8ba-a414a7f6c7d0" containerName="adoption" Mar 08 22:03:43 crc kubenswrapper[4885]: E0308 22:03:43.747824 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fd3f7a8-4264-431e-b87b-9a60f7133767" containerName="oc" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.747832 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd3f7a8-4264-431e-b87b-9a60f7133767" containerName="oc" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.748146 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="a086771f-d0fc-4265-b8ba-a414a7f6c7d0" containerName="adoption" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.748171 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fd3f7a8-4264-431e-b87b-9a60f7133767" containerName="oc" Mar 08 22:03:43 crc 
kubenswrapper[4885]: I0308 22:03:43.748195 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a10ccbd-e30c-478f-84a4-c869a8cd0924" containerName="adoption" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.750274 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.781940 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66f54"] Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.874176 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-utilities\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.874439 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4tj4\" (UniqueName: \"kubernetes.io/projected/669459aa-3d2d-4661-9b8f-61559e8ddd40-kube-api-access-x4tj4\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.874670 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-catalog-content\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.977418 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4tj4\" (UniqueName: \"kubernetes.io/projected/669459aa-3d2d-4661-9b8f-61559e8ddd40-kube-api-access-x4tj4\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.977570 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-catalog-content\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.977707 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-utilities\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.978297 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-catalog-content\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:43 crc kubenswrapper[4885]: I0308 22:03:43.978312 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-utilities\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:44 crc kubenswrapper[4885]: I0308 22:03:44.002726 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4tj4\" (UniqueName: \"kubernetes.io/projected/669459aa-3d2d-4661-9b8f-61559e8ddd40-kube-api-access-x4tj4\") pod \"redhat-marketplace-66f54\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:44 crc kubenswrapper[4885]: I0308 22:03:44.097657 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:44 crc kubenswrapper[4885]: I0308 22:03:44.604398 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-66f54"] Mar 08 22:03:44 crc kubenswrapper[4885]: W0308 22:03:44.627996 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod669459aa_3d2d_4661_9b8f_61559e8ddd40.slice/crio-d89c93f64baeef2f098f805a0a92789e655f00e6b9b4104e174e019efa3c92c9 WatchSource:0}: Error finding container d89c93f64baeef2f098f805a0a92789e655f00e6b9b4104e174e019efa3c92c9: Status 404 returned error can't find the container with id d89c93f64baeef2f098f805a0a92789e655f00e6b9b4104e174e019efa3c92c9 Mar 08 22:03:45 crc kubenswrapper[4885]: I0308 22:03:45.077176 4885 generic.go:334] "Generic (PLEG): container finished" podID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerID="6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e" exitCode=0 Mar 08 22:03:45 crc kubenswrapper[4885]: I0308 22:03:45.077243 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66f54" event={"ID":"669459aa-3d2d-4661-9b8f-61559e8ddd40","Type":"ContainerDied","Data":"6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e"} Mar 08 22:03:45 crc kubenswrapper[4885]: I0308 22:03:45.077635 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66f54" event={"ID":"669459aa-3d2d-4661-9b8f-61559e8ddd40","Type":"ContainerStarted","Data":"d89c93f64baeef2f098f805a0a92789e655f00e6b9b4104e174e019efa3c92c9"} Mar 08 22:03:46 crc kubenswrapper[4885]: I0308 22:03:46.097164 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66f54" event={"ID":"669459aa-3d2d-4661-9b8f-61559e8ddd40","Type":"ContainerStarted","Data":"d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6"} Mar 08 22:03:47 crc kubenswrapper[4885]: I0308 22:03:47.108660 4885 generic.go:334] "Generic (PLEG): container finished" podID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerID="d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6" exitCode=0 Mar 08 22:03:47 crc kubenswrapper[4885]: I0308 22:03:47.108729 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66f54" event={"ID":"669459aa-3d2d-4661-9b8f-61559e8ddd40","Type":"ContainerDied","Data":"d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6"} Mar 08 22:03:48 crc kubenswrapper[4885]: I0308 22:03:48.123219 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66f54" 
event={"ID":"669459aa-3d2d-4661-9b8f-61559e8ddd40","Type":"ContainerStarted","Data":"d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393"} Mar 08 22:03:48 crc kubenswrapper[4885]: I0308 22:03:48.144768 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-66f54" podStartSLOduration=2.659669096 podStartE2EDuration="5.144749667s" podCreationTimestamp="2026-03-08 22:03:43 +0000 UTC" firstStartedPulling="2026-03-08 22:03:45.079481603 +0000 UTC m=+9126.475535626" lastFinishedPulling="2026-03-08 22:03:47.564562164 +0000 UTC m=+9128.960616197" observedRunningTime="2026-03-08 22:03:48.14184135 +0000 UTC m=+9129.537895373" watchObservedRunningTime="2026-03-08 22:03:48.144749667 +0000 UTC m=+9129.540803680" Mar 08 22:03:49 crc kubenswrapper[4885]: I0308 22:03:49.375024 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:03:49 crc kubenswrapper[4885]: E0308 22:03:49.375627 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:03:54 crc kubenswrapper[4885]: I0308 22:03:54.098544 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:54 crc kubenswrapper[4885]: I0308 22:03:54.099109 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:54 crc kubenswrapper[4885]: I0308 22:03:54.192954 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:54 crc kubenswrapper[4885]: I0308 22:03:54.291054 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:54 crc kubenswrapper[4885]: I0308 22:03:54.441643 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66f54"] Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.232216 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-66f54" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="registry-server" containerID="cri-o://d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393" gracePeriod=2 Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.748706 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.888462 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-utilities\") pod \"669459aa-3d2d-4661-9b8f-61559e8ddd40\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.888533 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-catalog-content\") pod \"669459aa-3d2d-4661-9b8f-61559e8ddd40\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.888568 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4tj4\" (UniqueName: \"kubernetes.io/projected/669459aa-3d2d-4661-9b8f-61559e8ddd40-kube-api-access-x4tj4\") pod \"669459aa-3d2d-4661-9b8f-61559e8ddd40\" (UID: \"669459aa-3d2d-4661-9b8f-61559e8ddd40\") " Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.889913 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-utilities" (OuterVolumeSpecName: "utilities") pod "669459aa-3d2d-4661-9b8f-61559e8ddd40" (UID: "669459aa-3d2d-4661-9b8f-61559e8ddd40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.896178 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/669459aa-3d2d-4661-9b8f-61559e8ddd40-kube-api-access-x4tj4" (OuterVolumeSpecName: "kube-api-access-x4tj4") pod "669459aa-3d2d-4661-9b8f-61559e8ddd40" (UID: "669459aa-3d2d-4661-9b8f-61559e8ddd40"). InnerVolumeSpecName "kube-api-access-x4tj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.913640 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "669459aa-3d2d-4661-9b8f-61559e8ddd40" (UID: "669459aa-3d2d-4661-9b8f-61559e8ddd40"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.991638 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.991670 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/669459aa-3d2d-4661-9b8f-61559e8ddd40-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 22:03:56 crc kubenswrapper[4885]: I0308 22:03:56.991704 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4tj4\" (UniqueName: \"kubernetes.io/projected/669459aa-3d2d-4661-9b8f-61559e8ddd40-kube-api-access-x4tj4\") on node \"crc\" DevicePath \"\"" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.258359 4885 generic.go:334] "Generic (PLEG): container finished" podID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerID="d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393" exitCode=0 Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.258432 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66f54" event={"ID":"669459aa-3d2d-4661-9b8f-61559e8ddd40","Type":"ContainerDied","Data":"d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393"} Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.258478 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-66f54" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.258565 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-66f54" event={"ID":"669459aa-3d2d-4661-9b8f-61559e8ddd40","Type":"ContainerDied","Data":"d89c93f64baeef2f098f805a0a92789e655f00e6b9b4104e174e019efa3c92c9"} Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.258610 4885 scope.go:117] "RemoveContainer" containerID="d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.299612 4885 scope.go:117] "RemoveContainer" containerID="d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.342010 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-66f54"] Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.346661 4885 scope.go:117] "RemoveContainer" containerID="6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.360513 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-66f54"] Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.395256 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" path="/var/lib/kubelet/pods/669459aa-3d2d-4661-9b8f-61559e8ddd40/volumes" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.401075 4885 scope.go:117] "RemoveContainer" containerID="d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393" Mar 08 22:03:57 crc kubenswrapper[4885]: E0308 22:03:57.401448 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393\": container with ID 
starting with d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393 not found: ID does not exist" containerID="d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.401486 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393"} err="failed to get container status \"d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393\": rpc error: code = NotFound desc = could not find container \"d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393\": container with ID starting with d761fb038b6337f0441e7c74ae30beec000a68f7db564a7f0d8620aaec5e2393 not found: ID does not exist" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.401512 4885 scope.go:117] "RemoveContainer" containerID="d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6" Mar 08 22:03:57 crc kubenswrapper[4885]: E0308 22:03:57.401757 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6\": container with ID starting with d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6 not found: ID does not exist" containerID="d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.401794 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6"} err="failed to get container status \"d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6\": rpc error: code = NotFound desc = could not find container \"d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6\": container with ID starting with d2d2a5f8305c68f3880dcbeabdc9638834591788af6d4e3632c0ccd437c2f6c6 not found: ID does not exist" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.401818 4885 scope.go:117] "RemoveContainer" containerID="6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e" Mar 08 22:03:57 crc kubenswrapper[4885]: E0308 22:03:57.402816 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e\": container with ID starting with 6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e not found: ID does not exist" containerID="6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e" Mar 08 22:03:57 crc kubenswrapper[4885]: I0308 22:03:57.402853 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e"} err="failed to get container status \"6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e\": rpc error: code = NotFound desc = could not find container \"6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e\": container with ID starting with 6ebaf6f41039a0a85c24937ac3f105dde1d51e466fb08a054b3692f35396cb8e not found: ID does not exist" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.141375 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550124-cd88x"] Mar 08 22:04:00 crc kubenswrapper[4885]: E0308 22:04:00.142325 4885 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="registry-server" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.142341 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="registry-server" Mar 08 22:04:00 crc kubenswrapper[4885]: E0308 22:04:00.142362 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="extract-content" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.142368 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="extract-content" Mar 08 22:04:00 crc kubenswrapper[4885]: E0308 22:04:00.142392 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="extract-utilities" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.142398 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="extract-utilities" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.142630 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="669459aa-3d2d-4661-9b8f-61559e8ddd40" containerName="registry-server" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.143387 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550124-cd88x" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.145721 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.145853 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.146178 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.157258 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550124-cd88x"] Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.265217 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp9tl\" (UniqueName: \"kubernetes.io/projected/31bb0f9f-aafb-4d40-9ef4-60deec075e85-kube-api-access-lp9tl\") pod \"auto-csr-approver-29550124-cd88x\" (UID: \"31bb0f9f-aafb-4d40-9ef4-60deec075e85\") " pod="openshift-infra/auto-csr-approver-29550124-cd88x" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.367005 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lp9tl\" (UniqueName: \"kubernetes.io/projected/31bb0f9f-aafb-4d40-9ef4-60deec075e85-kube-api-access-lp9tl\") pod \"auto-csr-approver-29550124-cd88x\" (UID: \"31bb0f9f-aafb-4d40-9ef4-60deec075e85\") " pod="openshift-infra/auto-csr-approver-29550124-cd88x" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.400253 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp9tl\" (UniqueName: \"kubernetes.io/projected/31bb0f9f-aafb-4d40-9ef4-60deec075e85-kube-api-access-lp9tl\") pod \"auto-csr-approver-29550124-cd88x\" (UID: \"31bb0f9f-aafb-4d40-9ef4-60deec075e85\") " pod="openshift-infra/auto-csr-approver-29550124-cd88x" Mar 08 22:04:00 crc kubenswrapper[4885]: I0308 22:04:00.462525 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550124-cd88x" Mar 08 22:04:01 crc kubenswrapper[4885]: I0308 22:04:01.041903 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550124-cd88x"] Mar 08 22:04:01 crc kubenswrapper[4885]: W0308 22:04:01.044382 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31bb0f9f_aafb_4d40_9ef4_60deec075e85.slice/crio-806d5e9f3f98a2608e784902f642760f4a083465d880a4492d648503a077a518 WatchSource:0}: Error finding container 806d5e9f3f98a2608e784902f642760f4a083465d880a4492d648503a077a518: Status 404 returned error can't find the container with id 806d5e9f3f98a2608e784902f642760f4a083465d880a4492d648503a077a518 Mar 08 22:04:01 crc kubenswrapper[4885]: I0308 22:04:01.306425 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550124-cd88x" event={"ID":"31bb0f9f-aafb-4d40-9ef4-60deec075e85","Type":"ContainerStarted","Data":"806d5e9f3f98a2608e784902f642760f4a083465d880a4492d648503a077a518"} Mar 08 22:04:03 crc kubenswrapper[4885]: I0308 22:04:03.337632 4885 generic.go:334] "Generic (PLEG): container finished" podID="31bb0f9f-aafb-4d40-9ef4-60deec075e85" containerID="1e7a14780a56f990fdb7ec2362f5fabc1bb27c3bceac15a1a207c4524477403d" exitCode=0 Mar 08 22:04:03 crc kubenswrapper[4885]: I0308 22:04:03.337791 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550124-cd88x" event={"ID":"31bb0f9f-aafb-4d40-9ef4-60deec075e85","Type":"ContainerDied","Data":"1e7a14780a56f990fdb7ec2362f5fabc1bb27c3bceac15a1a207c4524477403d"} Mar 08 22:04:04 crc kubenswrapper[4885]: I0308 22:04:04.368406 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:04:04 crc kubenswrapper[4885]: E0308 22:04:04.369137 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:04:05 crc kubenswrapper[4885]: I0308 22:04:05.065866 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550124-cd88x" Mar 08 22:04:05 crc kubenswrapper[4885]: I0308 22:04:05.208502 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lp9tl\" (UniqueName: \"kubernetes.io/projected/31bb0f9f-aafb-4d40-9ef4-60deec075e85-kube-api-access-lp9tl\") pod \"31bb0f9f-aafb-4d40-9ef4-60deec075e85\" (UID: \"31bb0f9f-aafb-4d40-9ef4-60deec075e85\") " Mar 08 22:04:05 crc kubenswrapper[4885]: I0308 22:04:05.214403 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31bb0f9f-aafb-4d40-9ef4-60deec075e85-kube-api-access-lp9tl" (OuterVolumeSpecName: "kube-api-access-lp9tl") pod "31bb0f9f-aafb-4d40-9ef4-60deec075e85" (UID: "31bb0f9f-aafb-4d40-9ef4-60deec075e85"). InnerVolumeSpecName "kube-api-access-lp9tl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:04:05 crc kubenswrapper[4885]: I0308 22:04:05.310974 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lp9tl\" (UniqueName: \"kubernetes.io/projected/31bb0f9f-aafb-4d40-9ef4-60deec075e85-kube-api-access-lp9tl\") on node \"crc\" DevicePath \"\"" Mar 08 22:04:05 crc kubenswrapper[4885]: I0308 22:04:05.364030 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550124-cd88x" event={"ID":"31bb0f9f-aafb-4d40-9ef4-60deec075e85","Type":"ContainerDied","Data":"806d5e9f3f98a2608e784902f642760f4a083465d880a4492d648503a077a518"} Mar 08 22:04:05 crc kubenswrapper[4885]: I0308 22:04:05.364069 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="806d5e9f3f98a2608e784902f642760f4a083465d880a4492d648503a077a518" Mar 08 22:04:05 crc kubenswrapper[4885]: I0308 22:04:05.364394 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550124-cd88x" Mar 08 22:04:06 crc kubenswrapper[4885]: I0308 22:04:06.162583 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550118-28jxb"] Mar 08 22:04:06 crc kubenswrapper[4885]: I0308 22:04:06.171840 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550118-28jxb"] Mar 08 22:04:07 crc kubenswrapper[4885]: I0308 22:04:07.402904 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc" path="/var/lib/kubelet/pods/0ddf386b-7eb9-4312-b1f0-8ce362b1f5dc/volumes" Mar 08 22:04:14 crc kubenswrapper[4885]: I0308 22:04:14.918125 4885 scope.go:117] "RemoveContainer" containerID="0d8b1adfd4f970a1d96597c5128035560d0e6296ee9788584fde3df1fcb99135" Mar 08 22:04:17 crc kubenswrapper[4885]: I0308 22:04:17.369298 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:04:17 crc kubenswrapper[4885]: E0308 22:04:17.369996 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.800932 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-58ck9/must-gather-x7xtd"] Mar 08 22:04:21 crc kubenswrapper[4885]: E0308 22:04:21.801864 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31bb0f9f-aafb-4d40-9ef4-60deec075e85" containerName="oc" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.801877 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="31bb0f9f-aafb-4d40-9ef4-60deec075e85" containerName="oc" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.802092 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="31bb0f9f-aafb-4d40-9ef4-60deec075e85" containerName="oc" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.811320 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.816443 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-58ck9"/"default-dockercfg-5z8m9" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.817407 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-58ck9"/"openshift-service-ca.crt" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.818643 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-58ck9"/"kube-root-ca.crt" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.835111 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/009e478e-8f33-43d1-aded-7d3084ed486e-must-gather-output\") pod \"must-gather-x7xtd\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.835220 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsvfg\" (UniqueName: \"kubernetes.io/projected/009e478e-8f33-43d1-aded-7d3084ed486e-kube-api-access-bsvfg\") pod \"must-gather-x7xtd\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.857777 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-58ck9/must-gather-x7xtd"] Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.936829 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/009e478e-8f33-43d1-aded-7d3084ed486e-must-gather-output\") pod \"must-gather-x7xtd\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.937199 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsvfg\" (UniqueName: \"kubernetes.io/projected/009e478e-8f33-43d1-aded-7d3084ed486e-kube-api-access-bsvfg\") pod \"must-gather-x7xtd\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.937485 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/009e478e-8f33-43d1-aded-7d3084ed486e-must-gather-output\") pod \"must-gather-x7xtd\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:21 crc kubenswrapper[4885]: I0308 22:04:21.957288 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsvfg\" (UniqueName: \"kubernetes.io/projected/009e478e-8f33-43d1-aded-7d3084ed486e-kube-api-access-bsvfg\") pod \"must-gather-x7xtd\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:22 crc kubenswrapper[4885]: I0308 22:04:22.149102 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:04:22 crc kubenswrapper[4885]: I0308 22:04:22.629113 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-58ck9/must-gather-x7xtd"] Mar 08 22:04:23 crc kubenswrapper[4885]: I0308 22:04:23.586530 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/must-gather-x7xtd" event={"ID":"009e478e-8f33-43d1-aded-7d3084ed486e","Type":"ContainerStarted","Data":"e258e3e225eab9a4f8266b5151c349549e25ac4eb436b890df34a0b800489fe9"} Mar 08 22:04:29 crc kubenswrapper[4885]: I0308 22:04:29.380428 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:04:29 crc kubenswrapper[4885]: E0308 22:04:29.381718 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:04:30 crc kubenswrapper[4885]: I0308 22:04:30.680877 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/must-gather-x7xtd" event={"ID":"009e478e-8f33-43d1-aded-7d3084ed486e","Type":"ContainerStarted","Data":"d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916"} Mar 08 22:04:30 crc kubenswrapper[4885]: I0308 22:04:30.682237 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/must-gather-x7xtd" event={"ID":"009e478e-8f33-43d1-aded-7d3084ed486e","Type":"ContainerStarted","Data":"995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b"} Mar 08 22:04:30 crc kubenswrapper[4885]: I0308 22:04:30.728487 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-58ck9/must-gather-x7xtd" podStartSLOduration=2.910304442 podStartE2EDuration="9.728452582s" podCreationTimestamp="2026-03-08 22:04:21 +0000 UTC" firstStartedPulling="2026-03-08 22:04:22.631689989 +0000 UTC m=+9164.027744052" lastFinishedPulling="2026-03-08 22:04:29.449838169 +0000 UTC m=+9170.845892192" observedRunningTime="2026-03-08 22:04:30.712025153 +0000 UTC m=+9172.108079206" watchObservedRunningTime="2026-03-08 22:04:30.728452582 +0000 UTC m=+9172.124506645" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.185754 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-58ck9/crc-debug-7nfbg"] Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.189717 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.317637 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnff6\" (UniqueName: \"kubernetes.io/projected/3d6c9d74-a8c6-4436-a834-8c339a59b15f-kube-api-access-hnff6\") pod \"crc-debug-7nfbg\" (UID: \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") " pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.317897 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d6c9d74-a8c6-4436-a834-8c339a59b15f-host\") pod \"crc-debug-7nfbg\" (UID: \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") " pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.420452 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnff6\" (UniqueName: \"kubernetes.io/projected/3d6c9d74-a8c6-4436-a834-8c339a59b15f-kube-api-access-hnff6\") pod \"crc-debug-7nfbg\" (UID: \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") " pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.420656 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d6c9d74-a8c6-4436-a834-8c339a59b15f-host\") pod \"crc-debug-7nfbg\" (UID: \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") " pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.420810 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d6c9d74-a8c6-4436-a834-8c339a59b15f-host\") pod \"crc-debug-7nfbg\" (UID: \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") " pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.439193 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnff6\" (UniqueName: \"kubernetes.io/projected/3d6c9d74-a8c6-4436-a834-8c339a59b15f-kube-api-access-hnff6\") pod \"crc-debug-7nfbg\" (UID: \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") " pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.512577 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:04:34 crc kubenswrapper[4885]: W0308 22:04:34.580624 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d6c9d74_a8c6_4436_a834_8c339a59b15f.slice/crio-beecb67750bb0ffbf8d767799c985561edec04ec3b01c071981683d2e27340ca WatchSource:0}: Error finding container beecb67750bb0ffbf8d767799c985561edec04ec3b01c071981683d2e27340ca: Status 404 returned error can't find the container with id beecb67750bb0ffbf8d767799c985561edec04ec3b01c071981683d2e27340ca Mar 08 22:04:34 crc kubenswrapper[4885]: I0308 22:04:34.740460 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/crc-debug-7nfbg" event={"ID":"3d6c9d74-a8c6-4436-a834-8c339a59b15f","Type":"ContainerStarted","Data":"beecb67750bb0ffbf8d767799c985561edec04ec3b01c071981683d2e27340ca"} Mar 08 22:04:40 crc kubenswrapper[4885]: I0308 22:04:40.368167 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:04:40 crc kubenswrapper[4885]: E0308 22:04:40.368820 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:04:46 crc kubenswrapper[4885]: I0308 22:04:46.856127 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/crc-debug-7nfbg" event={"ID":"3d6c9d74-a8c6-4436-a834-8c339a59b15f","Type":"ContainerStarted","Data":"409ba0e76cc123c34984a34f377c29a5545224b014b0468202c033f80128ed06"} Mar 08 22:04:46 crc kubenswrapper[4885]: I0308 22:04:46.872782 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-58ck9/crc-debug-7nfbg" podStartSLOduration=1.058227252 podStartE2EDuration="12.872766564s" podCreationTimestamp="2026-03-08 22:04:34 +0000 UTC" firstStartedPulling="2026-03-08 22:04:34.583958376 +0000 UTC m=+9175.980012399" lastFinishedPulling="2026-03-08 22:04:46.398497688 +0000 UTC m=+9187.794551711" observedRunningTime="2026-03-08 22:04:46.869977591 +0000 UTC m=+9188.266031634" watchObservedRunningTime="2026-03-08 22:04:46.872766564 +0000 UTC m=+9188.268820587" Mar 08 22:04:53 crc kubenswrapper[4885]: I0308 22:04:53.368821 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:04:53 crc kubenswrapper[4885]: E0308 22:04:53.369667 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:04:57 crc kubenswrapper[4885]: I0308 22:04:57.895199 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xdkwp"] Mar 08 22:04:57 crc kubenswrapper[4885]: I0308 22:04:57.897678 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:04:57 crc kubenswrapper[4885]: I0308 22:04:57.910168 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdkwp"] Mar 08 22:04:57 crc kubenswrapper[4885]: I0308 22:04:57.971716 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-catalog-content\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:04:57 crc kubenswrapper[4885]: I0308 22:04:57.971787 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkgfr\" (UniqueName: \"kubernetes.io/projected/ce29cc99-ca85-4a7b-b027-1bc84fa92252-kube-api-access-gkgfr\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:04:57 crc kubenswrapper[4885]: I0308 22:04:57.972017 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-utilities\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:04:58 crc kubenswrapper[4885]: I0308 22:04:58.073853 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-catalog-content\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:04:58 crc kubenswrapper[4885]: I0308 22:04:58.073943 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkgfr\" (UniqueName: \"kubernetes.io/projected/ce29cc99-ca85-4a7b-b027-1bc84fa92252-kube-api-access-gkgfr\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:04:58 crc kubenswrapper[4885]: I0308 22:04:58.073999 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-utilities\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:04:58 crc kubenswrapper[4885]: I0308 22:04:58.074595 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-utilities\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:04:58 crc kubenswrapper[4885]: I0308 22:04:58.074835 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-catalog-content\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:04:58 crc kubenswrapper[4885]: I0308 22:04:58.091867 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gkgfr\" (UniqueName: \"kubernetes.io/projected/ce29cc99-ca85-4a7b-b027-1bc84fa92252-kube-api-access-gkgfr\") pod \"certified-operators-xdkwp\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:04:59 crc kubenswrapper[4885]: I0308 22:04:59.859173 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:05:00 crc kubenswrapper[4885]: I0308 22:05:00.388817 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xdkwp"] Mar 08 22:05:00 crc kubenswrapper[4885]: I0308 22:05:00.996701 4885 generic.go:334] "Generic (PLEG): container finished" podID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerID="d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609" exitCode=0 Mar 08 22:05:00 crc kubenswrapper[4885]: I0308 22:05:00.996785 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdkwp" event={"ID":"ce29cc99-ca85-4a7b-b027-1bc84fa92252","Type":"ContainerDied","Data":"d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609"} Mar 08 22:05:00 crc kubenswrapper[4885]: I0308 22:05:00.997047 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdkwp" event={"ID":"ce29cc99-ca85-4a7b-b027-1bc84fa92252","Type":"ContainerStarted","Data":"8711031805b818de77f147045a567e51e1dcee443314467dfab3ca8218b78d52"} Mar 08 22:05:00 crc kubenswrapper[4885]: I0308 22:05:00.999658 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 22:05:03 crc kubenswrapper[4885]: I0308 22:05:03.020772 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdkwp" event={"ID":"ce29cc99-ca85-4a7b-b027-1bc84fa92252","Type":"ContainerStarted","Data":"8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2"} Mar 08 22:05:05 crc kubenswrapper[4885]: I0308 22:05:05.040657 4885 generic.go:334] "Generic (PLEG): container finished" podID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerID="8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2" exitCode=0 Mar 08 22:05:05 crc kubenswrapper[4885]: I0308 22:05:05.040745 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdkwp" event={"ID":"ce29cc99-ca85-4a7b-b027-1bc84fa92252","Type":"ContainerDied","Data":"8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2"} Mar 08 22:05:06 crc kubenswrapper[4885]: I0308 22:05:06.060257 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdkwp" event={"ID":"ce29cc99-ca85-4a7b-b027-1bc84fa92252","Type":"ContainerStarted","Data":"af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082"} Mar 08 22:05:06 crc kubenswrapper[4885]: I0308 22:05:06.088008 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xdkwp" podStartSLOduration=4.635462104 podStartE2EDuration="9.087983071s" podCreationTimestamp="2026-03-08 22:04:57 +0000 UTC" firstStartedPulling="2026-03-08 22:05:00.999350918 +0000 UTC m=+9202.395404941" lastFinishedPulling="2026-03-08 22:05:05.451871885 +0000 UTC m=+9206.847925908" observedRunningTime="2026-03-08 22:05:06.079854474 +0000 UTC m=+9207.475908497" watchObservedRunningTime="2026-03-08 
22:05:06.087983071 +0000 UTC m=+9207.484037124" Mar 08 22:05:08 crc kubenswrapper[4885]: I0308 22:05:08.369404 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:05:08 crc kubenswrapper[4885]: E0308 22:05:08.369812 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:05:09 crc kubenswrapper[4885]: I0308 22:05:09.089319 4885 generic.go:334] "Generic (PLEG): container finished" podID="3d6c9d74-a8c6-4436-a834-8c339a59b15f" containerID="409ba0e76cc123c34984a34f377c29a5545224b014b0468202c033f80128ed06" exitCode=0 Mar 08 22:05:09 crc kubenswrapper[4885]: I0308 22:05:09.089416 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/crc-debug-7nfbg" event={"ID":"3d6c9d74-a8c6-4436-a834-8c339a59b15f","Type":"ContainerDied","Data":"409ba0e76cc123c34984a34f377c29a5545224b014b0468202c033f80128ed06"} Mar 08 22:05:09 crc kubenswrapper[4885]: I0308 22:05:09.859849 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:05:09 crc kubenswrapper[4885]: I0308 22:05:09.859885 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:05:09 crc kubenswrapper[4885]: I0308 22:05:09.940357 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.154442 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.212652 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.217410 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdkwp"] Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.246024 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-58ck9/crc-debug-7nfbg"] Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.255353 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-58ck9/crc-debug-7nfbg"] Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.324736 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d6c9d74-a8c6-4436-a834-8c339a59b15f-host\") pod \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\" (UID: \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") " Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.325111 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnff6\" (UniqueName: \"kubernetes.io/projected/3d6c9d74-a8c6-4436-a834-8c339a59b15f-kube-api-access-hnff6\") pod \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\" (UID: \"3d6c9d74-a8c6-4436-a834-8c339a59b15f\") " Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.324855 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d6c9d74-a8c6-4436-a834-8c339a59b15f-host" (OuterVolumeSpecName: "host") pod "3d6c9d74-a8c6-4436-a834-8c339a59b15f" (UID: "3d6c9d74-a8c6-4436-a834-8c339a59b15f"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.325682 4885 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3d6c9d74-a8c6-4436-a834-8c339a59b15f-host\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.332868 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d6c9d74-a8c6-4436-a834-8c339a59b15f-kube-api-access-hnff6" (OuterVolumeSpecName: "kube-api-access-hnff6") pod "3d6c9d74-a8c6-4436-a834-8c339a59b15f" (UID: "3d6c9d74-a8c6-4436-a834-8c339a59b15f"). InnerVolumeSpecName "kube-api-access-hnff6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:05:10 crc kubenswrapper[4885]: I0308 22:05:10.427491 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnff6\" (UniqueName: \"kubernetes.io/projected/3d6c9d74-a8c6-4436-a834-8c339a59b15f-kube-api-access-hnff6\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.109370 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beecb67750bb0ffbf8d767799c985561edec04ec3b01c071981683d2e27340ca" Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.109419 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-7nfbg" Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.380832 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d6c9d74-a8c6-4436-a834-8c339a59b15f" path="/var/lib/kubelet/pods/3d6c9d74-a8c6-4436-a834-8c339a59b15f/volumes" Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.432017 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-58ck9/crc-debug-5zqd9"] Mar 08 22:05:11 crc kubenswrapper[4885]: E0308 22:05:11.432437 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d6c9d74-a8c6-4436-a834-8c339a59b15f" containerName="container-00" Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.432454 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d6c9d74-a8c6-4436-a834-8c339a59b15f" containerName="container-00" Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.432664 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d6c9d74-a8c6-4436-a834-8c339a59b15f" containerName="container-00" Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.433412 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-5zqd9" Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.549880 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6924f\" (UniqueName: \"kubernetes.io/projected/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-kube-api-access-6924f\") pod \"crc-debug-5zqd9\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") " pod="openshift-must-gather-58ck9/crc-debug-5zqd9" Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.550168 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-host\") pod \"crc-debug-5zqd9\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") " pod="openshift-must-gather-58ck9/crc-debug-5zqd9" Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.652724 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-host\") pod \"crc-debug-5zqd9\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") " pod="openshift-must-gather-58ck9/crc-debug-5zqd9" Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.652963 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6924f\" (UniqueName: \"kubernetes.io/projected/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-kube-api-access-6924f\") pod \"crc-debug-5zqd9\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") " pod="openshift-must-gather-58ck9/crc-debug-5zqd9" Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.652969 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-host\") pod \"crc-debug-5zqd9\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") " pod="openshift-must-gather-58ck9/crc-debug-5zqd9" Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.669597 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6924f\" (UniqueName: \"kubernetes.io/projected/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-kube-api-access-6924f\") pod \"crc-debug-5zqd9\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") " 
pod="openshift-must-gather-58ck9/crc-debug-5zqd9" Mar 08 22:05:11 crc kubenswrapper[4885]: I0308 22:05:11.754281 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-5zqd9" Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.118496 4885 generic.go:334] "Generic (PLEG): container finished" podID="ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733" containerID="857ab72a7c2126bc7b60e9aa2682b2b13391c1878713ec25e9c07f2cbc164789" exitCode=0 Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.118714 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/crc-debug-5zqd9" event={"ID":"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733","Type":"ContainerDied","Data":"857ab72a7c2126bc7b60e9aa2682b2b13391c1878713ec25e9c07f2cbc164789"} Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.118805 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/crc-debug-5zqd9" event={"ID":"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733","Type":"ContainerStarted","Data":"890a75bcfecb4469b6bad0f0bf13d255a9327ba2fbb2b8828d4b0a1a3e7278f9"} Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.118970 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xdkwp" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="registry-server" containerID="cri-o://af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082" gracePeriod=2 Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.316110 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-58ck9/crc-debug-5zqd9"] Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.329503 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-58ck9/crc-debug-5zqd9"] Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.596336 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.687413 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-utilities\") pod \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.687763 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-catalog-content\") pod \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.687822 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkgfr\" (UniqueName: \"kubernetes.io/projected/ce29cc99-ca85-4a7b-b027-1bc84fa92252-kube-api-access-gkgfr\") pod \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\" (UID: \"ce29cc99-ca85-4a7b-b027-1bc84fa92252\") " Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.688834 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-utilities" (OuterVolumeSpecName: "utilities") pod "ce29cc99-ca85-4a7b-b027-1bc84fa92252" (UID: "ce29cc99-ca85-4a7b-b027-1bc84fa92252"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.697568 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce29cc99-ca85-4a7b-b027-1bc84fa92252-kube-api-access-gkgfr" (OuterVolumeSpecName: "kube-api-access-gkgfr") pod "ce29cc99-ca85-4a7b-b027-1bc84fa92252" (UID: "ce29cc99-ca85-4a7b-b027-1bc84fa92252"). InnerVolumeSpecName "kube-api-access-gkgfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.770314 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce29cc99-ca85-4a7b-b027-1bc84fa92252" (UID: "ce29cc99-ca85-4a7b-b027-1bc84fa92252"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.791600 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.791639 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkgfr\" (UniqueName: \"kubernetes.io/projected/ce29cc99-ca85-4a7b-b027-1bc84fa92252-kube-api-access-gkgfr\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:12 crc kubenswrapper[4885]: I0308 22:05:12.791649 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce29cc99-ca85-4a7b-b027-1bc84fa92252-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.129821 4885 generic.go:334] "Generic (PLEG): container finished" podID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerID="af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082" exitCode=0 Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.129879 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdkwp" event={"ID":"ce29cc99-ca85-4a7b-b027-1bc84fa92252","Type":"ContainerDied","Data":"af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082"} Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.130192 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xdkwp" event={"ID":"ce29cc99-ca85-4a7b-b027-1bc84fa92252","Type":"ContainerDied","Data":"8711031805b818de77f147045a567e51e1dcee443314467dfab3ca8218b78d52"} Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.130213 4885 scope.go:117] "RemoveContainer" containerID="af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.129892 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xdkwp" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.234685 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-5zqd9" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.236534 4885 scope.go:117] "RemoveContainer" containerID="8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.260193 4885 scope.go:117] "RemoveContainer" containerID="d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.266830 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xdkwp"] Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.281053 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xdkwp"] Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.301266 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-host\") pod \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") " Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.301443 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6924f\" (UniqueName: \"kubernetes.io/projected/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-kube-api-access-6924f\") pod \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\" (UID: \"ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733\") " Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.303161 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-host" (OuterVolumeSpecName: "host") pod "ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733" (UID: "ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.308085 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-kube-api-access-6924f" (OuterVolumeSpecName: "kube-api-access-6924f") pod "ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733" (UID: "ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733"). InnerVolumeSpecName "kube-api-access-6924f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.328107 4885 scope.go:117] "RemoveContainer" containerID="af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082" Mar 08 22:05:13 crc kubenswrapper[4885]: E0308 22:05:13.332047 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082\": container with ID starting with af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082 not found: ID does not exist" containerID="af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.332090 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082"} err="failed to get container status \"af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082\": rpc error: code = NotFound desc = could not find container \"af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082\": container with ID starting with af484e991060196026c15fdf7c87df36dc0d42042fb1284e5fcd415eabcbd082 not found: ID does not exist" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.332114 4885 scope.go:117] "RemoveContainer" containerID="8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2" Mar 08 22:05:13 crc kubenswrapper[4885]: E0308 22:05:13.336062 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2\": container with ID starting with 8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2 not found: ID does not exist" containerID="8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.336107 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2"} err="failed to get container status \"8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2\": rpc error: code = NotFound desc = could not find container \"8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2\": container with ID starting with 8750c8a3009136cb71581176d7dd6fd3bbd417bba9e249aa5377941db799eac2 not found: ID does not exist" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.336135 4885 scope.go:117] "RemoveContainer" containerID="d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609" Mar 08 22:05:13 crc kubenswrapper[4885]: E0308 22:05:13.341083 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609\": container with ID starting with d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609 not found: ID does not exist" containerID="d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.341131 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609"} err="failed to get container status \"d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609\": rpc error: code = NotFound desc = could not 
find container \"d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609\": container with ID starting with d11873f54ccd794d68da3ee167d05a0bf2b0c3a0800efe0fbcd9693a4a641609 not found: ID does not exist" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.378622 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733" path="/var/lib/kubelet/pods/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733/volumes" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.379151 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" path="/var/lib/kubelet/pods/ce29cc99-ca85-4a7b-b027-1bc84fa92252/volumes" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.404095 4885 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-host\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.404123 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6924f\" (UniqueName: \"kubernetes.io/projected/ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733-kube-api-access-6924f\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.612935 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-58ck9/crc-debug-xq57v"] Mar 08 22:05:13 crc kubenswrapper[4885]: E0308 22:05:13.613383 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="extract-utilities" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.613404 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="extract-utilities" Mar 08 22:05:13 crc kubenswrapper[4885]: E0308 22:05:13.613420 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="registry-server" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.613426 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="registry-server" Mar 08 22:05:13 crc kubenswrapper[4885]: E0308 22:05:13.613441 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733" containerName="container-00" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.613448 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733" containerName="container-00" Mar 08 22:05:13 crc kubenswrapper[4885]: E0308 22:05:13.613468 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="extract-content" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.613473 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="extract-content" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.613663 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce29cc99-ca85-4a7b-b027-1bc84fa92252" containerName="registry-server" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.613680 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad4f9b3d-3c35-40c4-a2bd-d7ccbc5fa733" containerName="container-00" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.614403 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-xq57v" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.710707 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bcca738-8182-42d8-83d2-693323d43424-host\") pod \"crc-debug-xq57v\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " pod="openshift-must-gather-58ck9/crc-debug-xq57v" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.711147 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pdcd\" (UniqueName: \"kubernetes.io/projected/3bcca738-8182-42d8-83d2-693323d43424-kube-api-access-2pdcd\") pod \"crc-debug-xq57v\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " pod="openshift-must-gather-58ck9/crc-debug-xq57v" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.813437 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bcca738-8182-42d8-83d2-693323d43424-host\") pod \"crc-debug-xq57v\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " pod="openshift-must-gather-58ck9/crc-debug-xq57v" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.813591 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bcca738-8182-42d8-83d2-693323d43424-host\") pod \"crc-debug-xq57v\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " pod="openshift-must-gather-58ck9/crc-debug-xq57v" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.813603 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pdcd\" (UniqueName: \"kubernetes.io/projected/3bcca738-8182-42d8-83d2-693323d43424-kube-api-access-2pdcd\") pod \"crc-debug-xq57v\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " pod="openshift-must-gather-58ck9/crc-debug-xq57v" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.831607 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pdcd\" (UniqueName: \"kubernetes.io/projected/3bcca738-8182-42d8-83d2-693323d43424-kube-api-access-2pdcd\") pod \"crc-debug-xq57v\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " pod="openshift-must-gather-58ck9/crc-debug-xq57v" Mar 08 22:05:13 crc kubenswrapper[4885]: I0308 22:05:13.932241 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-xq57v" Mar 08 22:05:13 crc kubenswrapper[4885]: W0308 22:05:13.962260 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bcca738_8182_42d8_83d2_693323d43424.slice/crio-5397042cbd8e5a1efa8d5ba0ce10ab5986a028860625ac6d4f20c51a9b241a27 WatchSource:0}: Error finding container 5397042cbd8e5a1efa8d5ba0ce10ab5986a028860625ac6d4f20c51a9b241a27: Status 404 returned error can't find the container with id 5397042cbd8e5a1efa8d5ba0ce10ab5986a028860625ac6d4f20c51a9b241a27 Mar 08 22:05:14 crc kubenswrapper[4885]: I0308 22:05:14.141043 4885 scope.go:117] "RemoveContainer" containerID="857ab72a7c2126bc7b60e9aa2682b2b13391c1878713ec25e9c07f2cbc164789" Mar 08 22:05:14 crc kubenswrapper[4885]: I0308 22:05:14.141075 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-5zqd9" Mar 08 22:05:14 crc kubenswrapper[4885]: I0308 22:05:14.142319 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/crc-debug-xq57v" event={"ID":"3bcca738-8182-42d8-83d2-693323d43424","Type":"ContainerStarted","Data":"5397042cbd8e5a1efa8d5ba0ce10ab5986a028860625ac6d4f20c51a9b241a27"} Mar 08 22:05:15 crc kubenswrapper[4885]: I0308 22:05:15.155059 4885 generic.go:334] "Generic (PLEG): container finished" podID="3bcca738-8182-42d8-83d2-693323d43424" containerID="3087ec3b25ed2328bb7f3332ce72df4c652f348dc994dcb55a4b1d6924fcf298" exitCode=0 Mar 08 22:05:15 crc kubenswrapper[4885]: I0308 22:05:15.155150 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/crc-debug-xq57v" event={"ID":"3bcca738-8182-42d8-83d2-693323d43424","Type":"ContainerDied","Data":"3087ec3b25ed2328bb7f3332ce72df4c652f348dc994dcb55a4b1d6924fcf298"} Mar 08 22:05:15 crc kubenswrapper[4885]: I0308 22:05:15.207066 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-58ck9/crc-debug-xq57v"] Mar 08 22:05:15 crc kubenswrapper[4885]: I0308 22:05:15.211470 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-58ck9/crc-debug-xq57v"] Mar 08 22:05:16 crc kubenswrapper[4885]: I0308 22:05:16.278515 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-xq57v" Mar 08 22:05:16 crc kubenswrapper[4885]: I0308 22:05:16.374785 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pdcd\" (UniqueName: \"kubernetes.io/projected/3bcca738-8182-42d8-83d2-693323d43424-kube-api-access-2pdcd\") pod \"3bcca738-8182-42d8-83d2-693323d43424\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " Mar 08 22:05:16 crc kubenswrapper[4885]: I0308 22:05:16.375054 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bcca738-8182-42d8-83d2-693323d43424-host\") pod \"3bcca738-8182-42d8-83d2-693323d43424\" (UID: \"3bcca738-8182-42d8-83d2-693323d43424\") " Mar 08 22:05:16 crc kubenswrapper[4885]: I0308 22:05:16.375673 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3bcca738-8182-42d8-83d2-693323d43424-host" (OuterVolumeSpecName: "host") pod "3bcca738-8182-42d8-83d2-693323d43424" (UID: "3bcca738-8182-42d8-83d2-693323d43424"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 08 22:05:16 crc kubenswrapper[4885]: I0308 22:05:16.380456 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bcca738-8182-42d8-83d2-693323d43424-kube-api-access-2pdcd" (OuterVolumeSpecName: "kube-api-access-2pdcd") pod "3bcca738-8182-42d8-83d2-693323d43424" (UID: "3bcca738-8182-42d8-83d2-693323d43424"). InnerVolumeSpecName "kube-api-access-2pdcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:05:16 crc kubenswrapper[4885]: I0308 22:05:16.476954 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pdcd\" (UniqueName: \"kubernetes.io/projected/3bcca738-8182-42d8-83d2-693323d43424-kube-api-access-2pdcd\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:16 crc kubenswrapper[4885]: I0308 22:05:16.476980 4885 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3bcca738-8182-42d8-83d2-693323d43424-host\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:17 crc kubenswrapper[4885]: I0308 22:05:17.175756 4885 scope.go:117] "RemoveContainer" containerID="3087ec3b25ed2328bb7f3332ce72df4c652f348dc994dcb55a4b1d6924fcf298" Mar 08 22:05:17 crc kubenswrapper[4885]: I0308 22:05:17.175794 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/crc-debug-xq57v" Mar 08 22:05:17 crc kubenswrapper[4885]: I0308 22:05:17.380149 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bcca738-8182-42d8-83d2-693323d43424" path="/var/lib/kubelet/pods/3bcca738-8182-42d8-83d2-693323d43424/volumes" Mar 08 22:05:23 crc kubenswrapper[4885]: I0308 22:05:23.368406 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:05:23 crc kubenswrapper[4885]: E0308 22:05:23.369123 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:05:36 crc kubenswrapper[4885]: I0308 22:05:36.368768 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:05:36 crc kubenswrapper[4885]: E0308 22:05:36.369479 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.136990 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-p99b8"] Mar 08 22:05:42 crc kubenswrapper[4885]: E0308 22:05:42.139655 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bcca738-8182-42d8-83d2-693323d43424" containerName="container-00" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.139790 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bcca738-8182-42d8-83d2-693323d43424" containerName="container-00" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.140270 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bcca738-8182-42d8-83d2-693323d43424" containerName="container-00" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.142398 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.157342 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p99b8"] Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.245385 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq2qj\" (UniqueName: \"kubernetes.io/projected/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-kube-api-access-zq2qj\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.245634 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-catalog-content\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.245879 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-utilities\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.348135 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-utilities\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.348313 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq2qj\" (UniqueName: \"kubernetes.io/projected/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-kube-api-access-zq2qj\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.348506 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-catalog-content\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.349013 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-utilities\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.349175 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-catalog-content\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.367735 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zq2qj\" (UniqueName: \"kubernetes.io/projected/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-kube-api-access-zq2qj\") pod \"community-operators-p99b8\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:42 crc kubenswrapper[4885]: I0308 22:05:42.485782 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:43 crc kubenswrapper[4885]: I0308 22:05:43.042641 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-p99b8"] Mar 08 22:05:43 crc kubenswrapper[4885]: I0308 22:05:43.555833 4885 generic.go:334] "Generic (PLEG): container finished" podID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerID="ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9" exitCode=0 Mar 08 22:05:43 crc kubenswrapper[4885]: I0308 22:05:43.556009 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p99b8" event={"ID":"9302823a-d143-49a4-9fcc-1e27bcd7ecd4","Type":"ContainerDied","Data":"ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9"} Mar 08 22:05:43 crc kubenswrapper[4885]: I0308 22:05:43.556315 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p99b8" event={"ID":"9302823a-d143-49a4-9fcc-1e27bcd7ecd4","Type":"ContainerStarted","Data":"e7fc46e6a9bf19dbb89a694e6bf530cbb7a7bbe680bbca8623dbeae1349edcfd"} Mar 08 22:05:44 crc kubenswrapper[4885]: I0308 22:05:44.604604 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p99b8" event={"ID":"9302823a-d143-49a4-9fcc-1e27bcd7ecd4","Type":"ContainerStarted","Data":"68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a"} Mar 08 22:05:46 crc kubenswrapper[4885]: I0308 22:05:46.634382 4885 generic.go:334] "Generic (PLEG): container finished" podID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerID="68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a" exitCode=0 Mar 08 22:05:46 crc kubenswrapper[4885]: I0308 22:05:46.634512 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p99b8" event={"ID":"9302823a-d143-49a4-9fcc-1e27bcd7ecd4","Type":"ContainerDied","Data":"68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a"} Mar 08 22:05:47 crc kubenswrapper[4885]: I0308 22:05:47.650600 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p99b8" event={"ID":"9302823a-d143-49a4-9fcc-1e27bcd7ecd4","Type":"ContainerStarted","Data":"ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e"} Mar 08 22:05:47 crc kubenswrapper[4885]: I0308 22:05:47.680679 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-p99b8" podStartSLOduration=1.938572341 podStartE2EDuration="5.680651908s" podCreationTimestamp="2026-03-08 22:05:42 +0000 UTC" firstStartedPulling="2026-03-08 22:05:43.559027762 +0000 UTC m=+9244.955081795" lastFinishedPulling="2026-03-08 22:05:47.301107339 +0000 UTC m=+9248.697161362" observedRunningTime="2026-03-08 22:05:47.674111884 +0000 UTC m=+9249.070165907" watchObservedRunningTime="2026-03-08 22:05:47.680651908 +0000 UTC m=+9249.076705951" Mar 08 22:05:48 crc kubenswrapper[4885]: I0308 22:05:48.369507 4885 scope.go:117] "RemoveContainer" 
containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:05:48 crc kubenswrapper[4885]: E0308 22:05:48.370402 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:05:52 crc kubenswrapper[4885]: I0308 22:05:52.486071 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:52 crc kubenswrapper[4885]: I0308 22:05:52.486629 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:52 crc kubenswrapper[4885]: I0308 22:05:52.580958 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:52 crc kubenswrapper[4885]: I0308 22:05:52.815843 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:52 crc kubenswrapper[4885]: I0308 22:05:52.882540 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p99b8"] Mar 08 22:05:54 crc kubenswrapper[4885]: I0308 22:05:54.738403 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-p99b8" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="registry-server" containerID="cri-o://ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e" gracePeriod=2 Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.304360 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.379091 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-catalog-content\") pod \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.379229 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-utilities\") pod \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.379467 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq2qj\" (UniqueName: \"kubernetes.io/projected/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-kube-api-access-zq2qj\") pod \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\" (UID: \"9302823a-d143-49a4-9fcc-1e27bcd7ecd4\") " Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.380554 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-utilities" (OuterVolumeSpecName: "utilities") pod "9302823a-d143-49a4-9fcc-1e27bcd7ecd4" (UID: "9302823a-d143-49a4-9fcc-1e27bcd7ecd4"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.388389 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-kube-api-access-zq2qj" (OuterVolumeSpecName: "kube-api-access-zq2qj") pod "9302823a-d143-49a4-9fcc-1e27bcd7ecd4" (UID: "9302823a-d143-49a4-9fcc-1e27bcd7ecd4"). InnerVolumeSpecName "kube-api-access-zq2qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.459674 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9302823a-d143-49a4-9fcc-1e27bcd7ecd4" (UID: "9302823a-d143-49a4-9fcc-1e27bcd7ecd4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.482565 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.483063 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.483248 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq2qj\" (UniqueName: \"kubernetes.io/projected/9302823a-d143-49a4-9fcc-1e27bcd7ecd4-kube-api-access-zq2qj\") on node \"crc\" DevicePath \"\"" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.753083 4885 generic.go:334] "Generic (PLEG): container finished" podID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerID="ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e" exitCode=0 Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.753162 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p99b8" event={"ID":"9302823a-d143-49a4-9fcc-1e27bcd7ecd4","Type":"ContainerDied","Data":"ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e"} Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.753204 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-p99b8" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.753240 4885 scope.go:117] "RemoveContainer" containerID="ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.753220 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-p99b8" event={"ID":"9302823a-d143-49a4-9fcc-1e27bcd7ecd4","Type":"ContainerDied","Data":"e7fc46e6a9bf19dbb89a694e6bf530cbb7a7bbe680bbca8623dbeae1349edcfd"} Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.777153 4885 scope.go:117] "RemoveContainer" containerID="68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.810309 4885 scope.go:117] "RemoveContainer" containerID="ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.819260 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-p99b8"] Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.840553 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-p99b8"] Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.870328 4885 scope.go:117] "RemoveContainer" containerID="ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e" Mar 08 22:05:55 crc kubenswrapper[4885]: E0308 22:05:55.870961 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e\": container with ID starting with ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e not found: ID does not exist" containerID="ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.871040 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e"} err="failed to get container status \"ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e\": rpc error: code = NotFound desc = could not find container \"ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e\": container with ID starting with ad62e49a13cb1e0a0447d4b5854caf4f4274937dbc213db802545e30cebe6c8e not found: ID does not exist" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.871083 4885 scope.go:117] "RemoveContainer" containerID="68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a" Mar 08 22:05:55 crc kubenswrapper[4885]: E0308 22:05:55.871729 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a\": container with ID starting with 68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a not found: ID does not exist" containerID="68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.871780 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a"} err="failed to get container status \"68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a\": rpc error: code = NotFound desc = could not find 
container \"68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a\": container with ID starting with 68c482e17d8eb101087d49e618f2bb23c1732e0cc6e24fa23d46e5ef6aa3491a not found: ID does not exist" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.871807 4885 scope.go:117] "RemoveContainer" containerID="ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9" Mar 08 22:05:55 crc kubenswrapper[4885]: E0308 22:05:55.872218 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9\": container with ID starting with ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9 not found: ID does not exist" containerID="ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9" Mar 08 22:05:55 crc kubenswrapper[4885]: I0308 22:05:55.872247 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9"} err="failed to get container status \"ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9\": rpc error: code = NotFound desc = could not find container \"ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9\": container with ID starting with ffd5282d428ea2b48eda3d81eae1002a796b8acd9e197e4b98313fff88e2a6a9 not found: ID does not exist" Mar 08 22:05:57 crc kubenswrapper[4885]: I0308 22:05:57.385259 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" path="/var/lib/kubelet/pods/9302823a-d143-49a4-9fcc-1e27bcd7ecd4/volumes" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.152485 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550126-tpkcw"] Mar 08 22:06:00 crc kubenswrapper[4885]: E0308 22:06:00.153451 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="extract-content" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.153473 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="extract-content" Mar 08 22:06:00 crc kubenswrapper[4885]: E0308 22:06:00.153514 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="extract-utilities" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.153528 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="extract-utilities" Mar 08 22:06:00 crc kubenswrapper[4885]: E0308 22:06:00.153556 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="registry-server" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.153571 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="registry-server" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.153964 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="9302823a-d143-49a4-9fcc-1e27bcd7ecd4" containerName="registry-server" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.155228 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550126-tpkcw" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.161439 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.161617 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.161645 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.162223 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550126-tpkcw"] Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.296434 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfqv8\" (UniqueName: \"kubernetes.io/projected/db2ac46a-e5ce-45f0-8d95-2f520eebd199-kube-api-access-gfqv8\") pod \"auto-csr-approver-29550126-tpkcw\" (UID: \"db2ac46a-e5ce-45f0-8d95-2f520eebd199\") " pod="openshift-infra/auto-csr-approver-29550126-tpkcw" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.398652 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfqv8\" (UniqueName: \"kubernetes.io/projected/db2ac46a-e5ce-45f0-8d95-2f520eebd199-kube-api-access-gfqv8\") pod \"auto-csr-approver-29550126-tpkcw\" (UID: \"db2ac46a-e5ce-45f0-8d95-2f520eebd199\") " pod="openshift-infra/auto-csr-approver-29550126-tpkcw" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.430582 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfqv8\" (UniqueName: \"kubernetes.io/projected/db2ac46a-e5ce-45f0-8d95-2f520eebd199-kube-api-access-gfqv8\") pod \"auto-csr-approver-29550126-tpkcw\" (UID: \"db2ac46a-e5ce-45f0-8d95-2f520eebd199\") " pod="openshift-infra/auto-csr-approver-29550126-tpkcw" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.479824 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550126-tpkcw" Mar 08 22:06:00 crc kubenswrapper[4885]: I0308 22:06:00.833314 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550126-tpkcw"] Mar 08 22:06:01 crc kubenswrapper[4885]: I0308 22:06:01.823702 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550126-tpkcw" event={"ID":"db2ac46a-e5ce-45f0-8d95-2f520eebd199","Type":"ContainerStarted","Data":"60ff3555b754410dd5d419bfe19d69a36bd479a6856103456563a59ab6dfd34a"} Mar 08 22:06:02 crc kubenswrapper[4885]: I0308 22:06:02.368640 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:06:02 crc kubenswrapper[4885]: E0308 22:06:02.369632 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:06:02 crc kubenswrapper[4885]: I0308 22:06:02.844097 4885 generic.go:334] "Generic (PLEG): container finished" podID="db2ac46a-e5ce-45f0-8d95-2f520eebd199" containerID="78835dd04d2354f01f1264d7b0e37072d10df2af40d9fb9f18dcf2dd6bfeda09" exitCode=0 Mar 08 22:06:02 crc kubenswrapper[4885]: I0308 22:06:02.844167 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550126-tpkcw" event={"ID":"db2ac46a-e5ce-45f0-8d95-2f520eebd199","Type":"ContainerDied","Data":"78835dd04d2354f01f1264d7b0e37072d10df2af40d9fb9f18dcf2dd6bfeda09"} Mar 08 22:06:04 crc kubenswrapper[4885]: I0308 22:06:04.361257 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550126-tpkcw" Mar 08 22:06:04 crc kubenswrapper[4885]: I0308 22:06:04.544326 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfqv8\" (UniqueName: \"kubernetes.io/projected/db2ac46a-e5ce-45f0-8d95-2f520eebd199-kube-api-access-gfqv8\") pod \"db2ac46a-e5ce-45f0-8d95-2f520eebd199\" (UID: \"db2ac46a-e5ce-45f0-8d95-2f520eebd199\") " Mar 08 22:06:04 crc kubenswrapper[4885]: I0308 22:06:04.554369 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db2ac46a-e5ce-45f0-8d95-2f520eebd199-kube-api-access-gfqv8" (OuterVolumeSpecName: "kube-api-access-gfqv8") pod "db2ac46a-e5ce-45f0-8d95-2f520eebd199" (UID: "db2ac46a-e5ce-45f0-8d95-2f520eebd199"). InnerVolumeSpecName "kube-api-access-gfqv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:06:04 crc kubenswrapper[4885]: I0308 22:06:04.648214 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfqv8\" (UniqueName: \"kubernetes.io/projected/db2ac46a-e5ce-45f0-8d95-2f520eebd199-kube-api-access-gfqv8\") on node \"crc\" DevicePath \"\"" Mar 08 22:06:04 crc kubenswrapper[4885]: I0308 22:06:04.872972 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550126-tpkcw" event={"ID":"db2ac46a-e5ce-45f0-8d95-2f520eebd199","Type":"ContainerDied","Data":"60ff3555b754410dd5d419bfe19d69a36bd479a6856103456563a59ab6dfd34a"} Mar 08 22:06:04 crc kubenswrapper[4885]: I0308 22:06:04.873028 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60ff3555b754410dd5d419bfe19d69a36bd479a6856103456563a59ab6dfd34a" Mar 08 22:06:04 crc kubenswrapper[4885]: I0308 22:06:04.873061 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550126-tpkcw" Mar 08 22:06:05 crc kubenswrapper[4885]: I0308 22:06:05.463205 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550120-gtf7r"] Mar 08 22:06:05 crc kubenswrapper[4885]: I0308 22:06:05.475228 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550120-gtf7r"] Mar 08 22:06:07 crc kubenswrapper[4885]: I0308 22:06:07.381856 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df80a103-8bbc-4c66-8995-05152b8b9b66" path="/var/lib/kubelet/pods/df80a103-8bbc-4c66-8995-05152b8b9b66/volumes" Mar 08 22:06:13 crc kubenswrapper[4885]: I0308 22:06:13.369767 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:06:14 crc kubenswrapper[4885]: I0308 22:06:14.013760 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"476319ccbb7b7f2c7f41cc8d4d80ef11367ce5ac8d17f701827d65c4b2326b5d"} Mar 08 22:06:15 crc kubenswrapper[4885]: I0308 22:06:15.129474 4885 scope.go:117] "RemoveContainer" containerID="c7a9de3d006a67b60be16dde31bd3a619a9734cffde62a18b4a5fd2544360347" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.148195 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550128-4vqrp"] Mar 08 22:08:00 crc kubenswrapper[4885]: E0308 22:08:00.161261 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db2ac46a-e5ce-45f0-8d95-2f520eebd199" containerName="oc" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.161298 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="db2ac46a-e5ce-45f0-8d95-2f520eebd199" containerName="oc" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.163746 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="db2ac46a-e5ce-45f0-8d95-2f520eebd199" containerName="oc" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.165893 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.170745 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.174742 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.181879 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.185797 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550128-4vqrp"] Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.324156 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68bb8\" (UniqueName: \"kubernetes.io/projected/16108583-f398-4571-9e1c-41d86a071331-kube-api-access-68bb8\") pod \"auto-csr-approver-29550128-4vqrp\" (UID: \"16108583-f398-4571-9e1c-41d86a071331\") " pod="openshift-infra/auto-csr-approver-29550128-4vqrp" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.425837 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68bb8\" (UniqueName: \"kubernetes.io/projected/16108583-f398-4571-9e1c-41d86a071331-kube-api-access-68bb8\") pod \"auto-csr-approver-29550128-4vqrp\" (UID: \"16108583-f398-4571-9e1c-41d86a071331\") " pod="openshift-infra/auto-csr-approver-29550128-4vqrp" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.444839 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68bb8\" (UniqueName: \"kubernetes.io/projected/16108583-f398-4571-9e1c-41d86a071331-kube-api-access-68bb8\") pod \"auto-csr-approver-29550128-4vqrp\" (UID: \"16108583-f398-4571-9e1c-41d86a071331\") " pod="openshift-infra/auto-csr-approver-29550128-4vqrp" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.517543 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" Mar 08 22:08:00 crc kubenswrapper[4885]: I0308 22:08:00.994872 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550128-4vqrp"] Mar 08 22:08:01 crc kubenswrapper[4885]: I0308 22:08:01.402986 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" event={"ID":"16108583-f398-4571-9e1c-41d86a071331","Type":"ContainerStarted","Data":"8001d74b9a3b1e54a04b1bf1d770af365d07f5b75b5365df220fd54ba8fbf0f6"} Mar 08 22:08:02 crc kubenswrapper[4885]: I0308 22:08:02.413349 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" event={"ID":"16108583-f398-4571-9e1c-41d86a071331","Type":"ContainerStarted","Data":"359909f363078610b8800c0c14bcbb9a70bea6cbeb9a5e7e55a65cb5c9ec4e4c"} Mar 08 22:08:02 crc kubenswrapper[4885]: I0308 22:08:02.433452 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" podStartSLOduration=1.5373337710000001 podStartE2EDuration="2.433427137s" podCreationTimestamp="2026-03-08 22:08:00 +0000 UTC" firstStartedPulling="2026-03-08 22:08:00.993536798 +0000 UTC m=+9382.389590831" lastFinishedPulling="2026-03-08 22:08:01.889630164 +0000 UTC m=+9383.285684197" observedRunningTime="2026-03-08 22:08:02.42528097 +0000 UTC m=+9383.821335003" watchObservedRunningTime="2026-03-08 22:08:02.433427137 +0000 UTC m=+9383.829481170" Mar 08 22:08:03 crc kubenswrapper[4885]: I0308 22:08:03.426311 4885 generic.go:334] "Generic (PLEG): container finished" podID="16108583-f398-4571-9e1c-41d86a071331" containerID="359909f363078610b8800c0c14bcbb9a70bea6cbeb9a5e7e55a65cb5c9ec4e4c" exitCode=0 Mar 08 22:08:03 crc kubenswrapper[4885]: I0308 22:08:03.426418 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" event={"ID":"16108583-f398-4571-9e1c-41d86a071331","Type":"ContainerDied","Data":"359909f363078610b8800c0c14bcbb9a70bea6cbeb9a5e7e55a65cb5c9ec4e4c"} Mar 08 22:08:04 crc kubenswrapper[4885]: I0308 22:08:04.931066 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.035811 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68bb8\" (UniqueName: \"kubernetes.io/projected/16108583-f398-4571-9e1c-41d86a071331-kube-api-access-68bb8\") pod \"16108583-f398-4571-9e1c-41d86a071331\" (UID: \"16108583-f398-4571-9e1c-41d86a071331\") " Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.042503 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16108583-f398-4571-9e1c-41d86a071331-kube-api-access-68bb8" (OuterVolumeSpecName: "kube-api-access-68bb8") pod "16108583-f398-4571-9e1c-41d86a071331" (UID: "16108583-f398-4571-9e1c-41d86a071331"). InnerVolumeSpecName "kube-api-access-68bb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.142487 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68bb8\" (UniqueName: \"kubernetes.io/projected/16108583-f398-4571-9e1c-41d86a071331-kube-api-access-68bb8\") on node \"crc\" DevicePath \"\"" Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.475369 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" event={"ID":"16108583-f398-4571-9e1c-41d86a071331","Type":"ContainerDied","Data":"8001d74b9a3b1e54a04b1bf1d770af365d07f5b75b5365df220fd54ba8fbf0f6"} Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.475423 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550128-4vqrp" Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.475427 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8001d74b9a3b1e54a04b1bf1d770af365d07f5b75b5365df220fd54ba8fbf0f6" Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.535418 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550122-qd8zm"] Mar 08 22:08:05 crc kubenswrapper[4885]: I0308 22:08:05.549730 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550122-qd8zm"] Mar 08 22:08:07 crc kubenswrapper[4885]: I0308 22:08:07.392508 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fd3f7a8-4264-431e-b87b-9a60f7133767" path="/var/lib/kubelet/pods/9fd3f7a8-4264-431e-b87b-9a60f7133767/volumes" Mar 08 22:08:15 crc kubenswrapper[4885]: I0308 22:08:15.342869 4885 scope.go:117] "RemoveContainer" containerID="585a20322b6f004269b069c692b948e45ca9aa16a182a4437e57d97dc9bea430" Mar 08 22:08:32 crc kubenswrapper[4885]: I0308 22:08:32.818481 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:08:32 crc kubenswrapper[4885]: I0308 22:08:32.819049 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:09:02 crc kubenswrapper[4885]: I0308 22:09:02.822966 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:09:02 crc kubenswrapper[4885]: I0308 22:09:02.823652 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:09:32 crc kubenswrapper[4885]: I0308 22:09:32.818188 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:09:32 crc kubenswrapper[4885]: I0308 22:09:32.818801 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:09:32 crc kubenswrapper[4885]: I0308 22:09:32.818859 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 22:09:32 crc kubenswrapper[4885]: I0308 22:09:32.820013 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"476319ccbb7b7f2c7f41cc8d4d80ef11367ce5ac8d17f701827d65c4b2326b5d"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 22:09:32 crc kubenswrapper[4885]: I0308 22:09:32.820108 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://476319ccbb7b7f2c7f41cc8d4d80ef11367ce5ac8d17f701827d65c4b2326b5d" gracePeriod=600 Mar 08 22:09:33 crc kubenswrapper[4885]: I0308 22:09:33.297047 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="476319ccbb7b7f2c7f41cc8d4d80ef11367ce5ac8d17f701827d65c4b2326b5d" exitCode=0 Mar 08 22:09:33 crc kubenswrapper[4885]: I0308 22:09:33.297457 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"476319ccbb7b7f2c7f41cc8d4d80ef11367ce5ac8d17f701827d65c4b2326b5d"} Mar 08 22:09:33 crc kubenswrapper[4885]: I0308 22:09:33.297536 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7"} Mar 08 22:09:33 crc kubenswrapper[4885]: I0308 22:09:33.297569 4885 scope.go:117] "RemoveContainer" containerID="35030fe50ef0079f2dd98a1c85cff074827700db4ea2d4ea9c8d6c97231eccae" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.169589 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550130-q8btr"] Mar 08 22:10:00 crc kubenswrapper[4885]: E0308 22:10:00.171316 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16108583-f398-4571-9e1c-41d86a071331" containerName="oc" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.171337 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="16108583-f398-4571-9e1c-41d86a071331" containerName="oc" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.171607 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="16108583-f398-4571-9e1c-41d86a071331" containerName="oc" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.172626 4885 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550130-q8btr" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.182851 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.182907 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.182989 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.186202 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550130-q8btr"] Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.286638 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx47z\" (UniqueName: \"kubernetes.io/projected/8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6-kube-api-access-rx47z\") pod \"auto-csr-approver-29550130-q8btr\" (UID: \"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6\") " pod="openshift-infra/auto-csr-approver-29550130-q8btr" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.389382 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx47z\" (UniqueName: \"kubernetes.io/projected/8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6-kube-api-access-rx47z\") pod \"auto-csr-approver-29550130-q8btr\" (UID: \"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6\") " pod="openshift-infra/auto-csr-approver-29550130-q8btr" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.416064 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx47z\" (UniqueName: \"kubernetes.io/projected/8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6-kube-api-access-rx47z\") pod \"auto-csr-approver-29550130-q8btr\" (UID: \"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6\") " pod="openshift-infra/auto-csr-approver-29550130-q8btr" Mar 08 22:10:00 crc kubenswrapper[4885]: I0308 22:10:00.505480 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550130-q8btr" Mar 08 22:10:01 crc kubenswrapper[4885]: W0308 22:10:01.100820 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8bf31f87_6e2d_4ae5_81e7_e3d501dc03d6.slice/crio-e342f0bf6b711f9a56c2b23b7280c11c8fe1f105d4637e8c432387346f618ff7 WatchSource:0}: Error finding container e342f0bf6b711f9a56c2b23b7280c11c8fe1f105d4637e8c432387346f618ff7: Status 404 returned error can't find the container with id e342f0bf6b711f9a56c2b23b7280c11c8fe1f105d4637e8c432387346f618ff7 Mar 08 22:10:01 crc kubenswrapper[4885]: I0308 22:10:01.105545 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 22:10:01 crc kubenswrapper[4885]: I0308 22:10:01.115962 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550130-q8btr"] Mar 08 22:10:01 crc kubenswrapper[4885]: I0308 22:10:01.624823 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550130-q8btr" event={"ID":"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6","Type":"ContainerStarted","Data":"e342f0bf6b711f9a56c2b23b7280c11c8fe1f105d4637e8c432387346f618ff7"} Mar 08 22:10:02 crc kubenswrapper[4885]: I0308 22:10:02.638753 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550130-q8btr" event={"ID":"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6","Type":"ContainerStarted","Data":"1c5174db17fa21586bec90f86258445c10bafc4fb6675bd3f58ffbbc2c682873"} Mar 08 22:10:02 crc kubenswrapper[4885]: I0308 22:10:02.668966 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550130-q8btr" podStartSLOduration=1.531450139 podStartE2EDuration="2.668892527s" podCreationTimestamp="2026-03-08 22:10:00 +0000 UTC" firstStartedPulling="2026-03-08 22:10:01.105283415 +0000 UTC m=+9502.501337438" lastFinishedPulling="2026-03-08 22:10:02.242725803 +0000 UTC m=+9503.638779826" observedRunningTime="2026-03-08 22:10:02.657591235 +0000 UTC m=+9504.053645298" watchObservedRunningTime="2026-03-08 22:10:02.668892527 +0000 UTC m=+9504.064946590" Mar 08 22:10:03 crc kubenswrapper[4885]: I0308 22:10:03.657352 4885 generic.go:334] "Generic (PLEG): container finished" podID="8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6" containerID="1c5174db17fa21586bec90f86258445c10bafc4fb6675bd3f58ffbbc2c682873" exitCode=0 Mar 08 22:10:03 crc kubenswrapper[4885]: I0308 22:10:03.657640 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550130-q8btr" event={"ID":"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6","Type":"ContainerDied","Data":"1c5174db17fa21586bec90f86258445c10bafc4fb6675bd3f58ffbbc2c682873"} Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.168624 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550130-q8btr" Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.227529 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx47z\" (UniqueName: \"kubernetes.io/projected/8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6-kube-api-access-rx47z\") pod \"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6\" (UID: \"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6\") " Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.241156 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6-kube-api-access-rx47z" (OuterVolumeSpecName: "kube-api-access-rx47z") pod "8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6" (UID: "8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6"). InnerVolumeSpecName "kube-api-access-rx47z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.329517 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx47z\" (UniqueName: \"kubernetes.io/projected/8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6-kube-api-access-rx47z\") on node \"crc\" DevicePath \"\"" Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.691672 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550130-q8btr" event={"ID":"8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6","Type":"ContainerDied","Data":"e342f0bf6b711f9a56c2b23b7280c11c8fe1f105d4637e8c432387346f618ff7"} Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.692101 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e342f0bf6b711f9a56c2b23b7280c11c8fe1f105d4637e8c432387346f618ff7" Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.692194 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550130-q8btr" Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.752537 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550124-cd88x"] Mar 08 22:10:05 crc kubenswrapper[4885]: I0308 22:10:05.764193 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550124-cd88x"] Mar 08 22:10:07 crc kubenswrapper[4885]: I0308 22:10:07.385055 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31bb0f9f-aafb-4d40-9ef4-60deec075e85" path="/var/lib/kubelet/pods/31bb0f9f-aafb-4d40-9ef4-60deec075e85/volumes" Mar 08 22:10:15 crc kubenswrapper[4885]: I0308 22:10:15.470277 4885 scope.go:117] "RemoveContainer" containerID="1e7a14780a56f990fdb7ec2362f5fabc1bb27c3bceac15a1a207c4524477403d" Mar 08 22:11:15 crc kubenswrapper[4885]: I0308 22:11:15.605890 4885 scope.go:117] "RemoveContainer" containerID="409ba0e76cc123c34984a34f377c29a5545224b014b0468202c033f80128ed06" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.161299 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550132-jwbgs"] Mar 08 22:12:00 crc kubenswrapper[4885]: E0308 22:12:00.162736 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6" containerName="oc" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.162760 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6" containerName="oc" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.163216 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6" containerName="oc" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.164648 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550132-jwbgs" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.168571 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.168777 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.168976 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.182449 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550132-jwbgs"] Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.255418 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsm5t\" (UniqueName: \"kubernetes.io/projected/5350f846-ee1f-400b-8579-de1a56050f02-kube-api-access-lsm5t\") pod \"auto-csr-approver-29550132-jwbgs\" (UID: \"5350f846-ee1f-400b-8579-de1a56050f02\") " pod="openshift-infra/auto-csr-approver-29550132-jwbgs" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.358008 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsm5t\" (UniqueName: \"kubernetes.io/projected/5350f846-ee1f-400b-8579-de1a56050f02-kube-api-access-lsm5t\") pod \"auto-csr-approver-29550132-jwbgs\" (UID: \"5350f846-ee1f-400b-8579-de1a56050f02\") " pod="openshift-infra/auto-csr-approver-29550132-jwbgs" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.381563 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsm5t\" (UniqueName: \"kubernetes.io/projected/5350f846-ee1f-400b-8579-de1a56050f02-kube-api-access-lsm5t\") pod \"auto-csr-approver-29550132-jwbgs\" (UID: \"5350f846-ee1f-400b-8579-de1a56050f02\") " pod="openshift-infra/auto-csr-approver-29550132-jwbgs" Mar 08 22:12:00 crc kubenswrapper[4885]: I0308 22:12:00.501367 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550132-jwbgs" Mar 08 22:12:01 crc kubenswrapper[4885]: I0308 22:12:01.033521 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550132-jwbgs"] Mar 08 22:12:01 crc kubenswrapper[4885]: I0308 22:12:01.321363 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550132-jwbgs" event={"ID":"5350f846-ee1f-400b-8579-de1a56050f02","Type":"ContainerStarted","Data":"805b09ae284c11fa6a8b707f213c4248378515482b6ac3e4219b3e4422b2572c"} Mar 08 22:12:02 crc kubenswrapper[4885]: I0308 22:12:02.818459 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:12:02 crc kubenswrapper[4885]: I0308 22:12:02.819001 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:12:03 crc kubenswrapper[4885]: I0308 22:12:03.354671 4885 generic.go:334] "Generic (PLEG): container finished" podID="5350f846-ee1f-400b-8579-de1a56050f02" containerID="381bb8f225c03be035f053937f74c9493566bd9f87da1d7c680e81f6170500d2" exitCode=0 Mar 08 22:12:03 crc kubenswrapper[4885]: I0308 22:12:03.354724 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550132-jwbgs" event={"ID":"5350f846-ee1f-400b-8579-de1a56050f02","Type":"ContainerDied","Data":"381bb8f225c03be035f053937f74c9493566bd9f87da1d7c680e81f6170500d2"} Mar 08 22:12:04 crc kubenswrapper[4885]: I0308 22:12:04.759151 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550132-jwbgs" Mar 08 22:12:04 crc kubenswrapper[4885]: I0308 22:12:04.888583 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsm5t\" (UniqueName: \"kubernetes.io/projected/5350f846-ee1f-400b-8579-de1a56050f02-kube-api-access-lsm5t\") pod \"5350f846-ee1f-400b-8579-de1a56050f02\" (UID: \"5350f846-ee1f-400b-8579-de1a56050f02\") " Mar 08 22:12:04 crc kubenswrapper[4885]: I0308 22:12:04.895261 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5350f846-ee1f-400b-8579-de1a56050f02-kube-api-access-lsm5t" (OuterVolumeSpecName: "kube-api-access-lsm5t") pod "5350f846-ee1f-400b-8579-de1a56050f02" (UID: "5350f846-ee1f-400b-8579-de1a56050f02"). InnerVolumeSpecName "kube-api-access-lsm5t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:12:04 crc kubenswrapper[4885]: I0308 22:12:04.992169 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsm5t\" (UniqueName: \"kubernetes.io/projected/5350f846-ee1f-400b-8579-de1a56050f02-kube-api-access-lsm5t\") on node \"crc\" DevicePath \"\"" Mar 08 22:12:05 crc kubenswrapper[4885]: I0308 22:12:05.378974 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550132-jwbgs" Mar 08 22:12:05 crc kubenswrapper[4885]: I0308 22:12:05.393157 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550132-jwbgs" event={"ID":"5350f846-ee1f-400b-8579-de1a56050f02","Type":"ContainerDied","Data":"805b09ae284c11fa6a8b707f213c4248378515482b6ac3e4219b3e4422b2572c"} Mar 08 22:12:05 crc kubenswrapper[4885]: I0308 22:12:05.393218 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="805b09ae284c11fa6a8b707f213c4248378515482b6ac3e4219b3e4422b2572c" Mar 08 22:12:05 crc kubenswrapper[4885]: I0308 22:12:05.852197 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550126-tpkcw"] Mar 08 22:12:05 crc kubenswrapper[4885]: I0308 22:12:05.865246 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550126-tpkcw"] Mar 08 22:12:07 crc kubenswrapper[4885]: I0308 22:12:07.384148 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db2ac46a-e5ce-45f0-8d95-2f520eebd199" path="/var/lib/kubelet/pods/db2ac46a-e5ce-45f0-8d95-2f520eebd199/volumes" Mar 08 22:12:15 crc kubenswrapper[4885]: I0308 22:12:15.700776 4885 scope.go:117] "RemoveContainer" containerID="78835dd04d2354f01f1264d7b0e37072d10df2af40d9fb9f18dcf2dd6bfeda09" Mar 08 22:12:32 crc kubenswrapper[4885]: I0308 22:12:32.818199 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:12:32 crc kubenswrapper[4885]: I0308 22:12:32.819025 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:13:02 crc kubenswrapper[4885]: I0308 22:13:02.818217 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:13:02 crc kubenswrapper[4885]: I0308 22:13:02.818897 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:13:02 crc kubenswrapper[4885]: I0308 22:13:02.818973 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 22:13:02 crc kubenswrapper[4885]: I0308 22:13:02.819592 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 22:13:02 crc 
kubenswrapper[4885]: I0308 22:13:02.819659 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" gracePeriod=600 Mar 08 22:13:02 crc kubenswrapper[4885]: E0308 22:13:02.975194 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:13:03 crc kubenswrapper[4885]: I0308 22:13:03.083634 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" exitCode=0 Mar 08 22:13:03 crc kubenswrapper[4885]: I0308 22:13:03.083698 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7"} Mar 08 22:13:03 crc kubenswrapper[4885]: I0308 22:13:03.083753 4885 scope.go:117] "RemoveContainer" containerID="476319ccbb7b7f2c7f41cc8d4d80ef11367ce5ac8d17f701827d65c4b2326b5d" Mar 08 22:13:03 crc kubenswrapper[4885]: I0308 22:13:03.084826 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:13:03 crc kubenswrapper[4885]: E0308 22:13:03.085589 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:13:15 crc kubenswrapper[4885]: I0308 22:13:15.368807 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:13:15 crc kubenswrapper[4885]: E0308 22:13:15.369480 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:13:17 crc kubenswrapper[4885]: I0308 22:13:17.685091 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_55b083d5-789c-424a-8e11-f5e2e4bc51b0/init-config-reloader/0.log" Mar 08 22:13:17 crc kubenswrapper[4885]: I0308 22:13:17.907330 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_55b083d5-789c-424a-8e11-f5e2e4bc51b0/alertmanager/0.log" Mar 08 22:13:17 crc kubenswrapper[4885]: I0308 22:13:17.961182 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_55b083d5-789c-424a-8e11-f5e2e4bc51b0/init-config-reloader/0.log" Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.000694 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_55b083d5-789c-424a-8e11-f5e2e4bc51b0/config-reloader/0.log" Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.224468 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_737065cc-3153-4e0c-b4ee-4ad587c8d494/aodh-api/0.log" Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.236149 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_737065cc-3153-4e0c-b4ee-4ad587c8d494/aodh-evaluator/0.log" Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.324419 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_737065cc-3153-4e0c-b4ee-4ad587c8d494/aodh-listener/0.log" Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.466351 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_737065cc-3153-4e0c-b4ee-4ad587c8d494/aodh-notifier/0.log" Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.483031 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-8469b78fd4-9xh8z_ebfd95bc-213c-417c-8dd5-b66637bd98e9/barbican-api/0.log" Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.549394 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-8469b78fd4-9xh8z_ebfd95bc-213c-417c-8dd5-b66637bd98e9/barbican-api-log/0.log" Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.755949 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c754cdbdb-h7rpz_a8bdb095-595a-458e-870f-41fea2999d18/barbican-keystone-listener-log/0.log" Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.759293 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-c754cdbdb-h7rpz_a8bdb095-595a-458e-870f-41fea2999d18/barbican-keystone-listener/0.log" Mar 08 22:13:18 crc kubenswrapper[4885]: I0308 22:13:18.924525 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-755df7d9d5-kl4vq_b7b24c26-4c9a-4442-a124-a66987404ec8/barbican-worker/0.log" Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.019563 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-755df7d9d5-kl4vq_b7b24c26-4c9a-4442-a124-a66987404ec8/barbican-worker-log/0.log" Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.087210 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-bcvmz_51b71742-3986-42a4-a016-eeecb3a7ba16/bootstrap-openstack-openstack-cell1/0.log" Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.231177 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_661d1124-50bd-4ad4-95a4-ac90994383b3/ceilometer-central-agent/0.log" Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.299833 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_661d1124-50bd-4ad4-95a4-ac90994383b3/proxy-httpd/0.log" Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.301847 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_661d1124-50bd-4ad4-95a4-ac90994383b3/ceilometer-notification-agent/0.log" Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.456229 4885 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_661d1124-50bd-4ad4-95a4-ac90994383b3/sg-core/0.log" Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.520519 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-jr59q_9ef426ef-0010-4b6f-8b94-b45e726c2f02/ceph-client-openstack-openstack-cell1/0.log" Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.713059 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_216289ea-1f99-4924-aa6b-9951b3b3840e/cinder-api/0.log" Mar 08 22:13:19 crc kubenswrapper[4885]: I0308 22:13:19.758595 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_216289ea-1f99-4924-aa6b-9951b3b3840e/cinder-api-log/0.log" Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.001483 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_5eb198a5-6241-48b0-bc8c-57ad764a1f3b/probe/0.log" Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.040532 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_5eb198a5-6241-48b0-bc8c-57ad764a1f3b/cinder-backup/0.log" Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.086953 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7620c463-ffe0-4d70-ba82-deaef34da248/cinder-scheduler/0.log" Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.242707 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7620c463-ffe0-4d70-ba82-deaef34da248/probe/0.log" Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.335002 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_954ec951-d955-4335-93bb-d43e59408ae3/cinder-volume/0.log" Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.406214 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_954ec951-d955-4335-93bb-d43e59408ae3/probe/0.log" Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.560374 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-s5bq4_cd7ac915-62c8-4d95-96a3-899c245e685c/configure-network-openstack-openstack-cell1/0.log" Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.658783 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-9wz6s_eaa3ffc5-e09f-48b4-96b2-e2454bfe6251/configure-os-openstack-openstack-cell1/0.log" Mar 08 22:13:20 crc kubenswrapper[4885]: I0308 22:13:20.841101 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-579b4494b9-nwf4n_cb658095-55a6-4c1a-a84b-23ad21d14212/init/0.log" Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.026167 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-579b4494b9-nwf4n_cb658095-55a6-4c1a-a84b-23ad21d14212/init/0.log" Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.057101 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-wl2gp_d2786842-7b37-4e0c-843e-9dc4467df6ad/download-cache-openstack-openstack-cell1/0.log" Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.071521 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-579b4494b9-nwf4n_cb658095-55a6-4c1a-a84b-23ad21d14212/dnsmasq-dns/0.log" Mar 08 22:13:21 
crc kubenswrapper[4885]: I0308 22:13:21.304391 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cd58de31-5f82-4acb-8713-397027fbae4f/glance-httpd/0.log" Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.317546 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cd58de31-5f82-4acb-8713-397027fbae4f/glance-log/0.log" Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.495714 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c1efb870-06f3-40b8-baca-e418a034eaed/glance-httpd/0.log" Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.532705 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_c1efb870-06f3-40b8-baca-e418a034eaed/glance-log/0.log" Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.692421 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-796f99d566-r2p9d_78788c18-3ce2-4e27-841d-e7d380fbab71/heat-api/0.log" Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.852304 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-fffd5d5b8-82pm2_979b34eb-586a-4d86-8e2d-7937614c714a/heat-cfnapi/0.log" Mar 08 22:13:21 crc kubenswrapper[4885]: I0308 22:13:21.864054 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-595686fb49-hx4rx_9c41cdd1-29dd-4252-b988-1efaeed01573/heat-engine/0.log" Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.055118 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5b7cfb69fc-bhpx4_f24559d3-3f44-434a-b790-32c52475d532/horizon-log/0.log" Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.075728 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-5b7cfb69fc-bhpx4_f24559d3-3f44-434a-b790-32c52475d532/horizon/0.log" Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.091816 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-8t465_125b54e2-cc1e-4a7f-83b6-1474e89bad11/install-certs-openstack-openstack-cell1/0.log" Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.234517 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-mj45h_dcf02d39-6fe8-40ae-bd31-b7d1a38103b4/install-os-openstack-openstack-cell1/0.log" Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.367793 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-868b8c986d-gxm79_65bf82e2-5440-45b2-b1ff-1f6998ce46f8/keystone-api/0.log" Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.694413 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29550061-rh8tm_7e397c25-ae37-4c30-83ce-3bdb83f5b9c5/keystone-cron/0.log" Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.756810 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29550121-m29ck_8bd4a79d-8d75-4e13-8eee-cc51925ca7fd/keystone-cron/0.log" Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.822713 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_d54c8104-6382-4373-a672-8e2ac804ebba/kube-state-metrics/0.log" Mar 08 22:13:22 crc kubenswrapper[4885]: I0308 22:13:22.981878 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-rd94l_992d3500-f892-42c6-805f-ae9c96793d0f/libvirt-openstack-openstack-cell1/0.log" Mar 08 22:13:23 crc kubenswrapper[4885]: I0308 22:13:23.095809 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_6cffb553-3b2f-404c-a7da-d481d4635cfc/manila-api-log/0.log" Mar 08 22:13:23 crc kubenswrapper[4885]: I0308 22:13:23.122201 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_6cffb553-3b2f-404c-a7da-d481d4635cfc/manila-api/0.log" Mar 08 22:13:23 crc kubenswrapper[4885]: I0308 22:13:23.190469 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_637ae9d4-1fa5-48e0-87d7-5f6004e0352d/probe/0.log" Mar 08 22:13:23 crc kubenswrapper[4885]: I0308 22:13:23.252018 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_637ae9d4-1fa5-48e0-87d7-5f6004e0352d/manila-scheduler/0.log" Mar 08 22:13:23 crc kubenswrapper[4885]: I0308 22:13:23.332664 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_c0142342-e857-4238-b442-8e06ceb406e1/manila-share/0.log" Mar 08 22:13:23 crc kubenswrapper[4885]: I0308 22:13:23.434463 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_c0142342-e857-4238-b442-8e06ceb406e1/probe/0.log" Mar 08 22:13:24 crc kubenswrapper[4885]: I0308 22:13:24.530410 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-qt4fx_b7724d79-7a13-4c03-ae7c-a49a7cd5f9d9/neutron-dhcp-openstack-openstack-cell1/0.log" Mar 08 22:13:24 crc kubenswrapper[4885]: I0308 22:13:24.602244 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67c4b97569-rrjw7_68cdeb73-eb92-4d18-8a9f-a5e3a0a53900/neutron-httpd/0.log" Mar 08 22:13:24 crc kubenswrapper[4885]: I0308 22:13:24.638641 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67c4b97569-rrjw7_68cdeb73-eb92-4d18-8a9f-a5e3a0a53900/neutron-api/0.log" Mar 08 22:13:24 crc kubenswrapper[4885]: I0308 22:13:24.809843 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-hwhq9_12740b7f-a6a2-45e2-a288-fbb880a2c72b/neutron-metadata-openstack-openstack-cell1/0.log" Mar 08 22:13:24 crc kubenswrapper[4885]: I0308 22:13:24.863582 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-f6sk8_13f318e2-a78d-497f-bfbc-4c60d9156220/neutron-sriov-openstack-openstack-cell1/0.log" Mar 08 22:13:25 crc kubenswrapper[4885]: I0308 22:13:25.088243 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_afd37ef2-90bf-4ea4-86a1-2113a005824e/nova-api-api/0.log" Mar 08 22:13:25 crc kubenswrapper[4885]: I0308 22:13:25.190531 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_afd37ef2-90bf-4ea4-86a1-2113a005824e/nova-api-log/0.log" Mar 08 22:13:25 crc kubenswrapper[4885]: I0308 22:13:25.346653 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_3534e95e-b33c-4294-98d0-f758ea92cf72/nova-cell0-conductor-conductor/0.log" Mar 08 22:13:25 crc kubenswrapper[4885]: I0308 22:13:25.495201 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_390628a6-50b8-491e-bc5d-80a524b67be6/nova-cell1-conductor-conductor/0.log" Mar 08 22:13:25 crc 
kubenswrapper[4885]: I0308 22:13:25.708507 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b55b7c8a-8888-43e1-a593-d3a1f00cba4c/nova-cell1-novncproxy-novncproxy/0.log" Mar 08 22:13:26 crc kubenswrapper[4885]: I0308 22:13:26.191881 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellx86rx_0ac2d268-855a-485e-a96f-87b5cc0e4f6e/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Mar 08 22:13:26 crc kubenswrapper[4885]: I0308 22:13:26.309438 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-thkw7_aecb4202-1208-4ba5-8515-2ecf99c8c7d1/nova-cell1-openstack-openstack-cell1/0.log" Mar 08 22:13:26 crc kubenswrapper[4885]: I0308 22:13:26.562680 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a2d37f17-3e80-43b1-b6e3-df2316900973/nova-metadata-log/0.log" Mar 08 22:13:26 crc kubenswrapper[4885]: I0308 22:13:26.573034 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a2d37f17-3e80-43b1-b6e3-df2316900973/nova-metadata-metadata/0.log" Mar 08 22:13:26 crc kubenswrapper[4885]: I0308 22:13:26.710418 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_199a5a0a-f05c-4e06-9bee-2a5d0303f3a0/nova-scheduler-scheduler/0.log" Mar 08 22:13:26 crc kubenswrapper[4885]: I0308 22:13:26.773668 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-754cf98f97-rw6hg_4fef7207-0a04-4fb4-af9e-d9efcd13226f/init/0.log" Mar 08 22:13:27 crc kubenswrapper[4885]: I0308 22:13:27.067853 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-754cf98f97-rw6hg_4fef7207-0a04-4fb4-af9e-d9efcd13226f/octavia-api-provider-agent/0.log" Mar 08 22:13:27 crc kubenswrapper[4885]: I0308 22:13:27.079667 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-754cf98f97-rw6hg_4fef7207-0a04-4fb4-af9e-d9efcd13226f/init/0.log" Mar 08 22:13:27 crc kubenswrapper[4885]: I0308 22:13:27.308588 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-8t2fl_9d4d983f-9ee9-4341-bf69-0c2fc610a2d6/init/0.log" Mar 08 22:13:27 crc kubenswrapper[4885]: I0308 22:13:27.319832 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-754cf98f97-rw6hg_4fef7207-0a04-4fb4-af9e-d9efcd13226f/octavia-api/0.log" Mar 08 22:13:27 crc kubenswrapper[4885]: I0308 22:13:27.774694 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-8t2fl_9d4d983f-9ee9-4341-bf69-0c2fc610a2d6/init/0.log" Mar 08 22:13:27 crc kubenswrapper[4885]: I0308 22:13:27.846442 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-8t2fl_9d4d983f-9ee9-4341-bf69-0c2fc610a2d6/octavia-healthmanager/0.log" Mar 08 22:13:27 crc kubenswrapper[4885]: I0308 22:13:27.868700 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-pchrs_c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8/init/0.log" Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.124290 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-pchrs_c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8/init/0.log" Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.218557 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_octavia-housekeeping-pchrs_c3f615a5-fb38-42e4-ba88-3da8bbd6c5a8/octavia-housekeeping/0.log" Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.282323 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-6f5964dbc9-msqj4_6b1c35f0-ed5f-411a-a0ec-1270fd04e266/init/0.log" Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.367839 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:13:28 crc kubenswrapper[4885]: E0308 22:13:28.368106 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.406777 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-6f5964dbc9-msqj4_6b1c35f0-ed5f-411a-a0ec-1270fd04e266/octavia-amphora-httpd/0.log" Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.411884 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-image-upload-6f5964dbc9-msqj4_6b1c35f0-ed5f-411a-a0ec-1270fd04e266/init/0.log" Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.502844 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-b8ndv_956e4845-c662-402d-adb6-b05143af6570/init/0.log" Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.724468 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-b8ndv_956e4845-c662-402d-adb6-b05143af6570/init/0.log" Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.783170 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-b8ndv_956e4845-c662-402d-adb6-b05143af6570/octavia-rsyslog/0.log" Mar 08 22:13:28 crc kubenswrapper[4885]: I0308 22:13:28.891634 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-8847z_3c0bddea-5630-4e74-8bc9-ec81fc3eba56/init/0.log" Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.167810 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-8847z_3c0bddea-5630-4e74-8bc9-ec81fc3eba56/init/0.log" Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.231874 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31392f16-4aaa-4512-982e-0c56d9af8200/mysql-bootstrap/0.log" Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.268622 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-8847z_3c0bddea-5630-4e74-8bc9-ec81fc3eba56/octavia-worker/0.log" Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.408870 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31392f16-4aaa-4512-982e-0c56d9af8200/mysql-bootstrap/0.log" Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.416461 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_31392f16-4aaa-4512-982e-0c56d9af8200/galera/0.log" Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.534602 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_1fcbde6c-f104-4c3b-9937-24728ac572a8/mysql-bootstrap/0.log" Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.803822 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1fcbde6c-f104-4c3b-9937-24728ac572a8/galera/0.log" Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.810895 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_1fcbde6c-f104-4c3b-9937-24728ac572a8/mysql-bootstrap/0.log" Mar 08 22:13:29 crc kubenswrapper[4885]: I0308 22:13:29.814668 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_beb866d8-13cb-4dd6-9ce8-a2dad0935453/openstackclient/0.log" Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.017183 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-5jmft_00348ab8-7686-4e8d-bada-3d9e32edca19/ovn-controller/0.log" Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.115472 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-rzmvz_2db23198-8297-4e77-aed3-78ca89d5e6f8/openstack-network-exporter/0.log" Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.278803 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b6j88_f9fbe86b-d12b-4122-93b5-4cd373fca82b/ovsdb-server-init/0.log" Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.530784 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b6j88_f9fbe86b-d12b-4122-93b5-4cd373fca82b/ovsdb-server-init/0.log" Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.531207 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b6j88_f9fbe86b-d12b-4122-93b5-4cd373fca82b/ovsdb-server/0.log" Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.534754 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-b6j88_f9fbe86b-d12b-4122-93b5-4cd373fca82b/ovs-vswitchd/0.log" Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.712564 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e81fc01-0a65-4956-9ba5-26ec5f7c25c9/ovn-northd/0.log" Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.746438 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_3e81fc01-0a65-4956-9ba5-26ec5f7c25c9/openstack-network-exporter/0.log" Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.887150 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-mz9gg_ba3efa94-310a-4c53-ac95-2444759b8574/ovn-openstack-openstack-cell1/0.log" Mar 08 22:13:30 crc kubenswrapper[4885]: I0308 22:13:30.989908 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ebd9461e-0196-4eaf-a733-44340b19d354/openstack-network-exporter/0.log" Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.061110 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_ebd9461e-0196-4eaf-a733-44340b19d354/ovsdbserver-nb/0.log" Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.203183 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_d605915b-24f4-45ec-bb13-7e7097bb288b/ovsdbserver-nb/0.log" Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.263475 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-1_d605915b-24f4-45ec-bb13-7e7097bb288b/openstack-network-exporter/0.log" Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.398155 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_4eca16e2-6962-4cad-9cbb-23d33af9c10a/ovsdbserver-nb/0.log" Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.413167 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_4eca16e2-6962-4cad-9cbb-23d33af9c10a/openstack-network-exporter/0.log" Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.546886 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2c0a1292-7594-49d3-b3f0-2e1a6aa004e2/openstack-network-exporter/0.log" Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.630900 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2c0a1292-7594-49d3-b3f0-2e1a6aa004e2/ovsdbserver-sb/0.log" Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.801431 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_1cdde225-3478-4566-9019-df846ce962fb/ovsdbserver-sb/0.log" Mar 08 22:13:31 crc kubenswrapper[4885]: I0308 22:13:31.805810 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_1cdde225-3478-4566-9019-df846ce962fb/openstack-network-exporter/0.log" Mar 08 22:13:32 crc kubenswrapper[4885]: I0308 22:13:32.050901 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_11852e05-e4cd-4884-b382-035694906263/ovsdbserver-sb/0.log" Mar 08 22:13:32 crc kubenswrapper[4885]: I0308 22:13:32.068793 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_11852e05-e4cd-4884-b382-035694906263/openstack-network-exporter/0.log" Mar 08 22:13:32 crc kubenswrapper[4885]: I0308 22:13:32.574979 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-cqhnbt_87894214-b974-4fc7-b23d-d739fde2466f/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Mar 08 22:13:32 crc kubenswrapper[4885]: I0308 22:13:32.678786 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5bc985567d-hcdbz_0741bee5-7932-4af4-a8c1-1e56b754e359/placement-api/0.log" Mar 08 22:13:32 crc kubenswrapper[4885]: I0308 22:13:32.740156 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-5bc985567d-hcdbz_0741bee5-7932-4af4-a8c1-1e56b754e359/placement-log/0.log" Mar 08 22:13:32 crc kubenswrapper[4885]: I0308 22:13:32.920488 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_efd2a302-0f57-40e0-9a28-0a1cdfabfc5e/init-config-reloader/0.log" Mar 08 22:13:33 crc kubenswrapper[4885]: I0308 22:13:33.056496 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_efd2a302-0f57-40e0-9a28-0a1cdfabfc5e/init-config-reloader/0.log" Mar 08 22:13:33 crc kubenswrapper[4885]: I0308 22:13:33.165838 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_efd2a302-0f57-40e0-9a28-0a1cdfabfc5e/config-reloader/0.log" Mar 08 22:13:33 crc kubenswrapper[4885]: I0308 22:13:33.166032 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_efd2a302-0f57-40e0-9a28-0a1cdfabfc5e/thanos-sidecar/0.log" Mar 08 22:13:33 crc 
kubenswrapper[4885]: I0308 22:13:33.181625 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_efd2a302-0f57-40e0-9a28-0a1cdfabfc5e/prometheus/0.log" Mar 08 22:13:33 crc kubenswrapper[4885]: I0308 22:13:33.378298 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f0d39294-b81d-4534-b86a-35a3aea74ed7/setup-container/0.log" Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.226201 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c0cb204d-b6bd-417e-9b6f-6a0c7faf4820/memcached/0.log" Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.251421 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f0d39294-b81d-4534-b86a-35a3aea74ed7/rabbitmq/0.log" Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.266433 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_f0d39294-b81d-4534-b86a-35a3aea74ed7/setup-container/0.log" Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.304588 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_00704ff6-696f-4687-99e0-23bf055d1bef/setup-container/0.log" Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.494529 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-k564w_cf062e16-c6d1-4d3c-b0aa-ca00e9740bcb/reboot-os-openstack-openstack-cell1/0.log" Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.544104 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_00704ff6-696f-4687-99e0-23bf055d1bef/rabbitmq/0.log" Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.566981 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_00704ff6-696f-4687-99e0-23bf055d1bef/setup-container/0.log" Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.755222 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-fg8lj_53bb70ab-feea-49a2-9850-fc72a2e0f650/run-os-openstack-openstack-cell1/0.log" Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.843505 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-khtfv_bef3d518-c413-4129-b022-dffb097239b2/ssh-known-hosts-openstack/0.log" Mar 08 22:13:34 crc kubenswrapper[4885]: I0308 22:13:34.984480 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-9mchk_5583daa6-0c35-4fde-8580-2a4d7ccbfb17/telemetry-openstack-openstack-cell1/0.log" Mar 08 22:13:35 crc kubenswrapper[4885]: I0308 22:13:35.152455 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-nnnc6_8be575f8-a741-4b5a-b7fa-c43e5dd65598/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Mar 08 22:13:35 crc kubenswrapper[4885]: I0308 22:13:35.177483 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-lcz5l_df77d68a-3570-49fb-958b-c358543e661f/validate-network-openstack-openstack-cell1/0.log" Mar 08 22:13:43 crc kubenswrapper[4885]: I0308 22:13:43.373428 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:13:43 crc kubenswrapper[4885]: E0308 22:13:43.374204 4885 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:13:55 crc kubenswrapper[4885]: I0308 22:13:55.367908 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:13:55 crc kubenswrapper[4885]: E0308 22:13:55.368506 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.147361 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550134-24s4t"] Mar 08 22:14:00 crc kubenswrapper[4885]: E0308 22:14:00.148490 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5350f846-ee1f-400b-8579-de1a56050f02" containerName="oc" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.148507 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="5350f846-ee1f-400b-8579-de1a56050f02" containerName="oc" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.148791 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="5350f846-ee1f-400b-8579-de1a56050f02" containerName="oc" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.149806 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550134-24s4t" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.151889 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.151998 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.152138 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.165483 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550134-24s4t"] Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.278863 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqq2b\" (UniqueName: \"kubernetes.io/projected/2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3-kube-api-access-sqq2b\") pod \"auto-csr-approver-29550134-24s4t\" (UID: \"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3\") " pod="openshift-infra/auto-csr-approver-29550134-24s4t" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.381331 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqq2b\" (UniqueName: \"kubernetes.io/projected/2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3-kube-api-access-sqq2b\") pod \"auto-csr-approver-29550134-24s4t\" (UID: \"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3\") " pod="openshift-infra/auto-csr-approver-29550134-24s4t" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.400749 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqq2b\" (UniqueName: \"kubernetes.io/projected/2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3-kube-api-access-sqq2b\") pod \"auto-csr-approver-29550134-24s4t\" (UID: \"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3\") " pod="openshift-infra/auto-csr-approver-29550134-24s4t" Mar 08 22:14:00 crc kubenswrapper[4885]: I0308 22:14:00.469459 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550134-24s4t" Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.024486 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550134-24s4t"] Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.181213 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb_5fb2cd81-437a-46be-93b5-b96ec94b1d1c/util/0.log" Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.409444 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb_5fb2cd81-437a-46be-93b5-b96ec94b1d1c/util/0.log" Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.432111 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb_5fb2cd81-437a-46be-93b5-b96ec94b1d1c/pull/0.log" Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.464244 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb_5fb2cd81-437a-46be-93b5-b96ec94b1d1c/pull/0.log" Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.618972 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb_5fb2cd81-437a-46be-93b5-b96ec94b1d1c/util/0.log" Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.620188 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb_5fb2cd81-437a-46be-93b5-b96ec94b1d1c/pull/0.log" Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.648155 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99zqsqb_5fb2cd81-437a-46be-93b5-b96ec94b1d1c/extract/0.log" Mar 08 22:14:01 crc kubenswrapper[4885]: I0308 22:14:01.754402 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550134-24s4t" event={"ID":"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3","Type":"ContainerStarted","Data":"ac91779228759ac5b56b689ad5065661850f5c58ec91efc057ca78d4de929bef"} Mar 08 22:14:02 crc kubenswrapper[4885]: I0308 22:14:02.142438 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-sbrjr_45c29030-0945-4655-b035-d75e8bf0f818/manager/0.log" Mar 08 22:14:02 crc kubenswrapper[4885]: I0308 22:14:02.604593 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-4hstb_69dc5eb7-1c2e-4fbb-a220-2129df60ffb3/manager/0.log" Mar 08 22:14:02 crc kubenswrapper[4885]: I0308 22:14:02.699529 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-n88vz_d5770638-6059-4ce5-b401-84b0155589a3/manager/0.log" Mar 08 22:14:02 crc kubenswrapper[4885]: I0308 22:14:02.789526 4885 generic.go:334] "Generic (PLEG): container finished" podID="2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3" containerID="a08f299dbb605791440e3498bfef15260ed7b91b31657f734e3989c456d8ee4c" exitCode=0 Mar 08 22:14:02 crc kubenswrapper[4885]: I0308 22:14:02.789576 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29550134-24s4t" event={"ID":"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3","Type":"ContainerDied","Data":"a08f299dbb605791440e3498bfef15260ed7b91b31657f734e3989c456d8ee4c"} Mar 08 22:14:02 crc kubenswrapper[4885]: I0308 22:14:02.992166 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-xplpw_4742ab81-6c6d-43c8-8025-6a656b8c40dc/manager/0.log" Mar 08 22:14:03 crc kubenswrapper[4885]: I0308 22:14:03.542835 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-nclkr_7180efa7-8d93-436e-8de2-78fe5c173843/manager/0.log" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.306217 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-hlpjf_157555d5-ca64-49f8-8849-cd763c83feda/manager/0.log" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.358627 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-vf24d_9fc40f07-4706-4008-b86e-e73a2f2ab620/manager/0.log" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.428246 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-rplg5_92716f38-db4c-41d9-962d-f3cc2669a7fb/manager/0.log" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.467510 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550134-24s4t" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.585639 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqq2b\" (UniqueName: \"kubernetes.io/projected/2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3-kube-api-access-sqq2b\") pod \"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3\" (UID: \"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3\") " Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.599172 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3-kube-api-access-sqq2b" (OuterVolumeSpecName: "kube-api-access-sqq2b") pod "2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3" (UID: "2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3"). InnerVolumeSpecName "kube-api-access-sqq2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.687513 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqq2b\" (UniqueName: \"kubernetes.io/projected/2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3-kube-api-access-sqq2b\") on node \"crc\" DevicePath \"\"" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.740118 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-q5hfb_27aa3877-54cd-414d-80a0-ab20a68ed535/manager/0.log" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.808559 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550134-24s4t" event={"ID":"2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3","Type":"ContainerDied","Data":"ac91779228759ac5b56b689ad5065661850f5c58ec91efc057ca78d4de929bef"} Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.808608 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac91779228759ac5b56b689ad5065661850f5c58ec91efc057ca78d4de929bef" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.808667 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550134-24s4t" Mar 08 22:14:04 crc kubenswrapper[4885]: I0308 22:14:04.970302 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-2hsgc_8f363429-f2b7-468c-b74b-ef14ebfab90e/manager/0.log" Mar 08 22:14:05 crc kubenswrapper[4885]: I0308 22:14:05.130874 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-p8r6f_392750e0-9d71-418d-89b0-ec10f33ec505/manager/0.log" Mar 08 22:14:05 crc kubenswrapper[4885]: I0308 22:14:05.480501 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-7vtx7_7c05f3ed-fe8f-47db-b596-8b90b96c295c/manager/0.log" Mar 08 22:14:05 crc kubenswrapper[4885]: I0308 22:14:05.576730 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550128-4vqrp"] Mar 08 22:14:05 crc kubenswrapper[4885]: I0308 22:14:05.593042 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550128-4vqrp"] Mar 08 22:14:05 crc kubenswrapper[4885]: I0308 22:14:05.693937 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-k4r6w_bbb8966a-e61f-427d-af2a-0fdab2348d03/manager/0.log" Mar 08 22:14:05 crc kubenswrapper[4885]: I0308 22:14:05.748657 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-f9jr4_5f89ecdd-60c3-4da6-b185-1f044d8ffc46/manager/0.log" Mar 08 22:14:05 crc kubenswrapper[4885]: I0308 22:14:05.759246 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-dc6dbbbd-wkwc7_d8de7df0-2dea-4d3c-a02e-57bfabade82f/manager/0.log" Mar 08 22:14:06 crc kubenswrapper[4885]: I0308 22:14:06.124252 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6f44f7b99f-l5vj4_9acb4d66-3a49-42b7-bd78-4d904f080c50/operator/0.log" Mar 08 22:14:06 crc kubenswrapper[4885]: I0308 22:14:06.210129 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-w4b99_024a1da8-dfa6-4cdc-a5ec-12b9ce56969a/registry-server/0.log" Mar 08 22:14:06 crc kubenswrapper[4885]: I0308 22:14:06.529946 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-4gfw2_8d086566-6154-4ddd-8028-a9c203cfec11/manager/0.log" Mar 08 22:14:06 crc kubenswrapper[4885]: I0308 22:14:06.651072 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-wdrfh_44fbac8d-d81f-4c03-9555-ef33551d478d/manager/0.log" Mar 08 22:14:06 crc kubenswrapper[4885]: I0308 22:14:06.786991 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-pd9b2_a8caa87f-832f-4436-beaa-aaa505de3bac/operator/0.log" Mar 08 22:14:06 crc kubenswrapper[4885]: I0308 22:14:06.884912 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-7hgld_d9580392-741e-406b-b72d-91aa945f65c2/manager/0.log" Mar 08 22:14:07 crc kubenswrapper[4885]: I0308 22:14:07.147050 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-xf4hm_ea5acc0f-2ad8-46d5-80a2-502e2900fdd6/manager/0.log" Mar 08 22:14:07 crc kubenswrapper[4885]: I0308 22:14:07.177712 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5fdb694969-7mghs_c9e4c6a7-96a6-4ea8-8fd7-aa56d096e65b/manager/0.log" Mar 08 22:14:07 crc kubenswrapper[4885]: I0308 22:14:07.352401 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-66zgf_d5136d34-82a8-47c5-9d7d-09e0206587e8/manager/0.log" Mar 08 22:14:07 crc kubenswrapper[4885]: I0308 22:14:07.384878 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16108583-f398-4571-9e1c-41d86a071331" path="/var/lib/kubelet/pods/16108583-f398-4571-9e1c-41d86a071331/volumes" Mar 08 22:14:08 crc kubenswrapper[4885]: I0308 22:14:08.018918 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7dfcb4d64f-pzg95_deedb14e-007e-44eb-bd52-85bbc12d0bec/manager/0.log" Mar 08 22:14:09 crc kubenswrapper[4885]: I0308 22:14:09.377118 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:14:09 crc kubenswrapper[4885]: E0308 22:14:09.377371 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:14:15 crc kubenswrapper[4885]: I0308 22:14:15.846872 4885 scope.go:117] "RemoveContainer" containerID="359909f363078610b8800c0c14bcbb9a70bea6cbeb9a5e7e55a65cb5c9ec4e4c" Mar 08 22:14:20 crc kubenswrapper[4885]: I0308 22:14:20.368266 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:14:20 crc kubenswrapper[4885]: E0308 22:14:20.371339 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:14:29 crc kubenswrapper[4885]: I0308 22:14:29.974064 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-k2rwt_fe3a8c81-8c1d-4b38-9cae-813fb749fd43/control-plane-machine-set-operator/0.log" Mar 08 22:14:30 crc kubenswrapper[4885]: I0308 22:14:30.169160 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cpx85_175c50f5-857d-4697-bcde-2ce47f2edfc5/kube-rbac-proxy/0.log" Mar 08 22:14:30 crc kubenswrapper[4885]: I0308 22:14:30.236864 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-cpx85_175c50f5-857d-4697-bcde-2ce47f2edfc5/machine-api-operator/0.log" Mar 08 22:14:31 crc kubenswrapper[4885]: I0308 22:14:31.368240 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:14:31 crc kubenswrapper[4885]: E0308 22:14:31.368939 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:14:43 crc kubenswrapper[4885]: I0308 22:14:43.368360 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:14:43 crc kubenswrapper[4885]: E0308 22:14:43.369768 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:14:46 crc kubenswrapper[4885]: I0308 22:14:46.757373 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-8wbq2_6da97aa0-4c69-414f-8fda-23403d2346e5/cert-manager-controller/0.log" Mar 08 22:14:46 crc kubenswrapper[4885]: I0308 22:14:46.915075 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-fbnvg_d62feb91-9474-41c0-b79c-93f3f6dd830b/cert-manager-cainjector/0.log" Mar 08 22:14:47 crc kubenswrapper[4885]: I0308 22:14:47.021734 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-kgm5k_de1b5c94-7518-46c5-af4a-2b692d23b3b7/cert-manager-webhook/0.log" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.314840 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gxvs5"] Mar 08 22:14:51 crc kubenswrapper[4885]: E0308 22:14:51.316002 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3" 
containerName="oc" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.316122 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3" containerName="oc" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.316422 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3" containerName="oc" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.318497 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.335187 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gxvs5"] Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.423605 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ntsq\" (UniqueName: \"kubernetes.io/projected/52401772-10fd-464c-bb40-dceaaca564db-kube-api-access-5ntsq\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.423788 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-utilities\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.423869 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-catalog-content\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.526317 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-utilities\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.526403 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-catalog-content\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.526489 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ntsq\" (UniqueName: \"kubernetes.io/projected/52401772-10fd-464c-bb40-dceaaca564db-kube-api-access-5ntsq\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.526872 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-utilities\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc 
kubenswrapper[4885]: I0308 22:14:51.527043 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-catalog-content\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.560183 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ntsq\" (UniqueName: \"kubernetes.io/projected/52401772-10fd-464c-bb40-dceaaca564db-kube-api-access-5ntsq\") pod \"redhat-operators-gxvs5\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:51 crc kubenswrapper[4885]: I0308 22:14:51.650414 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:14:52 crc kubenswrapper[4885]: I0308 22:14:52.157446 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gxvs5"] Mar 08 22:14:52 crc kubenswrapper[4885]: I0308 22:14:52.307793 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvs5" event={"ID":"52401772-10fd-464c-bb40-dceaaca564db","Type":"ContainerStarted","Data":"410526b2d3e9b4d6a551a2823b5579cf7d0ec2f78e2d4b1b12bd289e95ff9e5f"} Mar 08 22:14:53 crc kubenswrapper[4885]: I0308 22:14:53.322657 4885 generic.go:334] "Generic (PLEG): container finished" podID="52401772-10fd-464c-bb40-dceaaca564db" containerID="e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d" exitCode=0 Mar 08 22:14:53 crc kubenswrapper[4885]: I0308 22:14:53.322738 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvs5" event={"ID":"52401772-10fd-464c-bb40-dceaaca564db","Type":"ContainerDied","Data":"e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d"} Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.504694 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9xscz"] Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.508151 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.519485 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9xscz"] Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.606013 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-utilities\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.606159 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-catalog-content\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.606284 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4xng\" (UniqueName: \"kubernetes.io/projected/19f49bd2-97c7-4446-9814-5c5788b65342-kube-api-access-x4xng\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.707933 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4xng\" (UniqueName: \"kubernetes.io/projected/19f49bd2-97c7-4446-9814-5c5788b65342-kube-api-access-x4xng\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.708072 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-utilities\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.708140 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-catalog-content\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.708595 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-catalog-content\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.708625 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-utilities\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.730839 4885 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-x4xng\" (UniqueName: \"kubernetes.io/projected/19f49bd2-97c7-4446-9814-5c5788b65342-kube-api-access-x4xng\") pod \"redhat-marketplace-9xscz\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:54 crc kubenswrapper[4885]: I0308 22:14:54.833382 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:14:55 crc kubenswrapper[4885]: I0308 22:14:55.317898 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9xscz"] Mar 08 22:14:55 crc kubenswrapper[4885]: W0308 22:14:55.320190 4885 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19f49bd2_97c7_4446_9814_5c5788b65342.slice/crio-cbefe58057fc835b715eeffbc91f9f0c63d91b9e015b2ca024e66e769f47a64e WatchSource:0}: Error finding container cbefe58057fc835b715eeffbc91f9f0c63d91b9e015b2ca024e66e769f47a64e: Status 404 returned error can't find the container with id cbefe58057fc835b715eeffbc91f9f0c63d91b9e015b2ca024e66e769f47a64e Mar 08 22:14:55 crc kubenswrapper[4885]: I0308 22:14:55.346642 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xscz" event={"ID":"19f49bd2-97c7-4446-9814-5c5788b65342","Type":"ContainerStarted","Data":"cbefe58057fc835b715eeffbc91f9f0c63d91b9e015b2ca024e66e769f47a64e"} Mar 08 22:14:55 crc kubenswrapper[4885]: I0308 22:14:55.348702 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvs5" event={"ID":"52401772-10fd-464c-bb40-dceaaca564db","Type":"ContainerStarted","Data":"360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728"} Mar 08 22:14:56 crc kubenswrapper[4885]: I0308 22:14:56.366155 4885 generic.go:334] "Generic (PLEG): container finished" podID="19f49bd2-97c7-4446-9814-5c5788b65342" containerID="b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515" exitCode=0 Mar 08 22:14:56 crc kubenswrapper[4885]: I0308 22:14:56.366429 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xscz" event={"ID":"19f49bd2-97c7-4446-9814-5c5788b65342","Type":"ContainerDied","Data":"b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515"} Mar 08 22:14:56 crc kubenswrapper[4885]: I0308 22:14:56.368022 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:14:56 crc kubenswrapper[4885]: E0308 22:14:56.370513 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:14:58 crc kubenswrapper[4885]: I0308 22:14:58.389650 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xscz" event={"ID":"19f49bd2-97c7-4446-9814-5c5788b65342","Type":"ContainerStarted","Data":"057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba"} Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.160241 4885 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq"] Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.162374 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.164917 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.164910 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.189572 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq"] Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.256071 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb0d15ae-9873-4045-96ff-f333ea013dcb-secret-volume\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.256247 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb0d15ae-9873-4045-96ff-f333ea013dcb-config-volume\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.256606 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g8wm\" (UniqueName: \"kubernetes.io/projected/bb0d15ae-9873-4045-96ff-f333ea013dcb-kube-api-access-8g8wm\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.358495 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g8wm\" (UniqueName: \"kubernetes.io/projected/bb0d15ae-9873-4045-96ff-f333ea013dcb-kube-api-access-8g8wm\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.358674 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb0d15ae-9873-4045-96ff-f333ea013dcb-secret-volume\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.358767 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb0d15ae-9873-4045-96ff-f333ea013dcb-config-volume\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.360336 
4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb0d15ae-9873-4045-96ff-f333ea013dcb-config-volume\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.384345 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb0d15ae-9873-4045-96ff-f333ea013dcb-secret-volume\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.390831 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g8wm\" (UniqueName: \"kubernetes.io/projected/bb0d15ae-9873-4045-96ff-f333ea013dcb-kube-api-access-8g8wm\") pod \"collect-profiles-29550135-cxdkq\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.423623 4885 generic.go:334] "Generic (PLEG): container finished" podID="52401772-10fd-464c-bb40-dceaaca564db" containerID="360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728" exitCode=0 Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.423700 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvs5" event={"ID":"52401772-10fd-464c-bb40-dceaaca564db","Type":"ContainerDied","Data":"360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728"} Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.425639 4885 generic.go:334] "Generic (PLEG): container finished" podID="19f49bd2-97c7-4446-9814-5c5788b65342" containerID="057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba" exitCode=0 Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.425672 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xscz" event={"ID":"19f49bd2-97c7-4446-9814-5c5788b65342","Type":"ContainerDied","Data":"057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba"} Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.497837 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:00 crc kubenswrapper[4885]: I0308 22:15:00.981512 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq"] Mar 08 22:15:02 crc kubenswrapper[4885]: I0308 22:15:02.448582 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvs5" event={"ID":"52401772-10fd-464c-bb40-dceaaca564db","Type":"ContainerStarted","Data":"6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6"} Mar 08 22:15:02 crc kubenswrapper[4885]: I0308 22:15:02.451903 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xscz" event={"ID":"19f49bd2-97c7-4446-9814-5c5788b65342","Type":"ContainerStarted","Data":"39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d"} Mar 08 22:15:02 crc kubenswrapper[4885]: I0308 22:15:02.453530 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" event={"ID":"bb0d15ae-9873-4045-96ff-f333ea013dcb","Type":"ContainerStarted","Data":"469404e34da43d4cd1a85290781b0cb6d733331a47e042aea3b888ec12952ffe"} Mar 08 22:15:02 crc kubenswrapper[4885]: I0308 22:15:02.453586 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" event={"ID":"bb0d15ae-9873-4045-96ff-f333ea013dcb","Type":"ContainerStarted","Data":"bb6e5861fa5a2a72d4a68d0f6c81188939f789e64e9c657c693aa24b9c5f3428"} Mar 08 22:15:02 crc kubenswrapper[4885]: I0308 22:15:02.477982 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gxvs5" podStartSLOduration=3.883050215 podStartE2EDuration="11.477952851s" podCreationTimestamp="2026-03-08 22:14:51 +0000 UTC" firstStartedPulling="2026-03-08 22:14:53.325448293 +0000 UTC m=+9794.721502316" lastFinishedPulling="2026-03-08 22:15:00.920350929 +0000 UTC m=+9802.316404952" observedRunningTime="2026-03-08 22:15:02.472252198 +0000 UTC m=+9803.868306221" watchObservedRunningTime="2026-03-08 22:15:02.477952851 +0000 UTC m=+9803.874006894" Mar 08 22:15:02 crc kubenswrapper[4885]: I0308 22:15:02.502454 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9xscz" podStartSLOduration=3.894324767 podStartE2EDuration="8.502438225s" podCreationTimestamp="2026-03-08 22:14:54 +0000 UTC" firstStartedPulling="2026-03-08 22:14:56.369681994 +0000 UTC m=+9797.765736027" lastFinishedPulling="2026-03-08 22:15:00.977795442 +0000 UTC m=+9802.373849485" observedRunningTime="2026-03-08 22:15:02.492736456 +0000 UTC m=+9803.888790479" watchObservedRunningTime="2026-03-08 22:15:02.502438225 +0000 UTC m=+9803.898492238" Mar 08 22:15:02 crc kubenswrapper[4885]: I0308 22:15:02.524582 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" podStartSLOduration=2.5245670049999998 podStartE2EDuration="2.524567005s" podCreationTimestamp="2026-03-08 22:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-08 22:15:02.51539745 +0000 UTC m=+9803.911451473" watchObservedRunningTime="2026-03-08 22:15:02.524567005 +0000 UTC m=+9803.920621028" Mar 08 22:15:03 crc kubenswrapper[4885]: I0308 
22:15:03.464564 4885 generic.go:334] "Generic (PLEG): container finished" podID="bb0d15ae-9873-4045-96ff-f333ea013dcb" containerID="469404e34da43d4cd1a85290781b0cb6d733331a47e042aea3b888ec12952ffe" exitCode=0 Mar 08 22:15:03 crc kubenswrapper[4885]: I0308 22:15:03.464757 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" event={"ID":"bb0d15ae-9873-4045-96ff-f333ea013dcb","Type":"ContainerDied","Data":"469404e34da43d4cd1a85290781b0cb6d733331a47e042aea3b888ec12952ffe"} Mar 08 22:15:03 crc kubenswrapper[4885]: I0308 22:15:03.947953 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-ftcgc_c548cbba-61a5-4167-b494-f57c45b1599b/nmstate-console-plugin/0.log" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.138863 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6m2b5_74d96fe5-1ab9-4703-8717-509cf115d985/nmstate-handler/0.log" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.199030 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-wsk7q_75f588d1-7159-4a94-bf89-bb18a880a403/kube-rbac-proxy/0.log" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.387103 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-wsk7q_75f588d1-7159-4a94-bf89-bb18a880a403/nmstate-metrics/0.log" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.388982 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-5twjk_02d2b43e-55f4-49f1-9bb1-3e70ed22a3da/nmstate-operator/0.log" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.619279 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-bb7k9_3793d26a-a132-40db-b8fe-2cf83428b03c/nmstate-webhook/0.log" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.834430 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.834491 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.898216 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.954561 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb0d15ae-9873-4045-96ff-f333ea013dcb-secret-volume\") pod \"bb0d15ae-9873-4045-96ff-f333ea013dcb\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.954829 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb0d15ae-9873-4045-96ff-f333ea013dcb-config-volume\") pod \"bb0d15ae-9873-4045-96ff-f333ea013dcb\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.954895 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g8wm\" (UniqueName: \"kubernetes.io/projected/bb0d15ae-9873-4045-96ff-f333ea013dcb-kube-api-access-8g8wm\") pod \"bb0d15ae-9873-4045-96ff-f333ea013dcb\" (UID: \"bb0d15ae-9873-4045-96ff-f333ea013dcb\") " Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.957651 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb0d15ae-9873-4045-96ff-f333ea013dcb-config-volume" (OuterVolumeSpecName: "config-volume") pod "bb0d15ae-9873-4045-96ff-f333ea013dcb" (UID: "bb0d15ae-9873-4045-96ff-f333ea013dcb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.962941 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb0d15ae-9873-4045-96ff-f333ea013dcb-kube-api-access-8g8wm" (OuterVolumeSpecName: "kube-api-access-8g8wm") pod "bb0d15ae-9873-4045-96ff-f333ea013dcb" (UID: "bb0d15ae-9873-4045-96ff-f333ea013dcb"). InnerVolumeSpecName "kube-api-access-8g8wm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:15:04 crc kubenswrapper[4885]: I0308 22:15:04.976191 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb0d15ae-9873-4045-96ff-f333ea013dcb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bb0d15ae-9873-4045-96ff-f333ea013dcb" (UID: "bb0d15ae-9873-4045-96ff-f333ea013dcb"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.057734 4885 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bb0d15ae-9873-4045-96ff-f333ea013dcb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.057764 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g8wm\" (UniqueName: \"kubernetes.io/projected/bb0d15ae-9873-4045-96ff-f333ea013dcb-kube-api-access-8g8wm\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.057775 4885 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bb0d15ae-9873-4045-96ff-f333ea013dcb-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.485819 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" event={"ID":"bb0d15ae-9873-4045-96ff-f333ea013dcb","Type":"ContainerDied","Data":"bb6e5861fa5a2a72d4a68d0f6c81188939f789e64e9c657c693aa24b9c5f3428"} Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.485865 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb6e5861fa5a2a72d4a68d0f6c81188939f789e64e9c657c693aa24b9c5f3428" Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.485954 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29550135-cxdkq" Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.892497 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-9xscz" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="registry-server" probeResult="failure" output=< Mar 08 22:15:05 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 22:15:05 crc kubenswrapper[4885]: > Mar 08 22:15:05 crc kubenswrapper[4885]: I0308 22:15:05.988616 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb"] Mar 08 22:15:06 crc kubenswrapper[4885]: I0308 22:15:06.003505 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29550090-qp6gb"] Mar 08 22:15:07 crc kubenswrapper[4885]: I0308 22:15:07.383072 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2e12e20-b9a4-4fc7-8101-cd76f53c70ba" path="/var/lib/kubelet/pods/c2e12e20-b9a4-4fc7-8101-cd76f53c70ba/volumes" Mar 08 22:15:09 crc kubenswrapper[4885]: I0308 22:15:09.375991 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:15:09 crc kubenswrapper[4885]: E0308 22:15:09.376559 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:15:11 crc kubenswrapper[4885]: I0308 22:15:11.651336 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gxvs5" 
Mar 08 22:15:11 crc kubenswrapper[4885]: I0308 22:15:11.651722 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:15:12 crc kubenswrapper[4885]: I0308 22:15:12.722526 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gxvs5" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="registry-server" probeResult="failure" output=< Mar 08 22:15:12 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 22:15:12 crc kubenswrapper[4885]: > Mar 08 22:15:14 crc kubenswrapper[4885]: I0308 22:15:14.896781 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:15:14 crc kubenswrapper[4885]: I0308 22:15:14.971720 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:15:16 crc kubenswrapper[4885]: I0308 22:15:16.127045 4885 scope.go:117] "RemoveContainer" containerID="8ffeb3ea1d44ddbc8ed5f91dcd1d3740e5d0c398b63612136a09bb9296a735fb" Mar 08 22:15:17 crc kubenswrapper[4885]: I0308 22:15:17.811232 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9xscz"] Mar 08 22:15:17 crc kubenswrapper[4885]: I0308 22:15:17.812035 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9xscz" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="registry-server" containerID="cri-o://39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d" gracePeriod=2 Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.305785 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.351154 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-utilities\") pod \"19f49bd2-97c7-4446-9814-5c5788b65342\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.351219 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-catalog-content\") pod \"19f49bd2-97c7-4446-9814-5c5788b65342\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.351288 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4xng\" (UniqueName: \"kubernetes.io/projected/19f49bd2-97c7-4446-9814-5c5788b65342-kube-api-access-x4xng\") pod \"19f49bd2-97c7-4446-9814-5c5788b65342\" (UID: \"19f49bd2-97c7-4446-9814-5c5788b65342\") " Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.353158 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-utilities" (OuterVolumeSpecName: "utilities") pod "19f49bd2-97c7-4446-9814-5c5788b65342" (UID: "19f49bd2-97c7-4446-9814-5c5788b65342"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.372632 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19f49bd2-97c7-4446-9814-5c5788b65342-kube-api-access-x4xng" (OuterVolumeSpecName: "kube-api-access-x4xng") pod "19f49bd2-97c7-4446-9814-5c5788b65342" (UID: "19f49bd2-97c7-4446-9814-5c5788b65342"). InnerVolumeSpecName "kube-api-access-x4xng". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.387216 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "19f49bd2-97c7-4446-9814-5c5788b65342" (UID: "19f49bd2-97c7-4446-9814-5c5788b65342"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.454278 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.454312 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/19f49bd2-97c7-4446-9814-5c5788b65342-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.454322 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4xng\" (UniqueName: \"kubernetes.io/projected/19f49bd2-97c7-4446-9814-5c5788b65342-kube-api-access-x4xng\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.637594 4885 generic.go:334] "Generic (PLEG): container finished" podID="19f49bd2-97c7-4446-9814-5c5788b65342" containerID="39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d" exitCode=0 Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.637670 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xscz" event={"ID":"19f49bd2-97c7-4446-9814-5c5788b65342","Type":"ContainerDied","Data":"39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d"} Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.638178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xscz" event={"ID":"19f49bd2-97c7-4446-9814-5c5788b65342","Type":"ContainerDied","Data":"cbefe58057fc835b715eeffbc91f9f0c63d91b9e015b2ca024e66e769f47a64e"} Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.638231 4885 scope.go:117] "RemoveContainer" containerID="39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.637739 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9xscz" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.669389 4885 scope.go:117] "RemoveContainer" containerID="057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.680454 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9xscz"] Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.690256 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9xscz"] Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.697805 4885 scope.go:117] "RemoveContainer" containerID="b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.771742 4885 scope.go:117] "RemoveContainer" containerID="39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d" Mar 08 22:15:18 crc kubenswrapper[4885]: E0308 22:15:18.772268 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d\": container with ID starting with 39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d not found: ID does not exist" containerID="39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.772338 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d"} err="failed to get container status \"39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d\": rpc error: code = NotFound desc = could not find container \"39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d\": container with ID starting with 39357cf7e67a3a4e5ed204baf53de6b55551fe7f8b47ec64b3f9681798b5cd4d not found: ID does not exist" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.772377 4885 scope.go:117] "RemoveContainer" containerID="057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba" Mar 08 22:15:18 crc kubenswrapper[4885]: E0308 22:15:18.772766 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba\": container with ID starting with 057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba not found: ID does not exist" containerID="057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.772812 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba"} err="failed to get container status \"057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba\": rpc error: code = NotFound desc = could not find container \"057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba\": container with ID starting with 057a66cf286105e04340dc8d9e4f3b3162af8297c49164e608e9f9b76a4504ba not found: ID does not exist" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.772841 4885 scope.go:117] "RemoveContainer" containerID="b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515" Mar 08 22:15:18 crc kubenswrapper[4885]: E0308 22:15:18.773189 4885 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515\": container with ID starting with b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515 not found: ID does not exist" containerID="b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515" Mar 08 22:15:18 crc kubenswrapper[4885]: I0308 22:15:18.773233 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515"} err="failed to get container status \"b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515\": rpc error: code = NotFound desc = could not find container \"b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515\": container with ID starting with b5ddf3fa5ba5ac0cb523d506e554470d7f95c254df220259deb8ae2684f8f515 not found: ID does not exist" Mar 08 22:15:19 crc kubenswrapper[4885]: I0308 22:15:19.383990 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" path="/var/lib/kubelet/pods/19f49bd2-97c7-4446-9814-5c5788b65342/volumes" Mar 08 22:15:21 crc kubenswrapper[4885]: I0308 22:15:21.573390 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-brf5z_c9864aac-5821-4f9b-bcc8-f07752f987b7/prometheus-operator/0.log" Mar 08 22:15:21 crc kubenswrapper[4885]: I0308 22:15:21.624938 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429_65ea3078-ccec-4913-9ce0-873ad93efd0e/prometheus-operator-admission-webhook/0.log" Mar 08 22:15:21 crc kubenswrapper[4885]: I0308 22:15:21.774247 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7_0fe4d43f-e037-431e-98e3-d50194963def/prometheus-operator-admission-webhook/0.log" Mar 08 22:15:21 crc kubenswrapper[4885]: I0308 22:15:21.828145 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-qfwg5_482d7874-16e6-4043-95b1-59222dab9edc/operator/0.log" Mar 08 22:15:21 crc kubenswrapper[4885]: I0308 22:15:21.955077 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-m8k65_062a5ba6-b2c8-4b0c-95e1-d51c1196f367/perses-operator/0.log" Mar 08 22:15:22 crc kubenswrapper[4885]: I0308 22:15:22.700887 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gxvs5" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="registry-server" probeResult="failure" output=< Mar 08 22:15:22 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 22:15:22 crc kubenswrapper[4885]: > Mar 08 22:15:23 crc kubenswrapper[4885]: I0308 22:15:23.368478 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:15:23 crc kubenswrapper[4885]: E0308 22:15:23.369103 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" 
podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.038246 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fdhwr"] Mar 08 22:15:30 crc kubenswrapper[4885]: E0308 22:15:30.039320 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="extract-utilities" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.039334 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="extract-utilities" Mar 08 22:15:30 crc kubenswrapper[4885]: E0308 22:15:30.039354 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="registry-server" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.039360 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="registry-server" Mar 08 22:15:30 crc kubenswrapper[4885]: E0308 22:15:30.039375 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb0d15ae-9873-4045-96ff-f333ea013dcb" containerName="collect-profiles" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.039381 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb0d15ae-9873-4045-96ff-f333ea013dcb" containerName="collect-profiles" Mar 08 22:15:30 crc kubenswrapper[4885]: E0308 22:15:30.039400 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="extract-content" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.039406 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="extract-content" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.039654 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="19f49bd2-97c7-4446-9814-5c5788b65342" containerName="registry-server" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.039665 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb0d15ae-9873-4045-96ff-f333ea013dcb" containerName="collect-profiles" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.041606 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.052129 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fdhwr"] Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.158935 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znggx\" (UniqueName: \"kubernetes.io/projected/d913a458-5b1d-491c-bfdc-d2a07f571ce8-kube-api-access-znggx\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.159232 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-catalog-content\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.159579 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-utilities\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.261966 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-catalog-content\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.262148 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-utilities\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.262498 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-catalog-content\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.262604 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-utilities\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.262788 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znggx\" (UniqueName: \"kubernetes.io/projected/d913a458-5b1d-491c-bfdc-d2a07f571ce8-kube-api-access-znggx\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.282067 4885 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-znggx\" (UniqueName: \"kubernetes.io/projected/d913a458-5b1d-491c-bfdc-d2a07f571ce8-kube-api-access-znggx\") pod \"certified-operators-fdhwr\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.366774 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:30 crc kubenswrapper[4885]: I0308 22:15:30.895780 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fdhwr"] Mar 08 22:15:31 crc kubenswrapper[4885]: I0308 22:15:31.771246 4885 generic.go:334] "Generic (PLEG): container finished" podID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerID="6609edb3e701aaab6662d1f5509505324f944e8ead3a92d45ce10f8e8a141f42" exitCode=0 Mar 08 22:15:31 crc kubenswrapper[4885]: I0308 22:15:31.771355 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdhwr" event={"ID":"d913a458-5b1d-491c-bfdc-d2a07f571ce8","Type":"ContainerDied","Data":"6609edb3e701aaab6662d1f5509505324f944e8ead3a92d45ce10f8e8a141f42"} Mar 08 22:15:31 crc kubenswrapper[4885]: I0308 22:15:31.771554 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdhwr" event={"ID":"d913a458-5b1d-491c-bfdc-d2a07f571ce8","Type":"ContainerStarted","Data":"fa30b1d03ae78f1ddefa0bbe2c4ca9e029a71f271d639838db899fda7c6403bc"} Mar 08 22:15:31 crc kubenswrapper[4885]: I0308 22:15:31.776180 4885 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 08 22:15:33 crc kubenswrapper[4885]: I0308 22:15:33.433699 4885 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gxvs5" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="registry-server" probeResult="failure" output=< Mar 08 22:15:33 crc kubenswrapper[4885]: timeout: failed to connect service ":50051" within 1s Mar 08 22:15:33 crc kubenswrapper[4885]: > Mar 08 22:15:33 crc kubenswrapper[4885]: I0308 22:15:33.791871 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdhwr" event={"ID":"d913a458-5b1d-491c-bfdc-d2a07f571ce8","Type":"ContainerStarted","Data":"8e0a849eed3599c2bb3cd606a45cc6d9d875f4438f672ab3a1daf074c9663f7a"} Mar 08 22:15:34 crc kubenswrapper[4885]: I0308 22:15:34.803114 4885 generic.go:334] "Generic (PLEG): container finished" podID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerID="8e0a849eed3599c2bb3cd606a45cc6d9d875f4438f672ab3a1daf074c9663f7a" exitCode=0 Mar 08 22:15:34 crc kubenswrapper[4885]: I0308 22:15:34.803231 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdhwr" event={"ID":"d913a458-5b1d-491c-bfdc-d2a07f571ce8","Type":"ContainerDied","Data":"8e0a849eed3599c2bb3cd606a45cc6d9d875f4438f672ab3a1daf074c9663f7a"} Mar 08 22:15:35 crc kubenswrapper[4885]: I0308 22:15:35.825292 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdhwr" event={"ID":"d913a458-5b1d-491c-bfdc-d2a07f571ce8","Type":"ContainerStarted","Data":"576fbed1dd2f3fbde9539787147b1274c70f39d716fb007c750e3c7d20de26ce"} Mar 08 22:15:35 crc kubenswrapper[4885]: I0308 22:15:35.857313 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-fdhwr" podStartSLOduration=2.405334161 podStartE2EDuration="5.857289303s" podCreationTimestamp="2026-03-08 22:15:30 +0000 UTC" firstStartedPulling="2026-03-08 22:15:31.775724308 +0000 UTC m=+9833.171778371" lastFinishedPulling="2026-03-08 22:15:35.22767948 +0000 UTC m=+9836.623733513" observedRunningTime="2026-03-08 22:15:35.84403901 +0000 UTC m=+9837.240093063" watchObservedRunningTime="2026-03-08 22:15:35.857289303 +0000 UTC m=+9837.253343336" Mar 08 22:15:38 crc kubenswrapper[4885]: I0308 22:15:38.368854 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:15:38 crc kubenswrapper[4885]: E0308 22:15:38.370498 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:15:40 crc kubenswrapper[4885]: I0308 22:15:40.366910 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:40 crc kubenswrapper[4885]: I0308 22:15:40.367413 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:40 crc kubenswrapper[4885]: I0308 22:15:40.432409 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:40 crc kubenswrapper[4885]: I0308 22:15:40.498890 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-xj2vs_6d11a8df-ce5d-404a-b827-822101b061c8/kube-rbac-proxy/0.log" Mar 08 22:15:40 crc kubenswrapper[4885]: I0308 22:15:40.702456 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-frr-files/0.log" Mar 08 22:15:40 crc kubenswrapper[4885]: I0308 22:15:40.914753 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:40 crc kubenswrapper[4885]: I0308 22:15:40.991476 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-frr-files/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.006156 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-xj2vs_6d11a8df-ce5d-404a-b827-822101b061c8/controller/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.029188 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-reloader/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.037514 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-metrics/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.173065 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-reloader/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.329401 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-frr-files/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.329859 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-reloader/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.358195 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-metrics/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.392735 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-metrics/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.540911 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-reloader/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.546501 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-frr-files/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.552082 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/cp-metrics/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.606506 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/controller/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.713314 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.765098 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/frr-metrics/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.766702 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.811462 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/kube-rbac-proxy/0.log" Mar 08 22:15:41 crc kubenswrapper[4885]: I0308 22:15:41.833766 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/kube-rbac-proxy-frr/0.log" Mar 08 22:15:42 crc kubenswrapper[4885]: I0308 22:15:42.068743 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/reloader/0.log" Mar 08 22:15:42 crc kubenswrapper[4885]: I0308 22:15:42.087786 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-cq6xg_e76b0259-0d11-4451-b770-4ca5611ce32e/frr-k8s-webhook-server/0.log" Mar 08 22:15:42 crc kubenswrapper[4885]: I0308 22:15:42.929949 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6bc5657994-v7mn9_ca51bb10-b38d-4e58-9d29-6c6b8922f72e/manager/0.log" Mar 08 22:15:43 crc kubenswrapper[4885]: I0308 22:15:43.125890 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-74675b5ddf-p7c2j_6ea4545b-278f-43ff-be3c-fc1346b591a1/webhook-server/0.log" Mar 08 22:15:43 crc kubenswrapper[4885]: I0308 22:15:43.179087 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5nclk_860f2bc3-9bd4-43c5-9400-67293a877c6f/kube-rbac-proxy/0.log" Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.220928 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5nclk_860f2bc3-9bd4-43c5-9400-67293a877c6f/speaker/0.log" Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.416176 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fdhwr"] Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.416435 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fdhwr" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="registry-server" containerID="cri-o://576fbed1dd2f3fbde9539787147b1274c70f39d716fb007c750e3c7d20de26ce" gracePeriod=2 Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.908572 4885 generic.go:334] "Generic (PLEG): container finished" podID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerID="576fbed1dd2f3fbde9539787147b1274c70f39d716fb007c750e3c7d20de26ce" exitCode=0 Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.908611 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdhwr" event={"ID":"d913a458-5b1d-491c-bfdc-d2a07f571ce8","Type":"ContainerDied","Data":"576fbed1dd2f3fbde9539787147b1274c70f39d716fb007c750e3c7d20de26ce"} Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.908948 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fdhwr" event={"ID":"d913a458-5b1d-491c-bfdc-d2a07f571ce8","Type":"ContainerDied","Data":"fa30b1d03ae78f1ddefa0bbe2c4ca9e029a71f271d639838db899fda7c6403bc"} Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.908964 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa30b1d03ae78f1ddefa0bbe2c4ca9e029a71f271d639838db899fda7c6403bc" Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.955263 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.982668 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-catalog-content\") pod \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.982970 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-utilities\") pod \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.983079 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znggx\" (UniqueName: \"kubernetes.io/projected/d913a458-5b1d-491c-bfdc-d2a07f571ce8-kube-api-access-znggx\") pod \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\" (UID: \"d913a458-5b1d-491c-bfdc-d2a07f571ce8\") " Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.985478 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-utilities" (OuterVolumeSpecName: "utilities") pod "d913a458-5b1d-491c-bfdc-d2a07f571ce8" (UID: "d913a458-5b1d-491c-bfdc-d2a07f571ce8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:15:44 crc kubenswrapper[4885]: I0308 22:15:44.996799 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d913a458-5b1d-491c-bfdc-d2a07f571ce8-kube-api-access-znggx" (OuterVolumeSpecName: "kube-api-access-znggx") pod "d913a458-5b1d-491c-bfdc-d2a07f571ce8" (UID: "d913a458-5b1d-491c-bfdc-d2a07f571ce8"). InnerVolumeSpecName "kube-api-access-znggx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.048966 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d913a458-5b1d-491c-bfdc-d2a07f571ce8" (UID: "d913a458-5b1d-491c-bfdc-d2a07f571ce8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.085452 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.085504 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d913a458-5b1d-491c-bfdc-d2a07f571ce8-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.085520 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znggx\" (UniqueName: \"kubernetes.io/projected/d913a458-5b1d-491c-bfdc-d2a07f571ce8-kube-api-access-znggx\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.794102 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hq28v_dca42faa-df32-44b5-99e8-109120aa36a1/frr/0.log" Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.813844 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gxvs5"] Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.814109 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gxvs5" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="registry-server" containerID="cri-o://6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6" gracePeriod=2 Mar 08 22:15:45 crc kubenswrapper[4885]: I0308 22:15:45.918055 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fdhwr" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.006846 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fdhwr"] Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.020775 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fdhwr"] Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.405035 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.513115 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ntsq\" (UniqueName: \"kubernetes.io/projected/52401772-10fd-464c-bb40-dceaaca564db-kube-api-access-5ntsq\") pod \"52401772-10fd-464c-bb40-dceaaca564db\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.513457 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-catalog-content\") pod \"52401772-10fd-464c-bb40-dceaaca564db\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.513548 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-utilities\") pod \"52401772-10fd-464c-bb40-dceaaca564db\" (UID: \"52401772-10fd-464c-bb40-dceaaca564db\") " Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.514650 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-utilities" (OuterVolumeSpecName: "utilities") pod "52401772-10fd-464c-bb40-dceaaca564db" (UID: "52401772-10fd-464c-bb40-dceaaca564db"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.522481 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52401772-10fd-464c-bb40-dceaaca564db-kube-api-access-5ntsq" (OuterVolumeSpecName: "kube-api-access-5ntsq") pod "52401772-10fd-464c-bb40-dceaaca564db" (UID: "52401772-10fd-464c-bb40-dceaaca564db"). InnerVolumeSpecName "kube-api-access-5ntsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.616357 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ntsq\" (UniqueName: \"kubernetes.io/projected/52401772-10fd-464c-bb40-dceaaca564db-kube-api-access-5ntsq\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.616406 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.646013 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52401772-10fd-464c-bb40-dceaaca564db" (UID: "52401772-10fd-464c-bb40-dceaaca564db"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.718546 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52401772-10fd-464c-bb40-dceaaca564db-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.937856 4885 generic.go:334] "Generic (PLEG): container finished" podID="52401772-10fd-464c-bb40-dceaaca564db" containerID="6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6" exitCode=0 Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.937945 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gxvs5" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.937951 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvs5" event={"ID":"52401772-10fd-464c-bb40-dceaaca564db","Type":"ContainerDied","Data":"6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6"} Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.938306 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gxvs5" event={"ID":"52401772-10fd-464c-bb40-dceaaca564db","Type":"ContainerDied","Data":"410526b2d3e9b4d6a551a2823b5579cf7d0ec2f78e2d4b1b12bd289e95ff9e5f"} Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.938333 4885 scope.go:117] "RemoveContainer" containerID="6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.978606 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gxvs5"] Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.979370 4885 scope.go:117] "RemoveContainer" containerID="360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728" Mar 08 22:15:46 crc kubenswrapper[4885]: I0308 22:15:46.991045 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gxvs5"] Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.021115 4885 scope.go:117] "RemoveContainer" containerID="e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d" Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.075190 4885 scope.go:117] "RemoveContainer" containerID="6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6" Mar 08 22:15:47 crc kubenswrapper[4885]: E0308 22:15:47.075777 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6\": container with ID starting with 6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6 not found: ID does not exist" containerID="6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6" Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.075810 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6"} err="failed to get container status \"6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6\": rpc error: code = NotFound desc = could not find container \"6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6\": container with ID starting with 6f4b1b94ff228b4a5a030c6c7e51283ba9e9ccb5a16ab8ee5baf97c4b6c1e6a6 not found: ID does not exist" Mar 08 22:15:47 crc 
kubenswrapper[4885]: I0308 22:15:47.075833 4885 scope.go:117] "RemoveContainer" containerID="360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728" Mar 08 22:15:47 crc kubenswrapper[4885]: E0308 22:15:47.076476 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728\": container with ID starting with 360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728 not found: ID does not exist" containerID="360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728" Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.076533 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728"} err="failed to get container status \"360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728\": rpc error: code = NotFound desc = could not find container \"360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728\": container with ID starting with 360582123ef0b410d63e1ca417395445917b1d27eed269bdde6e57610a83c728 not found: ID does not exist" Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.076566 4885 scope.go:117] "RemoveContainer" containerID="e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d" Mar 08 22:15:47 crc kubenswrapper[4885]: E0308 22:15:47.076872 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d\": container with ID starting with e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d not found: ID does not exist" containerID="e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d" Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.076936 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d"} err="failed to get container status \"e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d\": rpc error: code = NotFound desc = could not find container \"e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d\": container with ID starting with e140c9739be1aeda12d192e1c07a3e405726e2e67354a47cf68766020dfe775d not found: ID does not exist" Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.381557 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52401772-10fd-464c-bb40-dceaaca564db" path="/var/lib/kubelet/pods/52401772-10fd-464c-bb40-dceaaca564db/volumes" Mar 08 22:15:47 crc kubenswrapper[4885]: I0308 22:15:47.383497 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" path="/var/lib/kubelet/pods/d913a458-5b1d-491c-bfdc-d2a07f571ce8/volumes" Mar 08 22:15:49 crc kubenswrapper[4885]: I0308 22:15:49.376377 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:15:49 crc kubenswrapper[4885]: E0308 22:15:49.377321 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.108266 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67/util/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.269581 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67/util/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.303299 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67/pull/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.334438 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67/pull/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.512195 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67/util/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.550301 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67/pull/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.567273 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82q98wf_1a6fcfe0-2307-48aa-a03f-b8a9b2d5cd67/extract/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.702841 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w_a5bcb33a-118a-438a-86f5-467399e36ddb/util/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.949164 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w_a5bcb33a-118a-438a-86f5-467399e36ddb/pull/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.951722 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w_a5bcb33a-118a-438a-86f5-467399e36ddb/pull/0.log" Mar 08 22:15:58 crc kubenswrapper[4885]: I0308 22:15:58.958554 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w_a5bcb33a-118a-438a-86f5-467399e36ddb/util/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.155820 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w_a5bcb33a-118a-438a-86f5-467399e36ddb/util/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.170751 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w_a5bcb33a-118a-438a-86f5-467399e36ddb/pull/0.log" Mar 08 22:15:59 crc 
kubenswrapper[4885]: I0308 22:15:59.187226 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5b546w_a5bcb33a-118a-438a-86f5-467399e36ddb/extract/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.364984 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm_99cf706b-d380-4027-ad93-af7f1e5f8a36/util/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.536605 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm_99cf706b-d380-4027-ad93-af7f1e5f8a36/pull/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.541324 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm_99cf706b-d380-4027-ad93-af7f1e5f8a36/pull/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.559287 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm_99cf706b-d380-4027-ad93-af7f1e5f8a36/util/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.756718 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm_99cf706b-d380-4027-ad93-af7f1e5f8a36/extract/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.788267 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm_99cf706b-d380-4027-ad93-af7f1e5f8a36/util/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.815396 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085ssnm_99cf706b-d380-4027-ad93-af7f1e5f8a36/pull/0.log" Mar 08 22:15:59 crc kubenswrapper[4885]: I0308 22:15:59.979379 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xjhsv_2914e8af-92f9-40a3-99ea-a52bfaf31a36/extract-utilities/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.140807 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550136-4kln6"] Mar 08 22:16:00 crc kubenswrapper[4885]: E0308 22:16:00.141222 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="extract-content" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141240 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="extract-content" Mar 08 22:16:00 crc kubenswrapper[4885]: E0308 22:16:00.141251 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="registry-server" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141258 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="registry-server" Mar 08 22:16:00 crc kubenswrapper[4885]: E0308 22:16:00.141272 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="extract-utilities" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141280 
4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="extract-utilities" Mar 08 22:16:00 crc kubenswrapper[4885]: E0308 22:16:00.141305 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="extract-utilities" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141310 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="extract-utilities" Mar 08 22:16:00 crc kubenswrapper[4885]: E0308 22:16:00.141328 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="extract-content" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141334 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="extract-content" Mar 08 22:16:00 crc kubenswrapper[4885]: E0308 22:16:00.141344 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="registry-server" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141349 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="registry-server" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141559 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="52401772-10fd-464c-bb40-dceaaca564db" containerName="registry-server" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.141577 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="d913a458-5b1d-491c-bfdc-d2a07f571ce8" containerName="registry-server" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.142271 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550136-4kln6" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.144261 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.144378 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.150289 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.158111 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550136-4kln6"] Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.180875 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xjhsv_2914e8af-92f9-40a3-99ea-a52bfaf31a36/extract-content/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.210531 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xjhsv_2914e8af-92f9-40a3-99ea-a52bfaf31a36/extract-utilities/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.241472 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xjhsv_2914e8af-92f9-40a3-99ea-a52bfaf31a36/extract-content/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.248426 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58tw6\" (UniqueName: \"kubernetes.io/projected/89e80cc7-fce3-4c3c-9c10-b76e212f51e0-kube-api-access-58tw6\") pod \"auto-csr-approver-29550136-4kln6\" (UID: \"89e80cc7-fce3-4c3c-9c10-b76e212f51e0\") " pod="openshift-infra/auto-csr-approver-29550136-4kln6" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.350082 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58tw6\" (UniqueName: \"kubernetes.io/projected/89e80cc7-fce3-4c3c-9c10-b76e212f51e0-kube-api-access-58tw6\") pod \"auto-csr-approver-29550136-4kln6\" (UID: \"89e80cc7-fce3-4c3c-9c10-b76e212f51e0\") " pod="openshift-infra/auto-csr-approver-29550136-4kln6" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.403073 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58tw6\" (UniqueName: \"kubernetes.io/projected/89e80cc7-fce3-4c3c-9c10-b76e212f51e0-kube-api-access-58tw6\") pod \"auto-csr-approver-29550136-4kln6\" (UID: \"89e80cc7-fce3-4c3c-9c10-b76e212f51e0\") " pod="openshift-infra/auto-csr-approver-29550136-4kln6" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.425287 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xjhsv_2914e8af-92f9-40a3-99ea-a52bfaf31a36/extract-content/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.437863 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xjhsv_2914e8af-92f9-40a3-99ea-a52bfaf31a36/extract-utilities/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.458099 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550136-4kln6" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.640441 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2tkhk_2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9/extract-utilities/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.897407 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2tkhk_2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9/extract-utilities/0.log" Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.960769 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550136-4kln6"] Mar 08 22:16:00 crc kubenswrapper[4885]: I0308 22:16:00.970270 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2tkhk_2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9/extract-content/0.log" Mar 08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.001506 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2tkhk_2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9/extract-content/0.log" Mar 08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.104166 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550136-4kln6" event={"ID":"89e80cc7-fce3-4c3c-9c10-b76e212f51e0","Type":"ContainerStarted","Data":"cef497a02157383253457d9710e34df0086facb594f35b48877783afea030628"} Mar 08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.398327 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2tkhk_2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9/extract-utilities/0.log" Mar 08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.410457 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2tkhk_2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9/extract-content/0.log" Mar 08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.566626 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-xjhsv_2914e8af-92f9-40a3-99ea-a52bfaf31a36/registry-server/0.log" Mar 08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.736485 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr_4b43d8cc-1dca-4c13-a0b7-df1371935186/util/0.log" Mar 08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.858348 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr_4b43d8cc-1dca-4c13-a0b7-df1371935186/util/0.log" Mar 08 22:16:01 crc kubenswrapper[4885]: I0308 22:16:01.941844 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr_4b43d8cc-1dca-4c13-a0b7-df1371935186/pull/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.024195 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr_4b43d8cc-1dca-4c13-a0b7-df1371935186/pull/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.277412 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr_4b43d8cc-1dca-4c13-a0b7-df1371935186/util/0.log" Mar 08 
22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.282299 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr_4b43d8cc-1dca-4c13-a0b7-df1371935186/pull/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.292501 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4l6kmr_4b43d8cc-1dca-4c13-a0b7-df1371935186/extract/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.532680 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-2774l_1e87323f-cf50-46ef-8e7c-cccd8a1e3601/marketplace-operator/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.563424 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6ctxc_751589b1-c864-424f-9315-13a7d880bcf6/extract-utilities/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.728975 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-2tkhk_2b34f7ab-2ff3-40fd-8a23-82b9ff4536e9/registry-server/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.741288 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6ctxc_751589b1-c864-424f-9315-13a7d880bcf6/extract-utilities/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.751988 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6ctxc_751589b1-c864-424f-9315-13a7d880bcf6/extract-content/0.log" Mar 08 22:16:02 crc kubenswrapper[4885]: I0308 22:16:02.780951 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6ctxc_751589b1-c864-424f-9315-13a7d880bcf6/extract-content/0.log" Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.133178 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550136-4kln6" event={"ID":"89e80cc7-fce3-4c3c-9c10-b76e212f51e0","Type":"ContainerStarted","Data":"44bdf92c76807ade0aa539cc5874fe807bee8fa87d0c0f440ab4fe4105ecda01"} Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.153866 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29550136-4kln6" podStartSLOduration=2.018463094 podStartE2EDuration="3.153839367s" podCreationTimestamp="2026-03-08 22:16:00 +0000 UTC" firstStartedPulling="2026-03-08 22:16:00.966892748 +0000 UTC m=+9862.362946771" lastFinishedPulling="2026-03-08 22:16:02.102269021 +0000 UTC m=+9863.498323044" observedRunningTime="2026-03-08 22:16:03.144447236 +0000 UTC m=+9864.540501259" watchObservedRunningTime="2026-03-08 22:16:03.153839367 +0000 UTC m=+9864.549893430" Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.620595 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6ctxc_751589b1-c864-424f-9315-13a7d880bcf6/extract-utilities/0.log" Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.646426 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bs2qs_bdc128cf-2f55-4964-8229-6aa7e1dd9f1e/extract-utilities/0.log" Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.646465 4885 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-6ctxc_751589b1-c864-424f-9315-13a7d880bcf6/extract-content/0.log" Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.958256 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bs2qs_bdc128cf-2f55-4964-8229-6aa7e1dd9f1e/extract-utilities/0.log" Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.958289 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6ctxc_751589b1-c864-424f-9315-13a7d880bcf6/registry-server/0.log" Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.973286 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bs2qs_bdc128cf-2f55-4964-8229-6aa7e1dd9f1e/extract-content/0.log" Mar 08 22:16:03 crc kubenswrapper[4885]: I0308 22:16:03.974739 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bs2qs_bdc128cf-2f55-4964-8229-6aa7e1dd9f1e/extract-content/0.log" Mar 08 22:16:04 crc kubenswrapper[4885]: I0308 22:16:04.144523 4885 generic.go:334] "Generic (PLEG): container finished" podID="89e80cc7-fce3-4c3c-9c10-b76e212f51e0" containerID="44bdf92c76807ade0aa539cc5874fe807bee8fa87d0c0f440ab4fe4105ecda01" exitCode=0 Mar 08 22:16:04 crc kubenswrapper[4885]: I0308 22:16:04.144564 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550136-4kln6" event={"ID":"89e80cc7-fce3-4c3c-9c10-b76e212f51e0","Type":"ContainerDied","Data":"44bdf92c76807ade0aa539cc5874fe807bee8fa87d0c0f440ab4fe4105ecda01"} Mar 08 22:16:04 crc kubenswrapper[4885]: I0308 22:16:04.144832 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bs2qs_bdc128cf-2f55-4964-8229-6aa7e1dd9f1e/extract-utilities/0.log" Mar 08 22:16:04 crc kubenswrapper[4885]: I0308 22:16:04.211275 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bs2qs_bdc128cf-2f55-4964-8229-6aa7e1dd9f1e/extract-content/0.log" Mar 08 22:16:04 crc kubenswrapper[4885]: I0308 22:16:04.367747 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:16:04 crc kubenswrapper[4885]: E0308 22:16:04.368130 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:16:05 crc kubenswrapper[4885]: I0308 22:16:05.206025 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-bs2qs_bdc128cf-2f55-4964-8229-6aa7e1dd9f1e/registry-server/0.log" Mar 08 22:16:05 crc kubenswrapper[4885]: I0308 22:16:05.753197 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550136-4kln6" Mar 08 22:16:05 crc kubenswrapper[4885]: I0308 22:16:05.909223 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58tw6\" (UniqueName: \"kubernetes.io/projected/89e80cc7-fce3-4c3c-9c10-b76e212f51e0-kube-api-access-58tw6\") pod \"89e80cc7-fce3-4c3c-9c10-b76e212f51e0\" (UID: \"89e80cc7-fce3-4c3c-9c10-b76e212f51e0\") " Mar 08 22:16:05 crc kubenswrapper[4885]: I0308 22:16:05.918304 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89e80cc7-fce3-4c3c-9c10-b76e212f51e0-kube-api-access-58tw6" (OuterVolumeSpecName: "kube-api-access-58tw6") pod "89e80cc7-fce3-4c3c-9c10-b76e212f51e0" (UID: "89e80cc7-fce3-4c3c-9c10-b76e212f51e0"). InnerVolumeSpecName "kube-api-access-58tw6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:16:06 crc kubenswrapper[4885]: I0308 22:16:06.012294 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58tw6\" (UniqueName: \"kubernetes.io/projected/89e80cc7-fce3-4c3c-9c10-b76e212f51e0-kube-api-access-58tw6\") on node \"crc\" DevicePath \"\"" Mar 08 22:16:06 crc kubenswrapper[4885]: I0308 22:16:06.191672 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550136-4kln6" event={"ID":"89e80cc7-fce3-4c3c-9c10-b76e212f51e0","Type":"ContainerDied","Data":"cef497a02157383253457d9710e34df0086facb594f35b48877783afea030628"} Mar 08 22:16:06 crc kubenswrapper[4885]: I0308 22:16:06.191727 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cef497a02157383253457d9710e34df0086facb594f35b48877783afea030628" Mar 08 22:16:06 crc kubenswrapper[4885]: I0308 22:16:06.191815 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550136-4kln6" Mar 08 22:16:06 crc kubenswrapper[4885]: I0308 22:16:06.220677 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550130-q8btr"] Mar 08 22:16:06 crc kubenswrapper[4885]: I0308 22:16:06.229705 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550130-q8btr"] Mar 08 22:16:07 crc kubenswrapper[4885]: I0308 22:16:07.377948 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6" path="/var/lib/kubelet/pods/8bf31f87-6e2d-4ae5-81e7-e3d501dc03d6/volumes" Mar 08 22:16:16 crc kubenswrapper[4885]: I0308 22:16:16.190387 4885 scope.go:117] "RemoveContainer" containerID="1c5174db17fa21586bec90f86258445c10bafc4fb6675bd3f58ffbbc2c682873" Mar 08 22:16:19 crc kubenswrapper[4885]: I0308 22:16:19.377708 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:16:19 crc kubenswrapper[4885]: E0308 22:16:19.379598 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:16:19 crc kubenswrapper[4885]: I0308 22:16:19.666200 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-brf5z_c9864aac-5821-4f9b-bcc8-f07752f987b7/prometheus-operator/0.log" Mar 08 22:16:19 crc kubenswrapper[4885]: I0308 22:16:19.719485 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-694bf7b9c4-gb429_65ea3078-ccec-4913-9ce0-873ad93efd0e/prometheus-operator-admission-webhook/0.log" Mar 08 22:16:19 crc kubenswrapper[4885]: I0308 22:16:19.733665 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-694bf7b9c4-mcwh7_0fe4d43f-e037-431e-98e3-d50194963def/prometheus-operator-admission-webhook/0.log" Mar 08 22:16:19 crc kubenswrapper[4885]: I0308 22:16:19.855511 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-qfwg5_482d7874-16e6-4043-95b1-59222dab9edc/operator/0.log" Mar 08 22:16:19 crc kubenswrapper[4885]: I0308 22:16:19.919726 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-m8k65_062a5ba6-b2c8-4b0c-95e1-d51c1196f367/perses-operator/0.log" Mar 08 22:16:29 crc kubenswrapper[4885]: I0308 22:16:29.922877 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cmbcr"] Mar 08 22:16:29 crc kubenswrapper[4885]: E0308 22:16:29.923742 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89e80cc7-fce3-4c3c-9c10-b76e212f51e0" containerName="oc" Mar 08 22:16:29 crc kubenswrapper[4885]: I0308 22:16:29.923754 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="89e80cc7-fce3-4c3c-9c10-b76e212f51e0" containerName="oc" Mar 08 22:16:29 crc kubenswrapper[4885]: I0308 22:16:29.929937 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="89e80cc7-fce3-4c3c-9c10-b76e212f51e0" 
containerName="oc" Mar 08 22:16:29 crc kubenswrapper[4885]: I0308 22:16:29.931695 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:29 crc kubenswrapper[4885]: I0308 22:16:29.937543 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cmbcr"] Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.051166 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-utilities\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.051628 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-catalog-content\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.051891 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl7dr\" (UniqueName: \"kubernetes.io/projected/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-kube-api-access-gl7dr\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.153591 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-utilities\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.153670 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-catalog-content\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.154341 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-catalog-content\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.154379 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-utilities\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.154636 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl7dr\" (UniqueName: \"kubernetes.io/projected/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-kube-api-access-gl7dr\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " 
pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.177796 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl7dr\" (UniqueName: \"kubernetes.io/projected/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-kube-api-access-gl7dr\") pod \"community-operators-cmbcr\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.253269 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:30 crc kubenswrapper[4885]: I0308 22:16:30.871467 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cmbcr"] Mar 08 22:16:31 crc kubenswrapper[4885]: I0308 22:16:31.368229 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:16:31 crc kubenswrapper[4885]: E0308 22:16:31.368946 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:16:31 crc kubenswrapper[4885]: I0308 22:16:31.488771 4885 generic.go:334] "Generic (PLEG): container finished" podID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerID="d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5" exitCode=0 Mar 08 22:16:31 crc kubenswrapper[4885]: I0308 22:16:31.488954 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmbcr" event={"ID":"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15","Type":"ContainerDied","Data":"d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5"} Mar 08 22:16:31 crc kubenswrapper[4885]: I0308 22:16:31.489286 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmbcr" event={"ID":"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15","Type":"ContainerStarted","Data":"d11f09a1fc366a8d58eedcdbcbd46f3f35e1bad13a8028153e55cdffe1f6d801"} Mar 08 22:16:33 crc kubenswrapper[4885]: I0308 22:16:33.510118 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmbcr" event={"ID":"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15","Type":"ContainerStarted","Data":"d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9"} Mar 08 22:16:35 crc kubenswrapper[4885]: I0308 22:16:35.538601 4885 generic.go:334] "Generic (PLEG): container finished" podID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerID="d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9" exitCode=0 Mar 08 22:16:35 crc kubenswrapper[4885]: I0308 22:16:35.538792 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmbcr" event={"ID":"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15","Type":"ContainerDied","Data":"d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9"} Mar 08 22:16:36 crc kubenswrapper[4885]: I0308 22:16:36.551349 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmbcr" 
event={"ID":"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15","Type":"ContainerStarted","Data":"188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b"} Mar 08 22:16:36 crc kubenswrapper[4885]: I0308 22:16:36.567929 4885 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cmbcr" podStartSLOduration=3.069929826 podStartE2EDuration="7.567900105s" podCreationTimestamp="2026-03-08 22:16:29 +0000 UTC" firstStartedPulling="2026-03-08 22:16:31.490884002 +0000 UTC m=+9892.886938015" lastFinishedPulling="2026-03-08 22:16:35.988854281 +0000 UTC m=+9897.384908294" observedRunningTime="2026-03-08 22:16:36.564905706 +0000 UTC m=+9897.960959729" watchObservedRunningTime="2026-03-08 22:16:36.567900105 +0000 UTC m=+9897.963954128" Mar 08 22:16:40 crc kubenswrapper[4885]: I0308 22:16:40.254140 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:40 crc kubenswrapper[4885]: I0308 22:16:40.254771 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:40 crc kubenswrapper[4885]: I0308 22:16:40.313402 4885 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:46 crc kubenswrapper[4885]: I0308 22:16:46.368005 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:16:46 crc kubenswrapper[4885]: E0308 22:16:46.368645 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:16:50 crc kubenswrapper[4885]: I0308 22:16:50.308419 4885 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:50 crc kubenswrapper[4885]: I0308 22:16:50.375409 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cmbcr"] Mar 08 22:16:50 crc kubenswrapper[4885]: I0308 22:16:50.717982 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cmbcr" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="registry-server" containerID="cri-o://188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b" gracePeriod=2 Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.290751 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.489311 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-catalog-content\") pod \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.489412 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-utilities\") pod \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.489591 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl7dr\" (UniqueName: \"kubernetes.io/projected/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-kube-api-access-gl7dr\") pod \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\" (UID: \"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15\") " Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.491767 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-utilities" (OuterVolumeSpecName: "utilities") pod "4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" (UID: "4f02b2be-ffee-4ef7-aab7-d23fe2e81b15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.495817 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-kube-api-access-gl7dr" (OuterVolumeSpecName: "kube-api-access-gl7dr") pod "4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" (UID: "4f02b2be-ffee-4ef7-aab7-d23fe2e81b15"). InnerVolumeSpecName "kube-api-access-gl7dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.569263 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" (UID: "4f02b2be-ffee-4ef7-aab7-d23fe2e81b15"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.594060 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl7dr\" (UniqueName: \"kubernetes.io/projected/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-kube-api-access-gl7dr\") on node \"crc\" DevicePath \"\"" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.594385 4885 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.594405 4885 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15-utilities\") on node \"crc\" DevicePath \"\"" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.733898 4885 generic.go:334] "Generic (PLEG): container finished" podID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerID="188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b" exitCode=0 Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.733964 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmbcr" event={"ID":"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15","Type":"ContainerDied","Data":"188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b"} Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.734013 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cmbcr" event={"ID":"4f02b2be-ffee-4ef7-aab7-d23fe2e81b15","Type":"ContainerDied","Data":"d11f09a1fc366a8d58eedcdbcbd46f3f35e1bad13a8028153e55cdffe1f6d801"} Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.734031 4885 scope.go:117] "RemoveContainer" containerID="188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.734061 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cmbcr" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.782671 4885 scope.go:117] "RemoveContainer" containerID="d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.813429 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cmbcr"] Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.823542 4885 scope.go:117] "RemoveContainer" containerID="d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.825759 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cmbcr"] Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.874727 4885 scope.go:117] "RemoveContainer" containerID="188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b" Mar 08 22:16:51 crc kubenswrapper[4885]: E0308 22:16:51.875165 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b\": container with ID starting with 188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b not found: ID does not exist" containerID="188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.875205 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b"} err="failed to get container status \"188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b\": rpc error: code = NotFound desc = could not find container \"188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b\": container with ID starting with 188941bcfde3bf4ca0e3b7d9d0f8963bd80d90019ec12555b1d168c64d348f4b not found: ID does not exist" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.875228 4885 scope.go:117] "RemoveContainer" containerID="d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9" Mar 08 22:16:51 crc kubenswrapper[4885]: E0308 22:16:51.875559 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9\": container with ID starting with d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9 not found: ID does not exist" containerID="d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.875588 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9"} err="failed to get container status \"d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9\": rpc error: code = NotFound desc = could not find container \"d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9\": container with ID starting with d48934d005b75b4c8097b6ef2fecaf477eafd7e55e9d98378054c6d763d5aff9 not found: ID does not exist" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.875606 4885 scope.go:117] "RemoveContainer" containerID="d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5" Mar 08 22:16:51 crc kubenswrapper[4885]: E0308 22:16:51.875856 4885 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5\": container with ID starting with d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5 not found: ID does not exist" containerID="d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5" Mar 08 22:16:51 crc kubenswrapper[4885]: I0308 22:16:51.875900 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5"} err="failed to get container status \"d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5\": rpc error: code = NotFound desc = could not find container \"d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5\": container with ID starting with d69a39c7ac335f3ffb7a4f7c06899e92b79695daebdbdcfa5d8734c0139145a5 not found: ID does not exist" Mar 08 22:16:53 crc kubenswrapper[4885]: I0308 22:16:53.381773 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" path="/var/lib/kubelet/pods/4f02b2be-ffee-4ef7-aab7-d23fe2e81b15/volumes" Mar 08 22:16:58 crc kubenswrapper[4885]: I0308 22:16:58.369191 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:16:58 crc kubenswrapper[4885]: E0308 22:16:58.370219 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:17:13 crc kubenswrapper[4885]: I0308 22:17:13.372194 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:17:13 crc kubenswrapper[4885]: E0308 22:17:13.374968 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:17:26 crc kubenswrapper[4885]: I0308 22:17:26.368510 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:17:26 crc kubenswrapper[4885]: E0308 22:17:26.369302 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:17:39 crc kubenswrapper[4885]: I0308 22:17:39.405894 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:17:39 crc kubenswrapper[4885]: E0308 22:17:39.406985 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:17:50 crc kubenswrapper[4885]: I0308 22:17:50.368079 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:17:50 crc kubenswrapper[4885]: E0308 22:17:50.368998 4885 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ttb97_openshift-machine-config-operator(3c5dda3b-3e01-4bb4-af02-b0f4eeadda58)\"" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.167393 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550138-n9n57"] Mar 08 22:18:00 crc kubenswrapper[4885]: E0308 22:18:00.168662 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="extract-utilities" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.168680 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="extract-utilities" Mar 08 22:18:00 crc kubenswrapper[4885]: E0308 22:18:00.168703 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="registry-server" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.168714 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="registry-server" Mar 08 22:18:00 crc kubenswrapper[4885]: E0308 22:18:00.168742 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="extract-content" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.168751 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="extract-content" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.169031 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f02b2be-ffee-4ef7-aab7-d23fe2e81b15" containerName="registry-server" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.170044 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550138-n9n57" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.172411 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.172839 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.178840 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.201895 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550138-n9n57"] Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.333847 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvp5k\" (UniqueName: \"kubernetes.io/projected/aa83b743-7733-47dc-856b-bcc08f1f6571-kube-api-access-qvp5k\") pod \"auto-csr-approver-29550138-n9n57\" (UID: \"aa83b743-7733-47dc-856b-bcc08f1f6571\") " pod="openshift-infra/auto-csr-approver-29550138-n9n57" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.437256 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvp5k\" (UniqueName: \"kubernetes.io/projected/aa83b743-7733-47dc-856b-bcc08f1f6571-kube-api-access-qvp5k\") pod \"auto-csr-approver-29550138-n9n57\" (UID: \"aa83b743-7733-47dc-856b-bcc08f1f6571\") " pod="openshift-infra/auto-csr-approver-29550138-n9n57" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.464407 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvp5k\" (UniqueName: \"kubernetes.io/projected/aa83b743-7733-47dc-856b-bcc08f1f6571-kube-api-access-qvp5k\") pod \"auto-csr-approver-29550138-n9n57\" (UID: \"aa83b743-7733-47dc-856b-bcc08f1f6571\") " pod="openshift-infra/auto-csr-approver-29550138-n9n57" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.502634 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550138-n9n57" Mar 08 22:18:00 crc kubenswrapper[4885]: I0308 22:18:00.964602 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550138-n9n57"] Mar 08 22:18:01 crc kubenswrapper[4885]: I0308 22:18:01.654318 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550138-n9n57" event={"ID":"aa83b743-7733-47dc-856b-bcc08f1f6571","Type":"ContainerStarted","Data":"57a95a0323a9637f304607194e5c006146755c22374b000d00348433cac9d9ca"} Mar 08 22:18:02 crc kubenswrapper[4885]: I0308 22:18:02.668087 4885 generic.go:334] "Generic (PLEG): container finished" podID="aa83b743-7733-47dc-856b-bcc08f1f6571" containerID="cb5e22f23aa19fc5c02b09e4507e9cf89fc1959db4ad0aff6bb1d9208face06e" exitCode=0 Mar 08 22:18:02 crc kubenswrapper[4885]: I0308 22:18:02.668188 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550138-n9n57" event={"ID":"aa83b743-7733-47dc-856b-bcc08f1f6571","Type":"ContainerDied","Data":"cb5e22f23aa19fc5c02b09e4507e9cf89fc1959db4ad0aff6bb1d9208face06e"} Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.104291 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550138-n9n57" Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.228834 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvp5k\" (UniqueName: \"kubernetes.io/projected/aa83b743-7733-47dc-856b-bcc08f1f6571-kube-api-access-qvp5k\") pod \"aa83b743-7733-47dc-856b-bcc08f1f6571\" (UID: \"aa83b743-7733-47dc-856b-bcc08f1f6571\") " Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.234745 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa83b743-7733-47dc-856b-bcc08f1f6571-kube-api-access-qvp5k" (OuterVolumeSpecName: "kube-api-access-qvp5k") pod "aa83b743-7733-47dc-856b-bcc08f1f6571" (UID: "aa83b743-7733-47dc-856b-bcc08f1f6571"). InnerVolumeSpecName "kube-api-access-qvp5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.332500 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qvp5k\" (UniqueName: \"kubernetes.io/projected/aa83b743-7733-47dc-856b-bcc08f1f6571-kube-api-access-qvp5k\") on node \"crc\" DevicePath \"\"" Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.369448 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7" Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.710357 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"20357516b8a200da650d714af40dc8b472d24afffa0537208eea9c9c2b38005b"} Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.726446 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550138-n9n57" event={"ID":"aa83b743-7733-47dc-856b-bcc08f1f6571","Type":"ContainerDied","Data":"57a95a0323a9637f304607194e5c006146755c22374b000d00348433cac9d9ca"} Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.726488 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57a95a0323a9637f304607194e5c006146755c22374b000d00348433cac9d9ca" Mar 08 22:18:04 crc kubenswrapper[4885]: I0308 22:18:04.726548 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550138-n9n57" Mar 08 22:18:05 crc kubenswrapper[4885]: I0308 22:18:05.207768 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550132-jwbgs"] Mar 08 22:18:05 crc kubenswrapper[4885]: I0308 22:18:05.217176 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550132-jwbgs"] Mar 08 22:18:05 crc kubenswrapper[4885]: I0308 22:18:05.391817 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5350f846-ee1f-400b-8579-de1a56050f02" path="/var/lib/kubelet/pods/5350f846-ee1f-400b-8579-de1a56050f02/volumes" Mar 08 22:18:16 crc kubenswrapper[4885]: I0308 22:18:16.717150 4885 scope.go:117] "RemoveContainer" containerID="381bb8f225c03be035f053937f74c9493566bd9f87da1d7c680e81f6170500d2" Mar 08 22:18:31 crc kubenswrapper[4885]: I0308 22:18:31.067426 4885 generic.go:334] "Generic (PLEG): container finished" podID="009e478e-8f33-43d1-aded-7d3084ed486e" containerID="995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b" exitCode=0 Mar 08 22:18:31 crc kubenswrapper[4885]: I0308 22:18:31.067510 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-58ck9/must-gather-x7xtd" event={"ID":"009e478e-8f33-43d1-aded-7d3084ed486e","Type":"ContainerDied","Data":"995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b"} Mar 08 22:18:31 crc kubenswrapper[4885]: I0308 22:18:31.071816 4885 scope.go:117] "RemoveContainer" containerID="995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b" Mar 08 22:18:31 crc kubenswrapper[4885]: I0308 22:18:31.575798 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-58ck9_must-gather-x7xtd_009e478e-8f33-43d1-aded-7d3084ed486e/gather/0.log" Mar 08 22:18:39 crc kubenswrapper[4885]: I0308 22:18:39.685307 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-58ck9/must-gather-x7xtd"] Mar 08 22:18:39 crc kubenswrapper[4885]: I0308 22:18:39.686206 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-58ck9/must-gather-x7xtd" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" containerName="copy" containerID="cri-o://d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916" gracePeriod=2 Mar 08 22:18:39 crc kubenswrapper[4885]: I0308 22:18:39.705354 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-58ck9/must-gather-x7xtd"] Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.163416 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-58ck9_must-gather-x7xtd_009e478e-8f33-43d1-aded-7d3084ed486e/copy/0.log" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.164301 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.171716 4885 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-58ck9_must-gather-x7xtd_009e478e-8f33-43d1-aded-7d3084ed486e/copy/0.log" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.172186 4885 generic.go:334] "Generic (PLEG): container finished" podID="009e478e-8f33-43d1-aded-7d3084ed486e" containerID="d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916" exitCode=143 Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.172259 4885 scope.go:117] "RemoveContainer" containerID="d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.172356 4885 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-58ck9/must-gather-x7xtd" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.208711 4885 scope.go:117] "RemoveContainer" containerID="995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.240998 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsvfg\" (UniqueName: \"kubernetes.io/projected/009e478e-8f33-43d1-aded-7d3084ed486e-kube-api-access-bsvfg\") pod \"009e478e-8f33-43d1-aded-7d3084ed486e\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.241246 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/009e478e-8f33-43d1-aded-7d3084ed486e-must-gather-output\") pod \"009e478e-8f33-43d1-aded-7d3084ed486e\" (UID: \"009e478e-8f33-43d1-aded-7d3084ed486e\") " Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.247542 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/009e478e-8f33-43d1-aded-7d3084ed486e-kube-api-access-bsvfg" (OuterVolumeSpecName: "kube-api-access-bsvfg") pod "009e478e-8f33-43d1-aded-7d3084ed486e" (UID: "009e478e-8f33-43d1-aded-7d3084ed486e"). InnerVolumeSpecName "kube-api-access-bsvfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.272376 4885 scope.go:117] "RemoveContainer" containerID="d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916" Mar 08 22:18:40 crc kubenswrapper[4885]: E0308 22:18:40.273058 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916\": container with ID starting with d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916 not found: ID does not exist" containerID="d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.273095 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916"} err="failed to get container status \"d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916\": rpc error: code = NotFound desc = could not find container \"d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916\": container with ID starting with d1ecb4b5f3087eee4dc4b86e5f4cd04ae313c53c67011fa376283cf0e18f7916 not found: ID does not exist" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.273118 4885 scope.go:117] "RemoveContainer" containerID="995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b" Mar 08 22:18:40 crc kubenswrapper[4885]: E0308 22:18:40.273350 4885 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b\": container with ID starting with 995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b not found: ID does not exist" containerID="995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.273369 4885 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b"} err="failed to get container status \"995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b\": rpc error: code = NotFound desc = could not find container \"995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b\": container with ID starting with 995440926bbc789898f79ed29cff5064b28089d90cb70720b548463332c91f5b not found: ID does not exist" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.345006 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsvfg\" (UniqueName: \"kubernetes.io/projected/009e478e-8f33-43d1-aded-7d3084ed486e-kube-api-access-bsvfg\") on node \"crc\" DevicePath \"\"" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.481726 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/009e478e-8f33-43d1-aded-7d3084ed486e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "009e478e-8f33-43d1-aded-7d3084ed486e" (UID: "009e478e-8f33-43d1-aded-7d3084ed486e"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 08 22:18:40 crc kubenswrapper[4885]: I0308 22:18:40.552246 4885 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/009e478e-8f33-43d1-aded-7d3084ed486e-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 08 22:18:41 crc kubenswrapper[4885]: I0308 22:18:41.382837 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" path="/var/lib/kubelet/pods/009e478e-8f33-43d1-aded-7d3084ed486e/volumes" Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.168231 4885 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29550140-8xvwg"] Mar 08 22:20:00 crc kubenswrapper[4885]: E0308 22:20:00.169268 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa83b743-7733-47dc-856b-bcc08f1f6571" containerName="oc" Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.169283 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa83b743-7733-47dc-856b-bcc08f1f6571" containerName="oc" Mar 08 22:20:00 crc kubenswrapper[4885]: E0308 22:20:00.169303 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" containerName="gather" Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.169309 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" containerName="gather" Mar 08 22:20:00 crc kubenswrapper[4885]: E0308 22:20:00.169344 4885 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" containerName="copy" Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.169350 4885 state_mem.go:107] "Deleted CPUSet assignment" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" containerName="copy" Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.169532 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" containerName="gather" Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.169553 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa83b743-7733-47dc-856b-bcc08f1f6571" containerName="oc" Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.169573 4885 memory_manager.go:354] "RemoveStaleState removing state" podUID="009e478e-8f33-43d1-aded-7d3084ed486e" containerName="copy" Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.170353 4885 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550140-8xvwg" Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.172060 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.172659 4885 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-qfn28" Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.177318 4885 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.186864 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550140-8xvwg"] Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.264881 4885 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsn9f\" (UniqueName: \"kubernetes.io/projected/374e97b8-fd74-4ec5-95c5-461c1fef2762-kube-api-access-nsn9f\") pod \"auto-csr-approver-29550140-8xvwg\" (UID: \"374e97b8-fd74-4ec5-95c5-461c1fef2762\") " pod="openshift-infra/auto-csr-approver-29550140-8xvwg" Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.367000 4885 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsn9f\" (UniqueName: \"kubernetes.io/projected/374e97b8-fd74-4ec5-95c5-461c1fef2762-kube-api-access-nsn9f\") pod \"auto-csr-approver-29550140-8xvwg\" (UID: \"374e97b8-fd74-4ec5-95c5-461c1fef2762\") " pod="openshift-infra/auto-csr-approver-29550140-8xvwg" Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.388072 4885 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsn9f\" (UniqueName: \"kubernetes.io/projected/374e97b8-fd74-4ec5-95c5-461c1fef2762-kube-api-access-nsn9f\") pod \"auto-csr-approver-29550140-8xvwg\" (UID: \"374e97b8-fd74-4ec5-95c5-461c1fef2762\") " pod="openshift-infra/auto-csr-approver-29550140-8xvwg" Mar 08 22:20:00 crc kubenswrapper[4885]: I0308 22:20:00.502361 4885 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29550140-8xvwg" Mar 08 22:20:01 crc kubenswrapper[4885]: I0308 22:20:01.038194 4885 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29550140-8xvwg"] Mar 08 22:20:01 crc kubenswrapper[4885]: I0308 22:20:01.252264 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550140-8xvwg" event={"ID":"374e97b8-fd74-4ec5-95c5-461c1fef2762","Type":"ContainerStarted","Data":"2dba0dfa4836064998bca97e06644a71094c83fc86d5bd066a192ad06b27f324"} Mar 08 22:20:04 crc kubenswrapper[4885]: I0308 22:20:04.294572 4885 generic.go:334] "Generic (PLEG): container finished" podID="374e97b8-fd74-4ec5-95c5-461c1fef2762" containerID="80ee712226011463294cc0687d7d3c3e9983afaae20269b7e2ebf527c6e78d3a" exitCode=0 Mar 08 22:20:04 crc kubenswrapper[4885]: I0308 22:20:04.294775 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550140-8xvwg" event={"ID":"374e97b8-fd74-4ec5-95c5-461c1fef2762","Type":"ContainerDied","Data":"80ee712226011463294cc0687d7d3c3e9983afaae20269b7e2ebf527c6e78d3a"} Mar 08 22:20:06 crc kubenswrapper[4885]: I0308 22:20:06.507801 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550140-8xvwg" Mar 08 22:20:06 crc kubenswrapper[4885]: I0308 22:20:06.647709 4885 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsn9f\" (UniqueName: \"kubernetes.io/projected/374e97b8-fd74-4ec5-95c5-461c1fef2762-kube-api-access-nsn9f\") pod \"374e97b8-fd74-4ec5-95c5-461c1fef2762\" (UID: \"374e97b8-fd74-4ec5-95c5-461c1fef2762\") " Mar 08 22:20:06 crc kubenswrapper[4885]: I0308 22:20:06.655156 4885 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/374e97b8-fd74-4ec5-95c5-461c1fef2762-kube-api-access-nsn9f" (OuterVolumeSpecName: "kube-api-access-nsn9f") pod "374e97b8-fd74-4ec5-95c5-461c1fef2762" (UID: "374e97b8-fd74-4ec5-95c5-461c1fef2762"). InnerVolumeSpecName "kube-api-access-nsn9f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 08 22:20:06 crc kubenswrapper[4885]: I0308 22:20:06.751527 4885 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsn9f\" (UniqueName: \"kubernetes.io/projected/374e97b8-fd74-4ec5-95c5-461c1fef2762-kube-api-access-nsn9f\") on node \"crc\" DevicePath \"\"" Mar 08 22:20:07 crc kubenswrapper[4885]: I0308 22:20:07.343778 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29550140-8xvwg" event={"ID":"374e97b8-fd74-4ec5-95c5-461c1fef2762","Type":"ContainerDied","Data":"2dba0dfa4836064998bca97e06644a71094c83fc86d5bd066a192ad06b27f324"} Mar 08 22:20:07 crc kubenswrapper[4885]: I0308 22:20:07.344221 4885 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dba0dfa4836064998bca97e06644a71094c83fc86d5bd066a192ad06b27f324" Mar 08 22:20:07 crc kubenswrapper[4885]: I0308 22:20:07.343860 4885 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29550140-8xvwg" Mar 08 22:20:07 crc kubenswrapper[4885]: I0308 22:20:07.593829 4885 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29550134-24s4t"] Mar 08 22:20:07 crc kubenswrapper[4885]: I0308 22:20:07.605077 4885 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29550134-24s4t"] Mar 08 22:20:09 crc kubenswrapper[4885]: I0308 22:20:09.394733 4885 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3" path="/var/lib/kubelet/pods/2c3cb2a8-9da6-47eb-8b39-7e006f9f33c3/volumes" Mar 08 22:20:16 crc kubenswrapper[4885]: I0308 22:20:16.855067 4885 scope.go:117] "RemoveContainer" containerID="a08f299dbb605791440e3498bfef15260ed7b91b31657f734e3989c456d8ee4c" Mar 08 22:20:32 crc kubenswrapper[4885]: I0308 22:20:32.818694 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:20:32 crc kubenswrapper[4885]: I0308 22:20:32.819512 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:21:02 crc kubenswrapper[4885]: I0308 22:21:02.818687 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:21:02 crc kubenswrapper[4885]: I0308 22:21:02.819520 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:21:32 crc kubenswrapper[4885]: I0308 22:21:32.818681 4885 patch_prober.go:28] interesting pod/machine-config-daemon-ttb97 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 08 22:21:32 crc kubenswrapper[4885]: I0308 22:21:32.821480 4885 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 08 22:21:32 crc kubenswrapper[4885]: I0308 22:21:32.821562 4885 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" Mar 08 22:21:32 crc kubenswrapper[4885]: I0308 22:21:32.822888 4885 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"20357516b8a200da650d714af40dc8b472d24afffa0537208eea9c9c2b38005b"} pod="openshift-machine-config-operator/machine-config-daemon-ttb97" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 08 22:21:32 crc kubenswrapper[4885]: I0308 22:21:32.823066 4885 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" podUID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerName="machine-config-daemon" containerID="cri-o://20357516b8a200da650d714af40dc8b472d24afffa0537208eea9c9c2b38005b" gracePeriod=600 Mar 08 22:21:33 crc kubenswrapper[4885]: I0308 22:21:33.485494 4885 generic.go:334] "Generic (PLEG): container finished" podID="3c5dda3b-3e01-4bb4-af02-b0f4eeadda58" containerID="20357516b8a200da650d714af40dc8b472d24afffa0537208eea9c9c2b38005b" exitCode=0 Mar 08 22:21:33 crc kubenswrapper[4885]: I0308 22:21:33.486228 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerDied","Data":"20357516b8a200da650d714af40dc8b472d24afffa0537208eea9c9c2b38005b"} Mar 08 22:21:33 crc kubenswrapper[4885]: I0308 22:21:33.486273 4885 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ttb97" event={"ID":"3c5dda3b-3e01-4bb4-af02-b0f4eeadda58","Type":"ContainerStarted","Data":"6f5dc3e4a7511f98072fc30a5b7227ccfc50f91aa2df95d90ebe92d91fc4d514"} Mar 08 22:21:33 crc kubenswrapper[4885]: I0308 22:21:33.486322 4885 scope.go:117] "RemoveContainer" containerID="1529d5225a5a00c7087fc169caefa4275fd9508d6ea6e93fb3804e974bf290f7"